Sample records for quantitative analysis tool

  1. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    PubMed

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meaning in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly used data visualization methods, including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
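
    To make this concrete, here is a minimal sketch, assuming a proteins-by-samples intensity matrix with hypothetical sample groupings, of the kind of workflow PANDA-view automates (median normalization, per-protein t-tests, volcano-plot coordinates). It is an illustration, not PANDA-view's own code.

    ```python
    # Sketch of a PANDA-view-style analysis; the column layout is hypothetical.
    import numpy as np
    import pandas as pd
    from scipy import stats

    def volcano_table(df: pd.DataFrame, group_a: list, group_b: list) -> pd.DataFrame:
        """df: proteins x samples raw-intensity matrix; group_a/group_b
        are lists of sample column names for the two conditions."""
        log2 = np.log2(df)
        log2 = log2 - log2.median(axis=0)        # per-sample median normalization
        a, b = log2[group_a], log2[group_b]
        _, p = stats.ttest_ind(a, b, axis=1)     # per-protein two-sample t-test
        return pd.DataFrame({"log2_fc": a.mean(axis=1) - b.mean(axis=1),
                             "neg_log10_p": -np.log10(p)}, index=df.index)
    ```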

  2. DAnTE: a statistical tool for quantitative analysis of -omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide-to-protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user-friendly.
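
    The rollup step is simple to picture. Below is a deliberately simplified peptide-to-protein rollup (the median of peptide log-abundances per protein) assuming a hypothetical table layout; DAnTE's own RRollup additionally scales each peptide against a reference peptide before combining, which this sketch omits.

    ```python
    # Simplified peptide-to-protein rollup; not DAnTE's RRollup algorithm.
    import pandas as pd

    def rollup(peptides: pd.DataFrame) -> pd.DataFrame:
        """peptides: one row per peptide, with a 'protein' column plus one
        numeric log-abundance column per sample (assumed layout)."""
        return peptides.groupby("protein").median(numeric_only=True)
    ```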

  3. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
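
    Why correlation matters can be shown with a toy Monte Carlo. This illustrates the point the abstract makes; it is not SMART itself, and all probabilities are invented. A shared-cause failure mode caps the benefit of a backup well below what independent failures would suggest.

    ```python
    # Toy Monte Carlo: value of redundancy under a common-cause failure mode.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    p_common = 0.05   # shared-cause failure that takes out both units (invented)
    p_unit = 0.10     # independent per-unit failure probability (invented)

    common = rng.random(n) < p_common
    fail_a = common | (rng.random(n) < p_unit)
    fail_b = common | (rng.random(n) < p_unit)

    print("single unit success:   ", 1 - fail_a.mean())             # ~0.855
    print("redundant pair success:", 1 - (fail_a & fail_b).mean())  # ~0.94
    # Under independence the pair would reach 1 - 0.145**2 ~ 0.979.
    ```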

  4. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
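
    As an illustration of the kind of regression described, not the system's own implementation, a mixed-effects fit in statsmodels might look like the sketch below; the file and column names are hypothetical, and a linear mixed model stands in for the GLMM for simplicity.

    ```python
    # Hypothetical mixed-model fit: motor outcome vs. imaging biomarkers,
    # with a random intercept per subject. All names are invented.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("trial_data.csv")   # assumed columns: subject, motor_score,
                                         # lesion_volume, ventricle_brain_ratio
    model = smf.mixedlm("motor_score ~ lesion_volume + ventricle_brain_ratio",
                        data=df, groups=df["subject"])
    print(model.fit().summary())
    ```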

  5. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD for a single-user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL, including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets, and plans for their future development.
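
    As a flavor of the simplest measurement such a toolbox provides, porosity can be computed as the void fraction of a thresholded tomogram. This is a sketch under an assumed gray-value convention, not the BNL suite itself.

    ```python
    # Porosity of a segmented tomogram as the void-voxel fraction.
    import numpy as np

    def porosity(tomogram: np.ndarray, threshold: float) -> float:
        """tomogram: 3-D gray-value array; voxels below `threshold` are
        treated as pore space (assumed convention)."""
        return float((tomogram < threshold).mean())

    volume = np.random.default_rng(1).random((64, 64, 64))  # stand-in data
    print(f"porosity: {porosity(volume, 0.35):.3f}")
    ```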

  6. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  7. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    ... with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS) ... Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and ... time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation ...

  8. Deficient Contractor Business Systems: Applying the Value at Risk (VaR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-30

    QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004 ... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and ... www.amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC ...

  9. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  10. Information Technology Tools Analysis in Quantitative Courses of IT-Management (Case Study: M.Sc.-Tehran University)

    ERIC Educational Resources Information Center

    Eshlaghy, Abbas Toloie; Kaveh, Haydeh

    2009-01-01

    The purpose of this study was to determine the most suitable ICT-based education and define the most suitable e-content creation tools for quantitative courses in the IT-management Master's program. ICT-based tools and technologies are divided into three categories: the creation of e-content, the offering of e-content, and access to e-content. In…

  11. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
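
    Stripped to its essentials, the measurement Spotsizer automates looks like the sketch below (threshold, label connected components, count pixels per colony); the real tool adds grid handling, multiple input formats, and batch processing on top of this.

    ```python
    # Core colony-size measurement: threshold, label, count pixels per colony.
    import numpy as np
    from scipy import ndimage

    def colony_areas(image: np.ndarray, threshold: float) -> np.ndarray:
        mask = image > threshold              # colonies assumed brighter than agar
        labels, n = ndimage.label(mask)       # connected-component labeling
        return ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    ```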

  12. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    NASA Astrophysics Data System (ADS)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review of the latest achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progress and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  13. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases, including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (the Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Quantitative fractography by digital image processing: NIH Image macro tools for stereo pair analysis and 3-D reconstruction.

    PubMed

    Hein, L R

    2001-10-01

    A set of NIH Image macro programs was developed to make qualitative and quantitative analyses from digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations on processing time, scanning techniques and programming concepts are also discussed.

  16. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique used in most laboratories for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows tools, 5 web-based tools, 9 R-based tools and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
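
    Many of the surveyed tools implement relative quantification with the textbook delta-delta-Cq (Livak) calculation, which is short enough to state directly under the common assumption of 100% amplification efficiency; the numbers below are invented.

    ```python
    # Livak (delta-delta-Cq) relative quantification, assuming 100% efficiency.
    def fold_change(cq_target_treated, cq_ref_treated,
                    cq_target_control, cq_ref_control):
        dd_cq = ((cq_target_treated - cq_ref_treated)
                 - (cq_target_control - cq_ref_control))
        return 2.0 ** -dd_cq

    print(fold_change(22.1, 18.0, 24.6, 18.2))  # ~4.9-fold up-regulation
    ```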

  17. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690

  18. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  19. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  20. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present the Molecular Interactions in Super Resolution (MIiSR) software, which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
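
    One of the simplest measurements in this family, offered here as a stand-in rather than MIiSR's actual algorithms, is the nearest-neighbor distance from each localization in one channel to those in another, which a k-d tree makes fast.

    ```python
    # Nearest-neighbor distances between two localization channels.
    import numpy as np
    from scipy.spatial import cKDTree

    def nn_distances(channel_a: np.ndarray, channel_b: np.ndarray) -> np.ndarray:
        """channel_a, channel_b: (n, 2) arrays of x,y localizations (nm)."""
        dist, _ = cKDTree(channel_b).query(channel_a, k=1)
        return dist

    rng = np.random.default_rng(2)
    a = rng.random((500, 2)) * 1000          # synthetic localizations
    b = rng.random((500, 2)) * 1000
    print("median NN distance (nm):", np.median(nn_distances(a, b)))
    ```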

  21. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.

  22. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  23. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  24. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post-processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post-processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations, as well as the incompatibility of the wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of the balance-measured value for the best cases.
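
    For reference, the classical result at the heart of such a derivation is the momentum-deficit integral, D = rho * integral of u*(U_inf - u) over the wake plane. A minimal numeric version for gridded wake data is sketched below, assuming incompressible flow and ignoring the static-pressure and cross-flow terms that the full analysis includes.

    ```python
    # Momentum-deficit drag estimate from a gridded wake survey (simplified).
    import numpy as np

    def wake_drag(u: np.ndarray, u_inf: float, rho: float,
                  dy: float, dz: float) -> float:
        """u: 2-D grid of streamwise velocity in the wake plane [m/s];
        dy, dz: grid spacing [m]; rho: density [kg/m^3]."""
        return float(rho * (u * (u_inf - u)).sum() * dy * dz)
    ```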

  25. Analyzing the texture changes in the quantitative phase maps of adipocytes

    NASA Astrophysics Data System (ADS)

    Roitshtain, Darina; Sharabani-Yosef, Orna; Gefen, Amit; Shaked, Natan T.

    2016-03-01

    We present a new analysis tool for studying texture changes in the quantitative phase maps of live cells acquired by wide-field interferometry. The sensitivity of wide-field interferometry systems to small changes in refractive index enables visualizing cells and inner cell organelles without using fluorescent dyes or other cell-invasive approaches, which may affect the measurement and require external labeling. Our label-free texture-analysis tool is based directly on the optical path delay profile of the sample and does not necessitate decoupling refractive index and thickness in the cell quantitative phase profile; thus, relevant parameters can be calculated using a single-frame acquisition. Our experimental system includes a low-coherence wide-field interferometer, combined with a simultaneous fluorescence microscopy system for validation. We used this system and analysis tool for studying lipid droplet formation in adipocytes. The latter demonstration is relevant for various cellular functions, from lipid metabolism and protein storage and degradation to viral replication. These processes are functionally linked to several physiological and pathological conditions, including obesity and metabolic diseases. Quantification of these biological phenomena based on the texture changes in the cell phase map has potential as a new cellular diagnosis tool.

  26. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  27. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  28. TANGO: a generic tool for high-throughput 3D image analysis for studying nuclear organization.

    PubMed

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2013-07-15

    The cell nucleus is a highly organized cellular organelle that contains the genetic material. The study of nuclear architecture has become an important field of cellular biology. Extracting quantitative data from 3D fluorescence imaging helps understand the functions of different nuclear compartments. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here, we describe Tools for Analysis of Nuclear Genome Organization (TANGO), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a coherent framework allowing biologists to perform the complete analysis process of 3D fluorescence images by combining two environments: ImageJ (http://imagej.nih.gov/ij/) for image processing and quantitative analysis and R (http://cran.r-project.org) for statistical processing of measurement results. It includes an intuitive user interface providing the means to precisely build a segmentation procedure and set up analyses, without possessing programming skills. TANGO is a versatile tool able to process large sets of images, allowing quantitative study of nuclear organization. TANGO is composed of two programs: (i) an ImageJ plug-in and (ii) a package (rtango) for R. They are both free and open source, available (http://biophysique.mnhn.fr/tango) for Linux, Microsoft Windows and Macintosh OSX. Distribution is under the GPL v.2 licence. Contact: thomas.boudier@snv.jussieu.fr. Supplementary data are available at Bioinformatics online.

  29. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  30. Oqtans: the RNA-seq workbench in the cloud for complete and reproducible quantitative transcriptome analysis.

    PubMed

    Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar

    2014-05-01

    We present Oqtans, an open-source workbench for quantitative transcriptome analysis integrated into Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine-learning-powered tools into Galaxy, which show superior or equal performance to state-of-the-art tools. Implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy-to-understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at cloud.oqtans.org, (ii) a public Galaxy instance at galaxy.cbio.mskcc.org, (iii) a git repository containing all installed software (oqtans.org/git), most of which is also available from (iv) the Galaxy Toolshed, and (v) a share string to use along with Galaxy CloudMan.

  31. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of the available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions of concern for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning the likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement for future versions, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
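
    A toy version of the weighting the abstract describes, with invented names and numbers, scores each resource by summing event probability times criticality over the conditions it addresses, then ranks resources by that score.

    ```python
    # Toy tradespace scoring: probability x criticality, summed per resource.
    conditions = {"renal stone": (0.04, 5),   # (event probability, criticality)
                  "back pain":   (0.30, 2)}   # values are invented
    resources = {"ultrasound": ["renal stone", "back pain"],
                 "analgesics": ["back pain"]}

    def score(condition_list):
        return sum(conditions[c][0] * conditions[c][1] for c in condition_list)

    for name in sorted(resources, key=lambda r: -score(resources[r])):
        print(f"{name}: {score(resources[name]):.2f}")
    ```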

  32. Quantitative Analysis Tools and Digital Phantoms for Deformable Image Registration Quality Assurance.

    PubMed

    Kim, Haksoo; Park, Samuel B; Monroe, James I; Traughber, Bryan J; Zheng, Yiran; Lo, Simon S; Yao, Min; Mansur, David; Ellis, Rodney; Machtay, Mitchell; Sohn, Jason W

    2015-08-01

    This article proposes quantitative analysis tools and digital phantoms to quantify intrinsic errors of deformable image registration (DIR) systems and to establish quality assurance (QA) procedures for clinical use of DIR systems, utilizing local and global error analysis methods with clinically realistic digital image phantoms. Landmark-based image registration verifications are suitable only for images with significant feature points. To address this shortfall, we adapted a deformation vector field (DVF) comparison approach with new analysis techniques to quantify the results. Digital image phantoms are derived from data sets of actual patient images (a reference image set, R, and a test image set, T). Image sets from the same patient taken at different times are registered with deformable methods, producing a reference DVFref. Applying DVFref to the original reference image deforms T into a new image R'. The data set (R', T, and DVFref) forms a realistic truth set and can therefore be used to analyze any DIR system and expose intrinsic errors by comparing DVFref and DVFtest. For quantitative error analysis, i.e., calculating and delineating differences between DVFs, two methods were used: (1) a local error analysis tool that displays deformation error magnitudes with color mapping on each image slice, and (2) a global error analysis tool that calculates a deformation error histogram, which describes a cumulative probability function of errors for each anatomical structure. Three digital image phantoms were generated from three patients with head and neck, lung, and liver cancers. The DIR QA was evaluated using the head and neck case. © The Author(s) 2014.
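
    Both analyses reduce to short array operations once DVFref and DVFtest are on the same grid; the sketch below, under an assumed (x, y, z, 3) layout in millimeters, computes the per-voxel error-magnitude map and the cumulative error histogram described above.

    ```python
    # Per-voxel DVF error magnitude and cumulative error histogram.
    import numpy as np

    def error_magnitude(dvf_ref: np.ndarray, dvf_test: np.ndarray) -> np.ndarray:
        """DVFs as (x, y, z, 3) arrays of displacement vectors [mm]."""
        return np.linalg.norm(dvf_test - dvf_ref, axis=-1)

    def cumulative_histogram(err: np.ndarray, bins: int = 50):
        counts, edges = np.histogram(err.ravel(), bins=bins)
        return edges[1:], np.cumsum(counts) / counts.sum()  # P(error <= edge)
    ```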

  33. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence-visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber-physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.

  34. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can serve as a replacement for traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  35. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis.

    PubMed

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L; Hwang, Jae Youn

    2016-12-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as smartphones have emerged as healthcare tools. They have been applied to the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when combined with an advanced optical imaging technique such as multispectral imaging and analysis, they would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis.

  36. A grid for a precise analysis of daily activities.

    PubMed

    Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E

    2010-01-01

    Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.

  37. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  19. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and to generate the analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.
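
    The core idea of a trade tool like STAT, searching a discrete design space and then refining around high-payoff regions, can be sketched in a few lines. The parameters and scoring function below are invented for illustration and are not taken from the STAT model.

      import itertools

      # Hypothetical, simplified sketch of a design-space search with
      # incremental refinement. All parameter names and the figure of
      # merit are illustrative assumptions, not the actual STAT model.

      def score(design):
          """Toy figure of merit: throughput per unit of power and mass."""
          throughput = design["bandwidth_mhz"] * design["bits_per_hz"]
          cost = design["power_w"] + 2.0 * design["mass_kg"]
          return throughput / cost

      coarse_space = {
          "bandwidth_mhz": [1, 10, 100],
          "bits_per_hz":   [0.5, 1.0, 2.0],
          "power_w":       [5, 20, 80],
          "mass_kg":       [1, 4, 16],
      }

      def search(space):
          keys = list(space)
          designs = [dict(zip(keys, combo))
                     for combo in itertools.product(*space.values())]
          return max(designs, key=score)

      best = search(coarse_space)
      # Incremental refinement: re-search a finer grid around the best point.
      refined_space = {k: [best[k] * f for f in (0.5, 0.75, 1.0, 1.25, 1.5)]
                       for k in coarse_space}
      print(search(refined_space))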

  20. Quantitative analysis of diffusion tensor orientation: theoretical framework.

    PubMed

    Wu, Yu-Chien; Field, Aaron S; Chung, Moo K; Badie, Benham; Alexander, Andrew L

    2004-11-01

    Diffusion-tensor MRI (DT-MRI) yields information about the magnitude, anisotropy, and orientation of water diffusion of brain tissues. Although white matter tractography and eigenvector color maps provide visually appealing displays of white matter tract organization, they do not easily lend themselves to quantitative and statistical analysis. In this study, a set of visual and quantitative tools for the investigation of tensor orientations in the human brain was developed. Visual tools included rose diagrams, which are spherical coordinate histograms of the major eigenvector directions, and 3D scatterplots of the major eigenvector angles. A scatter matrix of major eigenvector directions was used to describe the distribution of major eigenvectors in a defined anatomic region. A measure of eigenvector dispersion was developed to describe the degree of eigenvector coherence in the selected region. These tools were used to evaluate directional organization and the interhemispheric symmetry of DT-MRI data in five healthy human brains and two patients with infiltrative diseases of the white matter tracts. In normal anatomical white matter tracts, a high degree of directional coherence and interhemispheric symmetry was observed. The infiltrative diseases appeared to alter the eigenvector properties of affected white matter tracts, showing decreased eigenvector coherence and interhemispheric symmetry. This novel approach distills the rich, 3D information available from the diffusion tensor into a form that lends itself to quantitative analysis and statistical hypothesis testing. (c) 2004 Wiley-Liss, Inc.
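
    The scatter-matrix dispersion measure described in this record can be illustrated with a short numerical sketch. The code below is not the authors' implementation; it uses one common coherence formula built from the sorted eigenvalues of the mean dyadic tensor, and the sample vectors are synthetic.

      import numpy as np

      # Illustrative sketch: quantify the coherence of major-eigenvector
      # directions in a region of interest. The mean dyadic (scatter) matrix
      # of N unit vectors has eigenvalues b1 >= b2 >= b3; parallel vectors
      # give b1 = 1, fully dispersed vectors give b1 = b2 = b3 = 1/3.

      def eigenvector_coherence(vectors):
          v = np.asarray(vectors, dtype=float)
          v /= np.linalg.norm(v, axis=1, keepdims=True)      # unit length
          scatter = np.einsum("ni,nj->ij", v, v) / len(v)    # mean dyadic product
          b = np.sort(np.linalg.eigvalsh(scatter))[::-1]     # b1 >= b2 >= b3
          return 1.0 - np.sqrt((b[1] + b[2]) / (2.0 * b[0])) # 1 coherent, 0 random

      rng = np.random.default_rng(0)
      coherent = rng.normal([0, 0, 1], 0.05, size=(500, 3))
      print(eigenvector_coherence(coherent))                  # close to 1
      print(eigenvector_coherence(rng.normal(size=(500, 3)))) # close to 0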

  1. LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.

    PubMed

    Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei

    2012-12-01

    Database search-based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram from the identification information, which limits the search space and thus makes data processing much faster. The random effects of MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than both in terms of precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
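
    The identification-directed extracted ion chromatogram (XIC) reconstruction that underlies tools like LFQuant can be sketched as follows. This is a minimal illustration, not LFQuant's code: the scan tuples, ppm tolerance, and toy Gaussian peak are all assumptions.

      import numpy as np

      # Minimal sketch of XIC reconstruction: each MS1 scan is
      # (retention_time, mz_array, intensity_array); sum intensity within a
      # ppm window around the identified peptide m/z, then integrate the
      # chromatographic peak with the trapezoidal rule.

      def extract_xic(scans, target_mz, ppm=10.0):
          tol = target_mz * ppm * 1e-6
          rts, intensities = [], []
          for rt, mz, inten in scans:
              mask = np.abs(mz - target_mz) <= tol
              rts.append(rt)
              intensities.append(inten[mask].sum())
          return np.array(rts), np.array(intensities)

      def peak_area(rts, intensities):
          return np.sum(0.5 * (intensities[1:] + intensities[:-1]) * np.diff(rts))

      # Toy demo: a Gaussian elution peak sampled at 1 s intervals.
      scans = [(t, np.array([500.001]),
                np.array([np.exp(-((t - 30.0) / 5.0) ** 2)]))
               for t in np.arange(0.0, 60.0)]
      rts, xic = extract_xic(scans, target_mz=500.0)
      print(peak_area(rts, xic))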

  2. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral counting is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
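
    As a hedged illustration of spectral-count quantification, the sketch below computes the normalized spectral abundance factor (NSAF), a standard spectral-count measure; freeQuant's own algorithm additionally handles shared peptides and ion intensity, which this sketch omits.

      # NSAF: spectral count divided by protein length, normalized so the
      # relative abundances sum to 1. Inputs are illustrative.

      def nsaf(spectral_counts, lengths):
          """spectral_counts, lengths: dicts keyed by protein accession."""
          saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
          total = sum(saf.values())
          return {p: v / total for p, v in saf.items()}

      abundances = nsaf({"P1": 40, "P2": 10}, {"P1": 400, "P2": 200})
      print(abundances)  # relative abundances summing to 1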

  3. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis

    PubMed Central

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L.; Hwang, Jae Youn

    2016-01-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Various mobile devices, such as smartphones, have recently emerged as healthcare tools and have been applied to the early diagnosis of nonmalignant and malignant skin diseases. In particular, when combined with an advanced optical imaging technique such as multispectral imaging and analysis, they could benefit both the early diagnosis of such skin diseases and subsequent quantitative prognosis monitoring after treatment at home. We therefore demonstrate the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis. PMID:28018743

  4. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    Molecular analysis has three major roles in modern oncopathology: as an aid in the differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive both for the differential diagnosis and for molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the development of the disease and predicts its clinical course. We employed molecular analysis for precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method that broadens our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding and less time-consuming; furthermore, it is more sensitive in detecting small changes at the mRNA level. Moreover, quantitative PCR is the only technology which provides precise and reproducible quantitative information about the expression level. It may therefore be used to demonstrate a decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.
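
    The quantitative PCR monitoring described here rests on relative quantification; a minimal sketch of the standard delta-delta-Ct (Livak) method is shown below. The Ct values are invented, and this is a generic calculation rather than the authors' protocol.

      # Relative expression by the delta-delta-Ct method: normalize the
      # target gene to a reference gene in both sample and control, then
      # convert the Ct difference to a fold change.

      def fold_change(ct_target_sample, ct_ref_sample,
                      ct_target_ctrl, ct_ref_ctrl):
          delta_sample = ct_target_sample - ct_ref_sample  # normalize to reference
          delta_ctrl = ct_target_ctrl - ct_ref_ctrl
          return 2.0 ** -(delta_sample - delta_ctrl)

      # Three cycles lower than control after normalization = ~8x more mRNA.
      print(fold_change(22.0, 18.0, 25.0, 18.0))  # 8.0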

  5. A quantitative assessment of alkaptonuria: testing the reliability of two disease severity scoring systems.

    PubMed

    Cox, Trevor F; Ranganath, Lakshminarayan

    2011-12-01

    Alkaptonuria (AKU) is caused by excessive accumulation of homogentisic acid (HGA) in body fluids, owing to a deficiency of the enzyme homogentisate dioxygenase; conversion of HGA to a polymeric melanin-like pigment, a process known as ochronosis, leads in turn to varied clinical manifestations. A potential treatment, a drug called nitisinone, is available to decrease the formation of HGA. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools that could be used to quantify disease burden in AKU. The first tool scores clinical features, including clinical assessments, investigations and questionnaires, in 15 patients with AKU. The second tool is a scoring system that includes only items obtained from questionnaires used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the AKU tools; these included the calculation of Cronbach's alpha, multidimensional scaling and simple linear regression analysis. The conclusion was that there is good evidence that the tools could be adopted as AKU assessment tools, perhaps with further refinement before use in the practical setting of a clinical trial.
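
    Cronbach's alpha, one of the statistics used to assess these scoring systems, is straightforward to compute from a patients-by-items score matrix. The sketch below uses simulated data for illustration.

      import numpy as np

      # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
      # the total score). Rows are patients, columns are score items.

      def cronbach_alpha(scores):
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1)
          total_var = scores.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

      rng = np.random.default_rng(1)
      severity = rng.normal(size=(44, 1))                       # latent burden
      items = severity + rng.normal(scale=0.5, size=(44, 10))   # 10 correlated items
      print(cronbach_alpha(items))                              # high alpha (~0.9)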

  6. Quantitative analysis and comparative study of four cities green pattern in API system on the background of big data

    NASA Astrophysics Data System (ADS)

    Xin, YANG; Si-qi, WU; Qi, ZHANG

    2018-05-01

    Beijing, London, Paris, and New York are representative world cities, so a comparative study of the four cities' green patterns is important for identifying gaps and advantages and for mutual learning; the paper thereby provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns across different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density, and Standard Ellipse tools on the ArcGIS platform can process and summarize the data, enabling quantitative analysis of green patterns. The paper summarizes the uniqueness of the four cities' green patterns, and the reasons for their formation, on the basis of numerical comparison.
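
    The Average Nearest Neighbor statistic mentioned above can be computed outside ArcGIS as well; the sketch below implements the classic observed-to-expected distance ratio under an assumption of complete spatial randomness, with made-up point data.

      import numpy as np
      from scipy.spatial import cKDTree

      # ANN index: observed mean nearest-neighbor distance divided by the
      # distance expected for a random (Poisson) pattern over the same area.
      # ANN < 1 suggests clustering, ANN > 1 dispersion.

      def ann_index(points, study_area):
          pts = np.asarray(points, dtype=float)
          dists, _ = cKDTree(pts).query(pts, k=2)   # k=2: nearest other point
          observed = dists[:, 1].mean()
          expected = 0.5 / np.sqrt(len(pts) / study_area)
          return observed / expected

      rng = np.random.default_rng(2)
      parks = rng.uniform(0, 10_000, size=(200, 2))  # points in a 10 km square
      print(ann_index(parks, 10_000.0 ** 2))         # ~1 for a random pattern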

  7. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suresh, Niraj; Stephens, Sean A.; Adams, Lexor

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and forest management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving the plant. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. Our group at the Environmental Molecular Sciences Laboratory (EMSL) has developed an XCT-based tool to image and quantitatively analyze plant root structures in their native soil environment. XCT data collected on a Prairie dropseed (Sporobolus heterolepis) specimen were used to visualize its root structure. A combination of the open-source software RooTrak and DDV was employed to segment the root from the soil and to calculate its isosurface, respectively. Our own computer script, named 3DRoot-SV, was developed and used to calculate root volume and surface area from a triangular mesh. The process, utilizing a unique combination of tools from imaging to quantitative root analysis, including the 3DRoot-SV computer script, is described.
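
    The mesh-based measurements attributed to 3DRoot-SV can be illustrated with standard formulas: the signed-tetrahedron volume and the summed triangle areas of a closed surface mesh. This is a generic sketch, not the 3DRoot-SV script itself.

      import numpy as np

      # Volume via the signed-tetrahedron formula (requires a closed,
      # consistently oriented mesh); surface area from the triangle cross
      # products. Verified below on a unit cube (12 triangles).

      def mesh_volume_area(vertices, faces):
          v = np.asarray(vertices, dtype=float)
          tri = v[np.asarray(faces)]                    # (n_faces, 3, 3)
          a, b, c = tri[:, 0], tri[:, 1], tri[:, 2]
          volume = abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0
          area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()
          return volume, area

      verts = [[0,0,0],[1,0,0],[1,1,0],[0,1,0],[0,0,1],[1,0,1],[1,1,1],[0,1,1]]
      faces = [[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
               [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]]
      print(mesh_volume_area(verts, faces))             # (1.0, 6.0)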

  9. Performing statistical analyses on quantitative data in Taverna workflows: an example using R and maxdBrowse to identify differentially-expressed genes from microarray data.

    PubMed

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-08-07

    There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data.

  10. Performing statistical analyses on quantitative data in Taverna workflows: An example using R and maxdBrowse to identify differentially-expressed genes from microarray data

    PubMed Central

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-01-01

    Background There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Results Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Conclusion Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data. PMID:18687127
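
    The workflow itself invokes R services, but its statistical core, per-gene hypothesis tests with multiple-testing correction, can be sketched in Python as an analogy. The two-group expression matrix below is simulated, and the Benjamini-Hochberg step is a common choice rather than necessarily the procedure used in the paper.

      import numpy as np
      from scipy import stats

      # Per-gene two-sample t-test followed by Benjamini-Hochberg FDR
      # control on a genes-by-samples expression matrix.

      def differential_genes(group_a, group_b, fdr=0.05):
          _, p = stats.ttest_ind(group_a, group_b, axis=1)
          order = np.argsort(p)
          n = len(p)
          bh = p[order] * n / np.arange(1, n + 1)       # BH-adjusted p-values
          q = np.minimum.accumulate(bh[::-1])[::-1]     # enforce monotonicity
          significant = np.zeros(n, dtype=bool)
          significant[order] = q <= fdr
          return significant

      rng = np.random.default_rng(3)
      a = rng.normal(size=(1000, 6))
      b = rng.normal(size=(1000, 6))
      b[:50] += 2.0                                     # 50 truly changed genes
      print(differential_genes(a, b).sum())             # roughly 50 hits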

  11. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
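
    One widely used complexity metric of the kind this handbook discusses is McCabe cyclomatic complexity. The sketch below approximates it for Python functions by counting decision points in the abstract syntax tree; production tools count more construct types.

      import ast

      # Approximate cyclomatic complexity: 1 plus the number of decision
      # points found in the parsed source. Simplified relative to real tools.

      DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                   ast.BoolOp, ast.IfExp)

      def cyclomatic_complexity(source):
          tree = ast.parse(source)
          return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

      print(cyclomatic_complexity(
          "def f(x):\n"
          "    if x > 0:\n"
          "        for i in range(x):\n"
          "            x -= 1\n"
          "    return x\n"
      ))  # 3: base path plus one 'if' and one 'for'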

  12. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.

  13. Label-free quantitative proteomic analysis of human plasma-derived microvesicles to find protein signatures of abdominal aortic aneurysms.

    PubMed

    Martinez-Pinna, Roxana; Gonzalez de Peredo, Anne; Monsarrat, Bernard; Burlet-Schiltz, Odile; Martin-Ventura, Jose Luis

    2014-08-01

    To find potential biomarkers of abdominal aortic aneurysms (AAA), we performed a differential proteomic study based on human plasma-derived microvesicles. Exosomes and microparticles isolated from plasma of AAA patients and control subjects (n = 10 per group) were analyzed by a label-free quantitative MS-based strategy. In-house and publicly available software packages were used for MS data analysis. The application of two kinds of bioinformatic tools allowed us to find differential protein profiles in AAA patients. Some of the proteins found by both analysis methods belong to major pathological mechanisms of AAA such as oxidative stress, immune-inflammation, and thrombosis. Data analysis from label-free MS-based experiments requires the use of sophisticated bioinformatic approaches to perform quantitative studies on complex protein mixtures. The application of two of these bioinformatic tools provided us with a preliminary list of differential proteins found in plasma-derived microvesicles not previously associated with AAA, which could help us to understand the pathological mechanisms related to this disease. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and promote reuse of quantitative imaging data in the community. We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Fluorescence, Absorption, and Excitation Spectra of Polycyclic Aromatic Hydrocarbons as a Tool for Quantitative Analysis

    ERIC Educational Resources Information Center

    Rivera-Figueroa, A. M.; Ramazan, K. A.; Finlayson-Pitts, B. J.

    2004-01-01

    A quantitative and qualitative study of the interplay between the absorption, fluorescence, and excitation spectra of pollutants called polycyclic aromatic hydrocarbons (PAHs) is conducted. The study of five PAHs demonstrates the correlation of the above-mentioned properties along with the associated molecular changes.
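
    The quantitative-analysis step in such an experiment typically reduces to a linear calibration curve. The sketch below fits intensity versus concentration for a set of standards and inverts the fit for an unknown; all values are illustrative.

      import numpy as np

      # Linear calibration: fluorescence intensity versus standard
      # concentration, then invert the least-squares line for an unknown.

      conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])           # micromolar standards
      signal = np.array([2.0, 55.0, 101.0, 208.0, 398.0])  # fluorescence (a.u.)

      slope, intercept = np.polyfit(conc, signal, 1)       # least-squares fit
      unknown_signal = 150.0
      print((unknown_signal - intercept) / slope)          # ~1.5 micromolar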

  16. Evaluation of reference genes in Vibrio parahaemolyticus for gene expression analysis using quantitative RT-PCR

    USDA-ARS?s Scientific Manuscript database

    Vibrio parahaemolyticus is a significant human pathogen capable of causing foodborne gastroenteritis associated with the consumption of contaminated raw or undercooked seafood. Quantitative RT-PCR (qRT-PCR) is a useful tool for studying gene expression in V. parahaemolyticus to characterize the viru...

  17. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  18. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
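
    The accuracy and precision metrics that LFQbench reports can be approximated with a short sketch: for proteins of one spiked species, compare observed log-ratios against the expected ratio. The data and the assumed 2:1 spike ratio below are synthetic.

      import numpy as np

      # Accuracy: median deviation of observed log2-ratios from the expected
      # ratio. Precision: spread (IQR) of those log2-ratios.

      def accuracy_precision(intensity_a, intensity_b, expected_ratio):
          log_ratios = np.log2(intensity_a / intensity_b)
          accuracy = np.median(log_ratios) - np.log2(expected_ratio)
          precision = np.subtract(*np.percentile(log_ratios, [75, 25]))  # IQR
          return accuracy, precision

      rng = np.random.default_rng(4)
      a = rng.lognormal(10, 1, 500)
      b = a / 2.0 * rng.lognormal(0, 0.2, 500)   # ~2:1 with measurement noise
      print(accuracy_precision(a, b, expected_ratio=2.0))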

  19. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  20. Visualizing Qualitative Information

    ERIC Educational Resources Information Center

    Slone, Debra J.

    2009-01-01

    The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…

  1. Validating a tool to measure auxiliary nurse midwife and nurse motivation in rural Nepal.

    PubMed

    Morrison, Joanna; Batura, Neha; Thapa, Rita; Basnyat, Regina; Skordis-Worrall, Jolene

    2015-05-12

    A global shortage of health workers in rural areas increases the salience of motivating and supporting existing health workers. Understandings of motivation may vary in different settings, and it is important to use measurement methods that are contextually appropriate. We identified a measurement tool, previously used in Kenya, and explored its validity and reliability to measure the motivation of auxiliary nurse midwives (ANM) and staff nurses (SN) in rural Nepal. Qualitative and quantitative methods were used to assess the content validity, the construct validity, the internal consistency and the reliability of the tool. We translated the tool into Nepali and it was administered to 137 ANMs and SNs in three districts. We collected qualitative data from 78 nursing personnel and district- and central-level stakeholders using interviews and focus group discussions. We calculated motivation scores for ANMs and SNs using the quantitative data and conducted statistical tests for validity and reliability. Motivation scores were compared with qualitative data. Descriptive exploratory analysis compared mean motivation scores by ANM and SN sociodemographic characteristics. The concept of self-efficacy was added to the tool before data collection. Motivation was revealed through conscientiousness. Teamwork and the exertion of extra effort were not adequately captured by the tool, but important in illustrating motivation. The statement on punctuality was problematic in quantitative analysis, and attendance was more expressive of motivation. The calculated motivation scores usually reflected ANM and SN interview data, with some variation in other stakeholder responses. The tool scored within acceptable limits in validity and reliability testing and was able to distinguish motivation of nursing personnel with different sociodemographic characteristics. We found that with minor modifications, the tool provided valid and internally consistent measures of motivation among ANMs and SNs in this context. We recommend the use of this tool in similar contexts, with the addition of statements about self-efficacy, teamwork and exertion of extra effort. Absenteeism should replace the punctuality statement, and statements should be worded both positively and negatively to mitigate positive response bias. Collection of qualitative data on motivation creates a more nuanced understanding of quantitative scores.

  2. 78 FR 69839 - Building Technologies Office Prioritization Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... innovative and cost-effective energy saving solutions: Supporting research and development of high impact... Description The tool was designed to inform programmatic decision-making and facilitate the setting of... quantitative analysis to assure only the highest impact measures are the focus of further effort. The approach...

  3. A Meta-analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    2014-02-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to different technology features, and how pieces of qualitative and quantitative results can be integrated to achieve a broader understanding of technology designs. To address these issues, this paper proposes a meta-analysis method. Detailed explanations of the structure of the methodology and its scientific mechanism are provided for discussion, along with suggestions. This paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.

  4. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights into complex, living systems, because images capture the overall dynamics as a "whole." Therefore, extraction of key quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentation focused on molecular image processing and analysis. Quantitative data analysis through various open-source software packages and algorithmic protocols provides a novel approach for modeling an experimental research program. Beyond this, we also highlight predicted future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding detailed biological and mathematical expressions in in silico systems biology modeling.

  5. ELISA-BASE: An Integrated Bioinformatics Tool for Analyzing and Tracking ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Collett, James L.; Seurynck-Servoss, Shannon L.

    ELISA-BASE is an open-source database for capturing, organizing and analyzing protein enzyme-linked immunosorbent assay (ELISA) microarray data. ELISA-BASE is an extension of the BioArray Software Environment (BASE) database system, which was developed for DNA microarrays. In order to make BASE suitable for protein microarray experiments, we developed several plugins for importing and analyzing quantitative ELISA microarray data. Most notably, our Protein Microarray Analysis Tool (ProMAT) for processing quantitative ELISA data is now available as a plugin to the database.

  6. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data and images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data, and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.

  7. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.

  8. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
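
    The sensitivity of such estimates to the risk coefficient can be made concrete with the log-linear health impact function commonly used in BenMAP-style analyses. The sketch below is illustrative; the functional form is standard in this literature, but all numbers are invented.

      import numpy as np

      # Log-linear health impact function: avoided cases =
      # baseline rate * (1 - exp(-beta * delta concentration)) * population.
      # Evaluating over a range of betas exposes the epistemic uncertainty.

      def avoided_deaths(beta, delta_pm, baseline_rate, population):
          return baseline_rate * (1.0 - np.exp(-beta * delta_pm)) * population

      betas = np.array([0.004, 0.006, 0.010, 0.014])   # per microgram/m^3
      impacts = avoided_deaths(betas, delta_pm=2.0,
                               baseline_rate=0.008, population=1_000_000)
      print(impacts)  # estimates vary severalfold across plausible betas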

  9. Quantitative genetic tools for insecticide resistance risk assessment: estimating the heritability of resistance

    Treesearch

    Michael J. Firko; Jane Leslie Hayes

    1990-01-01

    Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h2) of resistance. Sibling analysis and...
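
    As a hedged aside, one classical estimator of narrow-sense heritability (the report itself discusses sibling analysis, which differs in detail) is the regression of offspring phenotype on midparent phenotype, whose slope estimates h^2 directly. The simulation below illustrates the idea.

      import numpy as np

      # Simulate midparent and offspring phenotypes with a known h^2, then
      # recover it as the slope of the offspring-on-midparent regression.

      rng = np.random.default_rng(5)
      n, h2_true = 2000, 0.4
      midparent = rng.normal(size=n)
      offspring = (h2_true * midparent
                   + rng.normal(scale=np.sqrt(1 - h2_true**2), size=n))

      slope, _ = np.polyfit(midparent, offspring, 1)
      print(slope)  # recovers roughly 0.4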

  10. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy question whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, which are subsequently stitched together to generate a large-area map, using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and processing conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
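
    The registration step behind automated tile stitching is commonly solved by phase correlation, which recovers the translation between overlapping images from the normalized cross-power spectrum. The sketch below is the textbook FFT technique, not the authors' toolbox.

      import numpy as np

      # Phase correlation: the inverse FFT of the normalized cross-power
      # spectrum peaks at the translation between the two images.

      def phase_correlation(img_a, img_b):
          fa, fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
          cross = fa * np.conj(fb)
          corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          return peak  # (row_shift, col_shift), modulo the image size

      rng = np.random.default_rng(6)
      a = rng.random((256, 256))
      b = np.roll(a, shift=(17, 42), axis=(0, 1))   # simulate a shifted tile
      print(phase_correlation(b, a))                # (17, 42)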

  11. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  12. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
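
    One of the standardization options the review discusses, stable isotope dilution, reduces to a simple ratio calculation: a known amount of heavy-labeled peptide is spiked in, and the analyte is quantified from the light-to-heavy peak-area ratio. The numbers below are illustrative.

      # Stable isotope dilution: analyte amount = (light area / heavy area)
      # times the spiked amount of the heavy-labeled internal standard.

      def analyte_amount(light_area, heavy_area, spiked_amount_fmol):
          return (light_area / heavy_area) * spiked_amount_fmol

      print(analyte_amount(light_area=8.4e6, heavy_area=2.1e6,
                           spiked_amount_fmol=50.0))  # 200 fmol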

  13. Quantitative Analysis of the Rubric as an Assessment Tool: An Empirical Study of Student Peer-Group Rating

    ERIC Educational Resources Information Center

    Hafner, John C.; Hafner, Patti M.

    2003-01-01

    Although the rubric has emerged as one of the most popular assessment tools in progressive educational programs, there is an unfortunate dearth of information in the literature quantifying the actual effectiveness of the rubric as an assessment tool "in the hands of the students." This study focuses on the validity and reliability of the rubric as…

  14. Advantages of Social Network Analysis in Educational Research

    ERIC Educational Resources Information Center

    Ushakov, K. M.; Kukso, K. N.

    2015-01-01

    Currently one of the main tools for the large scale studies of schools is statistical analysis. Although it is the most common method and it offers greatest opportunities for analysis, there are other quantitative methods for studying schools, such as network analysis. We discuss the potential advantages that network analysis has for educational…

  15. COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA

    PubMed Central

    Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.

    2011-01-01

    Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
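
    The peptide false discovery rate analysis mentioned here is typically done by the target-decoy approach. The sketch below is a simplified version: it estimates the FDR at each score threshold from the running decoy and target counts, using simulated scores.

      import numpy as np

      # Target-decoy FDR: at any score threshold, estimate the FDR among
      # accepted targets from the number of accepted decoys. Simplified
      # relative to production pipelines.

      def fdr_threshold(scores, is_decoy, max_fdr=0.01):
          order = np.argsort(scores)[::-1]            # best score first
          decoys = np.cumsum(is_decoy[order])
          targets = np.cumsum(~is_decoy[order])
          fdr = decoys / np.maximum(targets, 1)
          ok = np.nonzero(fdr <= max_fdr)[0]
          return scores[order[ok[-1]]] if ok.size else np.inf

      rng = np.random.default_rng(7)
      scores = np.concatenate([rng.normal(3, 1, 800), rng.normal(0, 1, 800)])
      is_decoy = np.concatenate([np.zeros(800, bool), np.ones(800, bool)])
      print(fdr_threshold(scores, is_decoy))  # score cutoff giving ~1% FDR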

  16. GUIDOS: tools for the assessment of pattern, connectivity, and fragmentation

    NASA Astrophysics Data System (ADS)

    Vogt, Peter

    2013-04-01

    Pattern, connectivity, and fragmentation can be considered pillars of a quantitative analysis of digital landscape images. The free software toolbox GUIDOS (http://forest.jrc.ec.europa.eu/download/software/guidos) includes a variety of dedicated methodologies for the quantitative assessment of these features. Amongst others, Morphological Spatial Pattern Analysis (MSPA) is used for an intuitive description of image pattern structures and the automatic detection of connectivity pathways. GUIDOS includes tools for the detection and quantitative assessment of key nodes and links, as well as tools to define connectedness in raster images and to set up appropriate input files for an enhanced network analysis using Conefor Sensinode. Finally, fragmentation is usually defined from a species point of view, but a generic and quantifiable indicator is needed to measure fragmentation and its changes. Some preliminary results for different conceptual approaches will be shown for a sample dataset. Complemented by pre- and post-processing routines and a complete GIS environment, the portable GUIDOS Toolbox may facilitate a holistic assessment in risk assessment studies, landscape planning, and conservation/restoration policies. Alternatively, individual analysis components may contribute to or enhance studies conducted with other software packages in landscape ecology.
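
    The flavor of MSPA can be conveyed with a toy morphological classification: forest pixels farther than an edge width from non-forest are "core", and the remaining forest pixels are "edge". This is a simplification of the full MSPA segmentation implemented in GUIDOS.

      import numpy as np
      from scipy import ndimage

      # Core/edge classification by binary erosion: core pixels survive
      # erosion by the edge width; edge pixels are forest minus core.

      def core_edge(forest_mask, edge_width=1):
          structure = ndimage.generate_binary_structure(2, 2)  # 8-connectivity
          core = ndimage.binary_erosion(forest_mask, structure,
                                        iterations=edge_width)
          edge = forest_mask & ~core
          return core, edge

      forest = np.zeros((20, 20), dtype=bool)
      forest[4:16, 4:16] = True                     # a 12x12 forest patch
      core, edge = core_edge(forest, edge_width=2)
      print(core.sum(), edge.sum())                 # 64 core, 80 edge pixels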

  17. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volume and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade

  18. The Fathering Indicators Framework: A Tool for Quantitative and Qualitative Analysis.

    ERIC Educational Resources Information Center

    Gadsden, Vivian, Ed.; Fagan, Jay, Ed.; Ray, Aisha, Ed.; Davis, James Earl, Ed.

    The Fathering Indicators Framework (FIF) is an evaluation tool designed to help researchers, practitioners, and policymakers conceptualize, examine, and measure change in fathering behaviors in relation to child and family well-being. This report provides a detailed overview of the research and theory informing the development of the FIF. The FIF…

  19. An IBM PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  20. On-line analysis capabilities developed to support the AFW wind-tunnel tests

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.

    1992-01-01

    A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs are also presented. The on-line analysis tools worked well before, during, and after the wind tunnel test and proved to be a vital and important part of the entire test effort.

  1. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses the emerging issues, such as how learning effectiveness can be understood in relation to…

  2. Validation of Quantitative Multimodality Analysis of Telomerase Activity in Urine Cells as a Noninvasive Diagnostic and Prognostic Tool for Prostate Cancer

    DTIC Science & Technology

    2005-08-01

    …present study, who was previously misdiagnosed with BPH and inflammation, eventually revealed prostate cancer with a Gleason score of 7. (Grant number W81XWH-04-1-0774.)

  3. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    PubMed

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst

    NASA Astrophysics Data System (ADS)

    Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina

    2015-03-01

    In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.

  5. Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example, it identifies reactions that are subject to active allosteric or genetic regulation, as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595

  6. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model together with the ICA resolution of the spectral profiles of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in cases of spectral overlap and the absence or inaccessibility of reference materials.
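
    As a rough illustration of the calibration idea described above, the following sketch runs scikit-learn's FastICA on simulated overlapping UV-vis bands and regresses the recovered component loadings against known concentrations. The band shapes, concentrations, and noise level are all invented for illustration; the published method's actual workflow is richer than this.

    ```python
    # Hedged sketch of ICA-based multicomponent quantitation on synthetic
    # Beer-Lambert mixtures of two overlapping Gaussian "analyte" bands.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    wl = np.linspace(200, 400, 500)                      # wavelength grid (nm)
    band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)
    pure = np.vstack([band(260, 15), band(280, 20)])     # pure-component spectra

    conc = rng.uniform(0.1, 1.0, size=(6, 2))            # calibration concentrations
    X = conc @ pure + 0.002 * rng.standard_normal((6, wl.size))

    ica = FastICA(n_components=2, random_state=0)
    scores = ica.fit_transform(X)                        # per-mixture loadings

    # ICA returns components in arbitrary order and sign, so match each score
    # column to the analyte it correlates with best before calibrating.
    corr = np.corrcoef(scores.T, conc.T)[:2, 2:]
    for k in range(2):
        j = int(np.argmax(np.abs(corr[k])))
        slope, intercept = np.polyfit(scores[:, k], conc[:, j], 1)
        print(f"ICA component {k} -> analyte {j}: conc ~ {slope:.3f}*score {intercept:+.3f}")
    ```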

  7. Understanding online health information: Evaluation, tools, and strategies.

    PubMed

    Beaunoyer, Elisabeth; Arsenault, Marianne; Lomanowska, Anna M; Guitton, Matthieu J

    2017-02-01

    Considering the status of the Internet as a prominent source of health information, assessing online health material has become a central issue in patient education. We describe the strategies available to evaluate the characteristics of online health information, including readability, emotional content, understandability, and usability. Popular tools used in the assessment of readability, emotional content, and comprehensibility of online health information were reviewed. Tools designed to evaluate both printed and online material were considered. Readability tools are widely used in online health material evaluation and are highly covariant. Assessment of emotional content of online health-related communications via sentiment analysis tools is becoming more popular. Understandability and usability tools have been developed specifically for health-related material, but each tool has important limitations and has been tested on a limited number of health issues. Despite the availability of numerous assessment tools, their overall reliability differs between readability (high) and understandability (low). Approaches combining multiple assessment tools and involving both quantitative and qualitative observations would optimize assessment strategies. Effective assessment of online health information should rely on mixed strategies combining quantitative and qualitative evaluations. Assessment tools should be selected according to their functional properties and compatibility with target material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
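
    The SUV normalization step mentioned in the Methods is standard enough to sketch. The following follows the usual body-weight SUV definition with decay correction to scan time; the numbers (dose, weight, uptake time) are illustrative assumptions, not values from the cited study.

    ```python
    # Hedged sketch: body-weight Standardized Uptake Value (SUVbw) for FDG.
    # SUV = tissue concentration / (decay-corrected injected dose / weight).
    import numpy as np

    def suv_bw(conc_bq_per_ml, injected_dose_bq, uptake_time_s,
               half_life_s=6586.2, body_weight_g=75000.0):   # F-18 half-life
        decayed_dose = injected_dose_bq * 0.5 ** (uptake_time_s / half_life_s)
        return conc_bq_per_ml * body_weight_g / decayed_dose

    voxels = np.array([1500.0, 8200.0, 24000.0])   # activity conc. (Bq/mL)
    print(suv_bw(voxels, injected_dose_bq=350e6, uptake_time_s=3600))
    ```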

  9. Quantitative determination of α-Arbutin, β-Arbutin, Kojic acid, nicotinamide, hydroquinone, resorcinol, 4-methoxyphenol, 4-ethoxyphenol and ascorbic acid from skin whitening products by HPLC-UV

    USDA-ARS?s Scientific Manuscript database

    Development of an analytical method for the simultaneous determination of multifarious skin whitening agents will provide an efficient tool to analyze skin whitening cosmetics. An HPLC-UV method was developed for quantitative analysis of six commonly used whitening agents, α-arbutin, β-arbutin, koji…

  10. The emotional coaching model: quantitative and qualitative research into relationships, communication and decisions in physical and sports rehabilitation

    PubMed Central

    RESPIZZI, STEFANO; COVELLI, ELISABETTA

    2015-01-01

    The emotional coaching model uses quantitative and qualitative elements to demonstrate some assumptions relevant to new methods of treatment in physical rehabilitation, considering emotional, cognitive and behavioral aspects in patients, whether or not they are sportsmen. Through quantitative tools (Tampa Kinesiophobia Scale, Emotional Interview Test, Previous Re-Injury Test, and reports on test scores) and qualitative tools (training contracts and relationships of emotional alliance or “contagion”), we investigate initial assumptions regarding: the presence of a cognitive and emotional mental state of impasse in patients at the beginning of the rehabilitation pathway; the curative value of the emotional alliance or “emotional contagion” relationship between healthcare provider and patient; the link between the patient’s pathology and type of contact with his own body and emotions; analysis of the psychosocial variables for the prediction of possible cases of re-injury for patients who have undergone or are afraid to undergo reconstruction of the anterior cruciate ligament (ACL). Although this approach is still in the experimental stage, the scores of the administered tests show the possibility of integrating quantitative and qualitative tools to investigate and develop a patient’s physical, mental and emotional resources during the course of his rehabilitation. Furthermore, it seems possible to identify many elements characterizing patients likely to undergo episodes of re-injury or to withdraw totally from sporting activity. In particular, such patients are competitive athletes, who fear or have previously undergone ACL reconstruction. The theories referred to (the transactional analysis theory, self-determination theory) and the tools used demonstrate the usefulness of continuing this research in order to build a shared coaching model treatment aimed at all patients, sportspeople or otherwise, which is not only physical but also emotional, cognitive and behavioral. PMID:26904525

  11. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    ERIC Educational Resources Information Center

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…

  12. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
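
    The batch-integration task the abstract describes is easy to mimic in outline. The toy sketch below sums pre-saved integration windows across several synthetic 1D spectra and writes a CSV table of the kind ImatraNMR emits; the file name, regions, and random "spectra" are invented, and this is not ImatraNMR's code.

    ```python
    # Hedged sketch: integrate fixed ppm windows over many spectra -> CSV.
    import csv
    import numpy as np

    rng = np.random.default_rng(0)
    ppm = np.linspace(10, 0, 2048)                   # descending chemical-shift axis
    dppm = abs(ppm[1] - ppm[0])                      # uniform grid spacing
    spectra = {f"sample_{i:03d}": rng.random(ppm.size) for i in range(5)}
    regions = {"signal_A": (8.2, 7.8), "signal_B": (2.6, 2.2)}   # ppm windows

    with open("integrals.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["spectrum"] + list(regions))
        for name, y in spectra.items():
            row = [name]
            for hi, lo in regions.values():
                mask = (ppm <= hi) & (ppm >= lo)
                row.append(y[mask].sum() * dppm)     # rectangle-rule integral
            writer.writerow(row)
    ```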

  13. Qualitative and Quantitative Pedigree Analysis: Graph Theory, Computer Software, and Case Studies.

    ERIC Educational Resources Information Center

    Jungck, John R.; Soderberg, Patti

    1995-01-01

    Presents a series of elementary mathematical tools for re-representing pedigrees, pedigree generators, pedigree-driven database management systems, and case studies for exploring genetic relationships. (MKR)

  14. Maturity Curve of Systems Engineering

    DTIC Science & Technology

    2008-12-01

    …the collection and analysis of data (Hart, 1998). A qualitative approach in acquiring and managing the data was used for this analysis. A quantitative tool was used to examine and evaluate the data. The qualitative approach was intended to sort the acquired traits…

  15. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication will present the advancement of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus, termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools in an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations in finding, accessing or estimating them will be presented alongside a reflection on the relation between analytical scales and data availability.

  16. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena

    PubMed Central

    Tohsato, Yukako; Ho, Kenneth H. L.; Kyoda, Koji; Onami, Shuichi

    2016-01-01

    Motivation: Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. Results: We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. Availability and Implementation: SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp PMID:27412095

  17. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena.

    PubMed

    Tohsato, Yukako; Ho, Kenneth H L; Kyoda, Koji; Onami, Shuichi

    2016-11-15

    Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp. © The Author 2016. Published by Oxford University Press.

  18. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    PubMed

    Xiao, Pengdong; Leng, Shuang; Zhao, Xiaodan; Zou, Hua; Tan, Ru San; Wong, Philip; Zhong, Liang

    2016-08-01

    The quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that can conduct AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built using the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.

  19. Quantitative Analysis Of Three-dimensional Branching Systems From X-ray Computed Microtomography Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, Adriana L.; Varga, Tamas

    Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a costless tool that approximates the surface and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen was used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface. This methodology of processing 3D data is applicable to other branching structures even when the structure of interest is of similar x-ray attenuation to its environment and difficulties arise with sample segmentation.
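
    The isosurface-based surface and volume estimate described above can be approximated with open-source tools in a few lines. The sketch below uses scikit-image's marching cubes on a synthetic binary volume; the sphere stands in for a segmented root, and the voxel spacing is an assumed value.

    ```python
    # Hedged sketch: surface area from a marching-cubes isosurface and
    # volume by voxel counting, on a synthetic segmented volume.
    import numpy as np
    from skimage import measure

    z, y, x = np.mgrid[:64, :64, :64]
    seg = ((z - 32)**2 + (y - 32)**2 + (x - 32)**2 < 20**2).astype(np.uint8)

    spacing = (0.05, 0.05, 0.05)                    # assumed voxel size (mm)
    verts, faces, _, _ = measure.marching_cubes(seg, level=0.5, spacing=spacing)
    surface_mm2 = measure.mesh_surface_area(verts, faces)
    volume_mm3 = seg.sum() * np.prod(spacing)

    print(f"surface ~ {surface_mm2:.1f} mm^2, volume ~ {volume_mm3:.2f} mm^3")
    ```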

  20. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    PubMed

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated on carbohydrate utilization by lactobacilli, and on visual inspection 94% of regressions were deemed excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
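
    The reported parameters can all be derived from a sliding-window fit on log-transformed OD, which is the usual dynamic-regression idea. The sketch below shows one way to do it on a synthetic logistic curve; the window length and test data are assumptions, and this is not the published tool's source.

    ```python
    # Hedged sketch: growth parameters from a sliding-window linear fit
    # on ln(OD). mu_max is the steepest window slope; doubling time, lag
    # time and delta OD follow from it.
    import numpy as np

    t = np.linspace(0, 12, 97)                          # hours
    od = 0.05 + 0.95 / (1 + np.exp(-1.2 * (t - 5)))     # synthetic growth curve

    ln_od, win = np.log(od), 8
    fits = [np.polyfit(t[i:i + win], ln_od[i:i + win], 1)
            for i in range(len(t) - win)]
    best = int(np.argmax([f[0] for f in fits]))
    mu_max, intercept = fits[best]                      # max specific growth rate (1/h)

    doubling_time = np.log(2) / mu_max
    delta_od = od.max() - od.min()
    lag_time = (ln_od[0] - intercept) / mu_max          # tangent meets baseline

    print(f"mu_max={mu_max:.3f}/h  Td={doubling_time:.2f} h  "
          f"lag={lag_time:.2f} h  dOD={delta_od:.3f}")
    ```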

  1. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
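
    The "graphical analysis" referred to above is commonly a Patlak plot, in which the influx constant Ki is the late-time slope of Ct/Cp against the normalized integral of the plasma input. The sketch below demonstrates the estimation on synthetic curves; the input function, tissue model, and the t* = 20 min linearity cutoff are invented assumptions, not the tool's implementation.

    ```python
    # Hedged sketch: Patlak graphical analysis on synthetic FDG curves.
    import numpy as np

    t = np.linspace(0.1, 60, 240)                               # minutes
    cp = 800 * np.exp(-0.3 * t) + 120 * np.exp(-0.01 * t)       # plasma input

    ki_true, v0 = 0.02, 0.4
    int_cp = np.concatenate(([0.0],
             np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t)))) # trapezoid integral
    ct = ki_true * int_cp + v0 * cp                             # tissue curve

    x, y = int_cp / cp, ct / cp                                 # Patlak coordinates
    late = t > 20                                               # assumed t*
    ki_est, intercept = np.polyfit(x[late], y[late], 1)
    print(f"estimated Ki = {ki_est:.4f}/min (true {ki_true})")
    ```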

  2. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  3. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (referred to as proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as the internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, as provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and nonparametric test at the 95% confidence level indicate that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for the quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application for the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used to evaluate generative models, where a large number of sequences generated by a model can be compared with actually observed sequences.
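
    Two of the characteristics named above, n-gram distributions and a Zipf's-law coefficient, are simple to re-implement generically. The sketch below is such a generic illustration and deliberately does not use Quantiprot's own API; the example sequence is arbitrary.

    ```python
    # Hedged sketch: n-gram counts and a Zipf's-law slope for a protein
    # sequence (log frequency vs log rank of the n-gram distribution).
    from collections import Counter
    import numpy as np

    def ngram_counts(seq, n=2):
        return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

    def zipf_coefficient(counts):
        freqs = np.sort(np.array(list(counts.values()), float))[::-1]
        ranks = np.arange(1, len(freqs) + 1)
        slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
        return slope

    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ"
    comp = ngram_counts(seq, 1)                 # amino-acid composition
    print(f"distinct bigrams: {len(ngram_counts(seq, 2))}, "
          f"Zipf slope ~ {zipf_coefficient(comp):.2f}")
    ```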

  5. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    PubMed

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
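
    The pairing of Euclidean metrics with Monte Carlo simulation can be illustrated schematically: compare an observed nearest-neighbour statistic against its distribution under random re-placement in the same volume. Everything in the sketch below (point counts, domain, statistic) is an assumption for illustration, not the published algorithms.

    ```python
    # Hedged sketch: mean 3D nearest-neighbour distance vs a Monte Carlo
    # null of uniformly random placement in the same domain.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    cells = rng.uniform(0, 100, size=(300, 3))      # "observed" centroids (um)

    def mean_nn(pts):
        d, _ = cKDTree(pts).query(pts, k=2)         # k=2: first hit is the point itself
        return d[:, 1].mean()

    observed = mean_nn(cells)
    null = [mean_nn(rng.uniform(0, 100, size=cells.shape)) for _ in range(200)]
    z = (observed - np.mean(null)) / np.std(null)
    print(f"mean NN distance {observed:.2f} um, z vs random {z:+.2f}")
    ```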

  6. Quantitative Imaging In Pathology (QUIP) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    This site hosts web-accessible applications, tools and data designed to support analysis, management, and exploration of whole slide tissue images for cancer research. The following tools are included: caMicroscope: a digital pathology data management and visualization platform that enables interactive viewing of whole slide tissue images and segmentation results. caMicroscope can also be used independently of QUIP. FeatureExplorer: an interactive tool to allow patient-level feature exploration across multiple dimensions.

  7. Using Image Modelling to Teach Newton's Laws with the Ollie Trick

    ERIC Educational Resources Information Center

    Dias, Marco Adriano; Carvalho, Paulo Simeão; Vianna, Deise Miranda

    2016-01-01

    Image modelling is a video-based teaching tool that combines strobe images and video analysis. This tool enables both a qualitative and a quantitative approach to the teaching of physics, in a much more engaging and appealing way than traditional expository practice. In a specific scenario shown in this paper, the Ollie trick, we…

  8. Evaluating biomarkers for prognostic enrichment of clinical trials.

    PubMed

    Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R

    2017-12-01

    A potential use of biomarkers is to assist in the prognostic enrichment of clinical trials, where only patients at relatively higher risk for an outcome of interest are eligible for the trial. We investigated methods for evaluating biomarkers for prognostic enrichment. We identified five key considerations when evaluating a biomarker and a screening threshold for prognostic enrichment: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency. We demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using the Biomarker Prognostic Enrichment Tool. The Biomarker Prognostic Enrichment Tool is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by the Biomarker Prognostic Enrichment Tool, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
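
    Consideration (1) can be made concrete with a standard two-proportion sample-size formula: enriching on a prognostic biomarker raises the control-arm event rate, shrinking the per-arm sample size for a fixed relative risk reduction, at the cost of screening more patients. The sketch below uses invented rates and is a back-of-the-envelope illustration, not the Biomarker Prognostic Enrichment Tool's R implementation.

    ```python
    # Hedged sketch: per-arm sample size with and without biomarker
    # enrichment, using the usual two-proportion normal approximation.
    from scipy.stats import norm

    def per_arm_n(p_control, rrr, alpha=0.05, power=0.8):
        p1, p2 = p_control, p_control * (1 - rrr)
        za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
        return (za + zb) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

    # (control-arm event rate, fraction of screened patients who are eligible)
    for event_rate, eligible in [(0.10, 1.00), (0.18, 0.40)]:
        n = per_arm_n(event_rate, rrr=0.25)
        print(f"control rate {event_rate:.0%}: {n:.0f}/arm, "
              f"~{2 * n / eligible:.0f} screened")
    ```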

  9. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of the differential expression of each protein is based on the standardized Z-statistic derived from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework and computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  11. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  12. The Use of Modelling for Theory Building in Qualitative Analysis

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  13. Approaches to the Analysis of School Costs, an Introduction.

    ERIC Educational Resources Information Center

    Payzant, Thomas

    A review and general discussion of quantitative and qualitative techniques for the analysis of economic problems outside of education is presented to help educators discover new tools for planning, allocating, and evaluating educational resources. The pamphlet covers some major components of cost accounting, cost effectiveness, cost-benefit…

  14. Addressing multi-label imbalance problem of surgical tool detection using CNN.

    PubMed

    Sahu, Manish; Mukhopadhyay, Anirban; Szengel, Angelika; Zachow, Stefan

    2017-06-01

    A fully automated surgical tool detection framework is proposed for endoscopic video streams. State-of-the-art surgical tool detection methods rely on supervised one-vs-all or multi-class classification techniques, completely ignoring the co-occurrence relationship of the tools and the associated class imbalance. In this paper, we formulate tool detection as a multi-label classification task where tool co-occurrences are treated as separate classes. In addition, imbalance on tool co-occurrences is analyzed and stratification techniques are employed to address the imbalance during convolutional neural network (CNN) training. Moreover, temporal smoothing is introduced as an online post-processing step to enhance runtime prediction. Quantitative analysis is performed on the M2CAI16 tool detection dataset to highlight the importance of stratification, temporal smoothing and the overall framework for tool detection. The analysis on tool imbalance, backed by the empirical results, indicates the need and superiority of the proposed framework over state-of-the-art techniques.
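
    The online temporal-smoothing step described above can be sketched as a trailing (causal) moving average over per-frame, per-tool probabilities before thresholding. The window length, number of tools, and synthetic probabilities below are assumptions for illustration, not the paper's configuration.

    ```python
    # Hedged sketch: causal moving-average smoothing of multi-label
    # per-frame tool probabilities, then thresholding to detections.
    import numpy as np

    def smooth_online(probs, window=5):
        out = np.empty_like(probs)
        for i in range(len(probs)):
            out[i] = probs[max(0, i - window + 1):i + 1].mean(axis=0)
        return out

    rng = np.random.default_rng(0)
    frame_probs = np.clip(rng.normal(0.5, 0.3, size=(100, 7)), 0, 1)  # 7 tools
    detections = smooth_online(frame_probs) > 0.5
    print(detections.sum(axis=0))          # frames in which each tool is detected
    ```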

  15. A Tutorial on Multiblock Discriminant Correspondence Analysis (MUDICA): A New Method for Analyzing Discourse Data from Clinical Populations

    ERIC Educational Resources Information Center

    Williams, Lynne J.; Abdi, Herve; French, Rebecca; Orange, Joseph B.

    2010-01-01

    Purpose: In communication disorders research, clinical groups are frequently described based on patterns of performance, but researchers often study only a few participants described by many quantitative and qualitative variables. These data are difficult to handle with standard inferential tools (e.g., analysis of variance or factor analysis)…

  16. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas

    2014-01-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: http://wiki.stat.ucla.edu/socr/index.php/SOCR_MotionCharts_CAOzoneData. Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analysis tools are demonstrated. Two specific human health-related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges. PMID:24465054

  18. Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity.

    PubMed

    Dinov, Ivo D; Christou, Nicolas

    2011-09-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: http://wiki.stat.ucla.edu/socr/index.php/SOCR_MotionCharts_CAOzoneData. Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analysis tools are demonstrated. Two specific human health-related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges.

  19. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2017-12-01

    An accurate prognostic tool to identify the severity of Arrhythmia is yet to be investigated, owing to the complexity of the ECG signal. In this paper, we have shown that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to studying various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of a non-linear approach following the "Rescaled Range Analysis" method. The quantitative parameter "Fractal Dimension" (D) is obtained from both types of time series. The major finding is that Arrhythmia ECG shows lower values of D compared to normal. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction of prognosis; adequate software may also be developed for use in medical practice.
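
    The rescaled-range procedure behind the paper is compact enough to sketch: the Hurst exponent H is the slope of log(R/S) against log(window size), and the fractal dimension follows as D = 2 - H. In the sketch below a white-noise signal stands in for an ECG series; window sizes and data are illustrative.

    ```python
    # Hedged sketch: Hurst exponent via rescaled-range (R/S) analysis,
    # then fractal dimension D = 2 - H.
    import numpy as np

    def hurst_rs(x, sizes=(8, 16, 32, 64, 128)):
        rs = []
        for n in sizes:
            segs = x[:len(x) // n * n].reshape(-1, n)
            dev = np.cumsum(segs - segs.mean(axis=1, keepdims=True), axis=1)
            r = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
            s = segs.std(axis=1)                    # per-segment std. deviation
            rs.append((r / s).mean())
        h, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
        return h

    x = np.random.default_rng(0).standard_normal(4096)   # stand-in "ECG" series
    h = hurst_rs(x)
    print(f"H ~ {h:.2f}, fractal dimension D = 2 - H ~ {2 - h:.2f}")
    ```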

  20. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2018-04-01

    An accurate prognostic tool to identify the severity of Arrhythmia is yet to be investigated, owing to the complexity of the ECG signal. In this paper, we have shown that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to studying various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of a non-linear approach following the "Rescaled Range Analysis" method. The quantitative parameter "Fractal Dimension" (D) is obtained from both types of time series. The major finding is that Arrhythmia ECG shows lower values of D compared to normal. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction of prognosis; adequate software may also be developed for use in medical practice.

  1. Advanced body composition assessment: from body mass index to body composition profiling.

    PubMed

    Borga, Magnus; West, Janne; Bell, Jimmy D; Harvey, Nicholas C; Romu, Thobias; Heymsfield, Steven B; Dahlqvist Leinhard, Olof

    2018-06-01

    This paper gives a brief overview of common non-invasive techniques for body composition analysis and a more in-depth review of a body composition assessment method based on fat-referenced quantitative MRI. Earlier published studies of this method are summarized, and a previously unpublished validation study, based on 4753 subjects from the UK Biobank imaging cohort and comparing the quantitative MRI method with dual-energy X-ray absorptiometry (DXA), is presented. For whole-body measurements of adipose tissue (AT) or fat and lean tissue (LT), DXA and quantitative MRI show excellent agreement, with linear correlations of 0.99 and 0.97 and coefficients of variation (CV) of 4.5 and 4.6 per cent for fat (computed from AT) and LT, respectively, but agreement was found to be significantly lower for visceral adipose tissue, with a CV of >20 per cent. The additional ability of MRI to also measure muscle volumes, muscle AT infiltration and ectopic fat, in combination with rapid scanning protocols and efficient image analysis tools, makes quantitative MRI a powerful tool for advanced body composition assessment. © American Federation for Medical Research (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Motion based parsing for video from observational psychology

    NASA Astrophysics Data System (ADS)

    Kokaram, Anil; Doyle, Erika; Lennon, Daire; Joyeux, Laurent; Fuller, Ray

    2006-01-01

    In Psychology it is common to conduct studies involving the observation of humans undertaking some task. The sessions are typically recorded on video and used for subjective visual analysis. The subjective analysis is tedious and time consuming, not only because much useless video material is recorded but also because subjective measures of human behaviour are not necessarily repeatable. This paper presents tools using content based video analysis that allow automated parsing of video from one such study involving Dyslexia. The tools rely on implicit measures of human motion that can be generalised to other applications in the domain of human observation. Results comparing quantitative assessment of human motion with subjective assessment are also presented, illustrating that the system is a useful scientific tool.

  3. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.

  4. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366

  5. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  6. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542

  7. TSCAN: Pseudo-time reconstruction and evaluation in single-cell RNA-seq analysis

    PubMed Central

    Ji, Zhicheng; Ji, Hongkai

    2016-01-01

    When analyzing single-cell RNA-seq data, constructing a pseudo-temporal path to order cells based on the gradual transition of their transcriptomes is a useful way to study gene expression dynamics in a heterogeneous cell population. Currently, a limited number of computational tools are available for this task, and quantitative methods for comparing different tools are lacking. Tools for Single Cell Analysis (TSCAN) is a software tool developed to better support in silico pseudo-Time reconstruction in Single-Cell RNA-seq ANalysis. TSCAN uses a cluster-based minimum spanning tree (MST) approach to order cells. Cells are first grouped into clusters and an MST is then constructed to connect cluster centers. Pseudo-time is obtained by projecting each cell onto the tree, and the ordered sequence of cells can be used to study dynamic changes of gene expression along the pseudo-time. Clustering cells before MST construction reduces the complexity of the tree space. This often leads to improved cell ordering. It also allows users to conveniently adjust the ordering based on prior knowledge. TSCAN has a graphical user interface (GUI) to support data visualization and user interaction. Furthermore, quantitative measures are developed to objectively evaluate and compare different pseudo-time reconstruction methods. TSCAN is available at https://github.com/zji90/TSCAN and as a Bioconductor package. PMID:27179027

  8. TSCAN: Pseudo-time reconstruction and evaluation in single-cell RNA-seq analysis.

    PubMed

    Ji, Zhicheng; Ji, Hongkai

    2016-07-27

    When analyzing single-cell RNA-seq data, constructing a pseudo-temporal path to order cells based on the gradual transition of their transcriptomes is a useful way to study gene expression dynamics in a heterogeneous cell population. Currently, a limited number of computational tools are available for this task, and quantitative methods for comparing different tools are lacking. Tools for Single Cell Analysis (TSCAN) is a software tool developed to better support in silico pseudo-Time reconstruction in Single-Cell RNA-seq ANalysis. TSCAN uses a cluster-based minimum spanning tree (MST) approach to order cells. Cells are first grouped into clusters and an MST is then constructed to connect cluster centers. Pseudo-time is obtained by projecting each cell onto the tree, and the ordered sequence of cells can be used to study dynamic changes of gene expression along the pseudo-time. Clustering cells before MST construction reduces the complexity of the tree space. This often leads to improved cell ordering. It also allows users to conveniently adjust the ordering based on prior knowledge. TSCAN has a graphical user interface (GUI) to support data visualization and user interaction. Furthermore, quantitative measures are developed to objectively evaluate and compare different pseudo-time reconstruction methods. TSCAN is available at https://github.com/zji90/TSCAN and as a Bioconductor package. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
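    TSCAN itself is an R/Bioconductor package; the following Python sketch reproduces only the core ordering idea (cluster the cells, then connect the cluster centers with a minimum spanning tree). Projection of individual cells onto the tree to obtain pseudo-time values is omitted, and all parameter choices are illustrative.

        from scipy.sparse.csgraph import minimum_spanning_tree
        from scipy.spatial.distance import cdist
        from sklearn.cluster import KMeans

        def cluster_mst_order(expr, n_clusters=5, seed=0):
            # expr: cells x features matrix (e.g. top principal components)
            km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit(expr)
            centers = km.cluster_centers_
            # MST over pairwise distances between cluster centers; clustering
            # first reduces the complexity of the tree space, as the abstract notes.
            mst = minimum_spanning_tree(cdist(centers, centers)).toarray()
            return mst, km.labels_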

  9. Using enterprise architecture to analyse how organisational structure impact motivation and learning

    NASA Astrophysics Data System (ADS)

    Närman, Pia; Johnson, Pontus; Gingnell, Liv

    2016-06-01

    When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the consequences of structural changes for performance. This article presents a model-based framework for quantitative analysis of the effect of organisational structure on organisational performance, in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.

  10. Quantitative 3D breast magnetic resonance imaging fibroglandular tissue analysis and correlation with qualitative assessments: a feasibility study.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-04-01

    The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm which combines region-based active contours and a level set approach. FGT (%) was calculated as: [segmented volume of FGT (mm³) / segmented volume of whole breast (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT corresponded to calculated mean quantitative FGT (%) values of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%) and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer-generated standardized measurements with limited intra- or inter-observer variability.
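    The FGT percentage itself is a one-line computation once the two segmentation volumes are in hand; the volumes come from the authors' in-house algorithm, which is not public. A minimal sketch:

        def fgt_percent(fgt_volume_mm3: float, breast_volume_mm3: float) -> float:
            # FGT (%) = segmented FGT volume / segmented whole-breast volume x 100
            return 100.0 * fgt_volume_mm3 / breast_volume_mm3

        print(fgt_percent(180_000, 950_000))   # ~18.9%, near the third category mean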

  11. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements for a software system for quantitative analysis of radiotherapy. We then present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, the "RTToolbox", following the presented design principles. The RTToolbox is available as an open-source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
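    One way to read "algorithmic decoupling via a dose iterator pattern" is that analysis algorithms consume dose values through an iterator interface and therefore stay independent of the storage backend. The sketch below illustrates the pattern in Python; the class and function names are illustrative and are not the RTToolbox API.

        from typing import Iterable, Iterator

        class DoseGrid:
            # Hypothetical dose source; could equally wrap a file or a service.
            def __init__(self, voxels):                # voxels: doses in Gy
                self._voxels = voxels
            def __iter__(self) -> Iterator[float]:     # the "dose iterator"
                return iter(self._voxels)

        def mean_dose(doses: Iterable[float]) -> float:
            # The algorithm sees only an iterable, never the storage layout.
            total, n = 0.0, 0
            for d in doses:
                total, n = total + d, n + 1
            return total / n

        print(mean_dose(DoseGrid([1.8, 2.0, 2.2, 0.0])))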

  12. Oufti: An integrated software package for high-accuracy, high-throughput quantitative microscopy analysis

    PubMed Central

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-01-01

    Summary With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  13. A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.

    PubMed

    Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain

    2015-10-01

    Clustering is a set of statistical learning techniques aimed at partitioning heterogeneous data into homogeneous groups, called clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Moreover, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies that allow quantitative and qualitative data to be handled simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
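    A minimal sketch of the projection idea as described, on synthetic data: run PCA on the quantitative block, then regress a binary-coded qualitative variable on the retained component scores. All sizes and names are placeholders, not the paper's data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        X_quant = rng.normal(size=(813, 6))       # quantitative measurements
        x_qual = rng.integers(0, 2, size=813)     # one binary-coded qualitative variable

        scores = PCA(n_components=3).fit_transform(X_quant)   # quantitative subspace
        proj = LinearRegression().fit(scores, x_qual)         # projection onto it
        print("R^2 of qualitative variable on PC subspace:", proj.score(scores, x_qual))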

  14. SedCT: MATLAB™ tools for standardized and quantitative processing of sediment core computed tomography (CT) data collected using a medical CT scanner

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.

    2017-08-01

    Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
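    SedCT is distributed as MATLAB code; the following Python analogue shows only the basic down-core HU step for a single DICOM slice, assuming the standard CT rescale tags are present. It is a sketch, not a port of SedCT, and requires the pydicom package.

        import pydicom

        def downcore_hu_profile(dicom_path, band_halfwidth=20):
            ds = pydicom.dcmread(dicom_path)
            # Convert stored pixel values to Hounsfield units via the rescale tags
            hu = ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)
            mid = hu.shape[1] // 2          # assume the core axis runs down the rows
            band = hu[:, mid - band_halfwidth: mid + band_halfwidth]
            return band.mean(axis=1)        # one mean HU value per depth row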

  15. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers: fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
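    Ratiometric comparison reduces, in its simplest form, to scaling a known standard's copy number by the measured intensity ratio; real analyses background-subtract and image standard and unknown under identical settings. A minimal sketch:

        def copies_from_ratio(i_unknown: float, i_standard: float,
                              n_standard: float) -> float:
            # N_unknown = (I_unknown / I_standard) * N_standard
            return (i_unknown / i_standard) * n_standard

        # A spot twice as bright as a 32-copy standard -> ~64 copies
        print(copies_from_ratio(2.0e4, 1.0e4, 32))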

  16. A review on recent developments in mass spectrometry instrumentation and quantitative tools advancing bacterial proteomics.

    PubMed

    Van Oudenhove, Laurence; Devreese, Bart

    2013-06-01

    Proteomics has evolved substantially since its early days, some 20 years ago. In this mini-review, we aim to provide an overview of general methodologies and more recent developments in mass spectrometric approaches used for relative and absolute quantitation of proteins. Enhanced sensitivity of mass spectrometers, together with improved sample preparation and protein fractionation methods, is resulting in more comprehensive analysis of proteomes. We also document some upcoming trends in quantitative proteomics, such as the use of label-free quantification methods. Hopefully, microbiologists will continue to explore proteomics as a tool in their research to understand the adaptation of microorganisms to their ever-changing environment. We encourage them to incorporate some of the described new developments in mass spectrometry to facilitate their analyses and improve the general knowledge of the fascinating world of microorganisms.

  17. Analysis of Vaginal Microbicide Film Hydration Kinetics by Quantitative Imaging Refractometry

    PubMed Central

    Rinehart, Matthew; Grab, Sheila; Rohan, Lisa; Katz, David; Wax, Adam

    2014-01-01

    We have developed a quantitative imaging refractometry technique, based on holographic phase microscopy, as a tool for investigating microscopic structural changes in water-soluble polymeric materials. Here we apply the approach to analyze the structural degradation of vaginal topical microbicide films due to water uptake. We implemented transmission imaging of 1-mm diameter film samples loaded into a flow chamber with a 1.5×2 mm field of view. After water was flooded into the chamber, interference images were captured and analyzed to obtain high resolution maps of the local refractive index and subsequently the volume fraction and mass density of film material at each spatial location. Here, we compare the hydration dynamics of a panel of films with varying thicknesses and polymer compositions, demonstrating that quantitative imaging refractometry can be an effective tool for evaluating and characterizing the performance of candidate microbicide film designs for anti-HIV drug delivery. PMID:24736376

  18. Analysis of vaginal microbicide film hydration kinetics by quantitative imaging refractometry.

    PubMed

    Rinehart, Matthew; Grab, Sheila; Rohan, Lisa; Katz, David; Wax, Adam

    2014-01-01

    We have developed a quantitative imaging refractometry technique, based on holographic phase microscopy, as a tool for investigating microscopic structural changes in water-soluble polymeric materials. Here we apply the approach to analyze the structural degradation of vaginal topical microbicide films due to water uptake. We implemented transmission imaging of 1-mm diameter film samples loaded into a flow chamber with a 1.5×2 mm field of view. After water was flooded into the chamber, interference images were captured and analyzed to obtain high resolution maps of the local refractive index and subsequently the volume fraction and mass density of film material at each spatial location. Here, we compare the hydration dynamics of a panel of films with varying thicknesses and polymer compositions, demonstrating that quantitative imaging refractometry can be an effective tool for evaluating and characterizing the performance of candidate microbicide film designs for anti-HIV drug delivery.

  19. Principles of Metamorphic Petrology

    NASA Astrophysics Data System (ADS)

    Williams, Michael L.

    2009-05-01

    The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.

  20. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    PubMed

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
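    The published pipeline is an ImageJ batch macro; purely as a schematic analogue, two of its phenotypes could be computed in Python with scikit-image as below. The thresholds are placeholders, not the paper's calibrated values.

        from skimage import io, measure

        def stb_phenotypes(path, lesion_thresh=120, pycnidia_thresh=60):
            gray = io.imread(path, as_gray=True) * 255
            leaf = gray < 250                        # leaf vs. white scanner background
            lesion = leaf & (gray < lesion_thresh)   # chlorotic/necrotic area
            n_pycnidia = measure.label(gray < pycnidia_thresh).max()  # dark blobs
            pct_lesion = 100.0 * lesion.sum() / leaf.sum()
            return pct_lesion, n_pycnidia / leaf.sum()   # density per leaf pixel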

  1. Method development towards qualitative and semi-quantitative analysis of multiple pesticides from food surfaces and extracts by desorption electrospray ionization mass spectrometry as a preselective tool for food control.

    PubMed

    Gerbig, Stefanie; Stern, Gerold; Brunn, Hubertus E; Düring, Rolf-Alexander; Spengler, Bernhard; Schulz, Sabine

    2017-03-01

    Direct analysis of fruit and vegetable surfaces is an important tool for in situ detection of food contaminants such as pesticides. We tested three different ways to prepare samples for the qualitative desorption electrospray ionization mass spectrometry (DESI-MS) analysis of 32 pesticides found on nine authentic fruits collected from food control. The best recovery rates for topically applied pesticides (88%) were obtained by analyzing the surface of a glass slide which had been rubbed against the surface of the food. The pesticide concentration in all samples was at or below the maximum residue level allowed. In addition to the high sensitivity of the method for qualitative analysis, quantitative or, at least, semi-quantitative information is needed in food control. We developed a DESI-MS method for the simultaneous determination of linear calibration curves of multiple pesticides of the same chemical class using normalization to one internal standard (ISTD). The method was first optimized for food extracts and subsequently evaluated for the quantification of pesticides in three authentic food extracts. Next, pesticides and the ISTD were applied directly onto food surfaces, and the corresponding calibration curves were obtained. The determination of linear calibration curves was still feasible, as demonstrated for three different food surfaces. This proof-of-principle method was used to simultaneously quantify two pesticides on an authentic sample, showing that the method developed could serve as a fast and simple preselective tool for the disclosure of pesticide regulation violations.
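    The calibration step reduces to normalizing each analyte signal to the internal standard and fitting a line against spiked concentration. A minimal sketch with made-up numbers:

        import numpy as np

        conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0])          # spiked levels, mg/kg
        signal = np.array([900, 1750, 4400, 8800, 17500.0])   # analyte intensity
        istd = np.array([10000, 9800, 10100, 9900, 10050.0])  # internal standard

        norm = signal / istd                       # ISTD-normalized response
        slope, intercept = np.polyfit(conc, norm, 1)
        unknown = 0.62                             # normalized response of a sample
        print("estimated concentration:", (unknown - intercept) / slope, "mg/kg")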

  2. Extracting Metrics for Three-dimensional Root Systems: Volume and Surface Analysis from In-soil X-ray Computed Tomography Data.

    PubMed

    Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas

    2016-04-26

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a free and efficient tool that approximates the surface area and volume of the root, regardless of its shape, from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, written in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are surface area (in mm²) and volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. A combination of XCT and open-source software proved powerful for noninvasively imaging plant root samples, segmenting root data, and extracting quantitative information from the 3D data. This methodology of processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and where segmentation is difficult.
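    imeshJ is a MATLAB application, but the two quantities it reports can be computed from any closed, consistently oriented triangle mesh: surface area as the sum of triangle areas, and volume via the divergence theorem as the sum of signed tetrahedron volumes. A sketch, with V as an array of vertices and F as an array of triangle vertex indices:

        import numpy as np

        def mesh_area_volume(V, F):
            a, b, c = V[F[:, 0]], V[F[:, 1]], V[F[:, 2]]
            cross = np.cross(b - a, c - a)
            area = 0.5 * np.linalg.norm(cross, axis=1).sum()   # mm^2
            # Signed tetrahedron volumes a.(b x c)/6, summed over all faces;
            # valid for a closed, consistently oriented mesh.
            volume = abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0  # mm^3
            return area, volume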

  3. PTMScout, a Web Resource for Analysis of High Throughput Post-translational Proteomics Studies

    PubMed Central

    Naegle, Kristen M.; Gymrek, Melissa; Joughin, Brian A.; Wagner, Joel P.; Welsch, Roy E.; Yaffe, Michael B.; Lauffenburger, Douglas A.; White, Forest M.

    2010-01-01

    The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu. PMID:20631208

  4. Standardized observation of neighbourhood disorder: does it work in Canada?

    PubMed Central

    2010-01-01

    Background There is a growing body of evidence that where you live is important to your health. Despite numerous previous studies investigating the relationship between neighbourhood deprivation (and structure) and residents' health, the precise nature of this relationship remains unclear. Relatively few investigations have relied on direct observation of neighbourhoods, while those that have were developed primarily in US settings. Evaluation of the transferability of such tools to other contexts is an important first step before applying such instruments to the investigation of health and well-being. This study evaluated the performance of a systematic social observational (SSO) tool (adapted from previous studies of American and British neighbourhoods) in a Canadian urban context. Methods This was a mixed-methods study. Quantitative SSO ratings and qualitative descriptions of 176 block faces were obtained in six Toronto neighbourhoods (4 low-income, and 2 middle/high-income) by trained raters. Exploratory factor analysis was conducted with the quantitative SSO ratings. Content analysis consisted of independent coding of qualitative data by three members of the research team to yield common themes and categories. Results Factor analysis identified three factors (physical decay/disorder, social accessibility, recreational opportunities), but only 'physical decay/disorder' reflected previous findings in the literature. Qualitative results (based on raters' fieldwork experiences) revealed the tool's shortcomings in capturing important features of the neighbourhoods under study, and informed interpretation of the quantitative findings. Conclusions This study tested the performance of an SSO tool in a Canadian context, which is an important initial step before applying it to the study of health and disease. The tool demonstrated important shortcomings when applied to six diverse Toronto neighbourhoods. The study's analyses challenge previously held assumptions (e.g. social 'disorder') regarding neighbourhood social and built environments. For example, neighbourhood 'order' has traditionally been assumed to be synonymous with a certain degree of homogeneity, however the neighbourhoods under study were characterized by high degrees of heterogeneity and low levels of disorder. Heterogeneity was seen as an appealing feature of a block face. Employing qualitative techniques with SSO represents a unique contribution, enhancing both our understanding of the quantitative ratings obtained and of neighbourhood characteristics that are not currently captured by such instruments. PMID:20146821

  5. Developing Interactional Competence through Video-Based Computer-Mediated Conversations: Beginning Learners of Spanish

    ERIC Educational Resources Information Center

    Tecedor Cabrero, Marta

    2013-01-01

    This dissertation examines the discourse produced by beginning learners of Spanish using social media. Specifically, it looks at the use and development of interactional resources during two video-mediated conversations. Through a combination of Conversation Analysis tools and quantitative data analysis, the use of turn-taking strategies, repair…

  6. Selection and validation of endogenous reference genes for qRT-PCR analysis in leafy spurge (Euphorbia esula)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is the most important tool in measuring levels of gene expression due to its accuracy, specificity, and sensitivity. However, the accuracy of qRT-PCR analysis strongly depends on transcript normalization using stably expressed reference gene...

  7. A Taiwanese Mandarin Main Concept Analysis (TM-MCA) for Quantification of Aphasic Oral Discourse

    ERIC Educational Resources Information Center

    Kong, Anthony Pak-Hin; Yeh, Chun-Chih

    2015-01-01

    Background: Various quantitative systems have been proposed to examine aphasic oral narratives in English. A clinical tool for assessing discourse produced by Cantonese-speaking persons with aphasia (PWA), namely Main Concept Analysis (MCA), was developed recently for quantifying the presence, accuracy and completeness of a narrative. Similar…

  8. Multivariate analysis, mass balance techniques, and statistical tests as tools in igneous petrology: application to the Sierra de las Cruces volcanic range (Mexican Volcanic Belt).

    PubMed

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas with respect to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
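    Two of the quantitative steps in schematic form, with placeholder arrays standing in for the major- and trace-element data: Ward-linkage clustering of samples, and a two-end-member mixing estimate by least squares.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        X = np.random.default_rng(1).normal(size=(40, 8))    # samples x elements
        groups = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
        print("cluster sizes:", np.bincount(groups)[1:])

        # Two-end-member mixing: solve mixed = f*dacite + (1-f)*andesite for f
        dacite, andesite = X[0], X[1]
        mixed = 0.7 * dacite + 0.3 * andesite
        f = np.linalg.lstsq((dacite - andesite)[:, None],
                            mixed - andesite, rcond=None)[0][0]
        print("dacitic fraction:", round(float(f), 2))       # ~0.70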

  9. Redefining the Breast Cancer Exosome Proteome by Tandem Mass Tag Quantitative Proteomics and Multivariate Cluster Analysis.

    PubMed

    Clark, David J; Fondrie, William E; Liao, Zhongping; Hanson, Phyllis I; Fulton, Amy; Mao, Li; Yang, Austin J

    2015-10-20

    Exosomes are microvesicles of endocytic origin constitutively released by multiple cell types into the extracellular environment. With evidence that exosomes can be detected in the blood of patients with various malignancies, the development of a platform that uses exosomes as a diagnostic tool has been proposed. However, it has been difficult to truly define the exosome proteome due to the challenge of discerning contaminant proteins that may be identified via mass spectrometry using various exosome enrichment strategies. To better define the exosome proteome in breast cancer, we combined a Tandem Mass Tag (TMT) quantitative proteomics approach with Support Vector Machine (SVM) cluster analysis of three conditioned-media-derived fractions corresponding to a 10,000g cellular debris pellet, a 100,000g crude exosome pellet, and an OptiPrep-enriched exosome pellet. The quantitative analysis identified 2,179 proteins in all three fractions, with known exosomal cargo proteins displaying at least a 2-fold enrichment in the exosome fraction based on the TMT protein ratios. Employing SVM cluster analysis allowed for the classification of 251 proteins as "true" exosomal cargo proteins. This study provides a robust and rigorous framework for the future development of exosomes as a potential multiprotein marker phenotyping tool that could be useful in breast cancer diagnosis and monitoring of disease progression.
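    A schematic of the classification step, with synthetic stand-ins for the TMT data: train an SVM on enrichment ratios of proteins with known status, then predict the status of the remainder. Feature definitions and numbers are illustrative only.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        # log2 TMT ratios (exosome/debris, exosome/crude) for proteins of known status
        cargo = rng.normal(1.5, 0.5, (100, 2))     # enriched in the exosome fraction
        contam = rng.normal(-0.5, 0.5, (100, 2))   # depleted contaminants
        X = np.vstack([cargo, contam])
        y = np.array([1] * 100 + [0] * 100)

        clf = SVC(kernel="rbf").fit(X, y)
        print(clf.predict([[2.0, 1.8], [-1.0, 0.1]]))   # -> [1 0]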

  10. Recent Advances in Nanobiotechnology and High-Throughput Molecular Techniques for Systems Biomedicine

    PubMed Central

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-01-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011

  11. Recent advances in nanobiotechnology and high-throughput molecular techniques for systems biomedicine.

    PubMed

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-12-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings.

  12. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength-dependent variables; even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that addresses the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. The tool is available at http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexed imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
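    The kind of computation SearchLight automates can be sketched as a wavelength-wise product of source, filter, fluorophore and detector spectra integrated over wavelength, in the spirit of the framework cited in [1]. All spectra below are synthetic placeholders.

        import numpy as np

        wl = np.linspace(400, 700, 301)                      # wavelength grid, nm
        dwl = wl[1] - wl[0]
        gauss = lambda mu, s: np.exp(-0.5 * ((wl - mu) / s) ** 2)

        source = gauss(470, 15)                              # LED-like source
        exc_filter = (np.abs(wl - 470) < 20).astype(float)   # ideal bandpass
        absorption = gauss(490, 20)                          # fluorophore excitation
        emission = gauss(520, 20)                            # fluorophore emission
        em_filter = (np.abs(wl - 525) < 25).astype(float)
        detector_qe = 0.8                                    # flat detector response

        excited = (source * exc_filter * absorption).sum() * dwl
        collected = (emission * em_filter * detector_qe).sum() * dwl
        print("relative signal:", excited * collected)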

  13. Methods, Tools and Current Perspectives in Proteogenomics

    PubMed Central

    Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.

    2017-01-01

    With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies have identified novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches and in this article, we systematically classify published methods and tools into four major categories, (1) Sequence-centric proteogenomics; (2) Analysis of proteogenomic relationships; (3) Integrative modeling of proteogenomic data; and (4) Data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751

  14. Sentiment Analysis of Health Care Tweets: Review of the Methods Used.

    PubMed

    Gohil, Sunir; Vuik, Sabine; Darzi, Ara

    2018-04-23

    Twitter is a microblogging service where users can send and read short 140-character messages called "tweets." Many unstructured, free-text tweets relating to health care are shared on Twitter, which is becoming a popular medium for health care research. Sentiment is a metric commonly used to investigate the positive or negative opinion within these messages. Exploring the methods used for sentiment analysis in Twitter health care research may allow us to better understand the options available for future research in this growing field. The first objective of this study was to understand which tools are available for sentiment analysis in Twitter health care research, by reviewing existing studies in this area and the methods they used. The second objective was to determine which method would work best in health care settings, by analyzing how the methods were used to answer specific health care questions, how they were produced, and how their accuracy was analyzed. A review of the literature was conducted pertaining to Twitter and health care research that used a quantitative method of sentiment analysis for the free-text messages (tweets). The study compared the types of tools used in each case and examined methods for tool production, tool training, and analysis of accuracy. A total of 12 papers studying the quantitative measurement of sentiment in the health care setting were found. More than half of these studies produced tools specifically for their research, 4 used freely available open-source tools, and 2 used commercially available software. Moreover, 4 of the 12 tools were trained using a smaller sample of the study's final data. The sentiment methods were trained against, on average, 0.45% (2816/627,024) of the total sample data. Only one of the 12 papers commented on the accuracy of the tool used. Multiple methods are used for sentiment analysis of tweets in the health care setting. These range from self-produced basic categorizations to more complex and expensive commercial software. The open-source and commercial methods were developed on product reviews and generic social media messages. None of these methods has been extensively tested against a corpus of health care messages to check their accuracy. This study suggests that there is a need for an accurate and tested tool for sentiment analysis of tweets, trained using a health care setting-specific corpus of manually annotated tweets. ©Sunir Gohil, Sabine Vuik, Ara Darzi. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 23.04.2018.

  15. Manual on performance of traffic signal systems: assessment of operations and maintenance : [summary].

    DOT National Transportation Integrated Search

    2017-05-01

    In this project, Florida Atlantic University researchers developed a methodology and software tools that allow objective, quantitative analysis of the performance of signal systems. The researchers surveyed the state of practice for traffic signal ...

  16. Extracting microtubule networks from superresolution single-molecule localization microscopy data

    PubMed Central

    Zhang, Zhen; Nishimura, Yukako; Kanchanawong, Pakorn

    2017-01-01

    Microtubule filaments form ubiquitous networks that specify spatial organization in cells. However, quantitative analysis of microtubule networks is hampered by their complex architecture, limiting insights into the interplay between their organization and cellular functions. Although superresolution microscopy has greatly facilitated high-resolution imaging of microtubule filaments, extraction of complete filament networks from such data sets is challenging. Here we describe a computational tool for automated retrieval of microtubule filaments from single-molecule-localization–based superresolution microscopy images. We present a user-friendly, graphically interfaced implementation and a quantitative analysis of microtubule network architecture phenotypes in fibroblasts. PMID:27852898

  17. Highly sensitive transient absorption imaging of graphene and graphene oxide in living cells and circulating blood.

    PubMed

    Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N; Chen, Yong P; Cheng, Ji-Xin; Yang, Chen

    2015-07-23

    We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Forward and backward imaging of graphene on various substrates under ambient conditions was performed at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO in vitro, with the capability of quantitative analysis of intracellular concentration, and ex vivo in circulating blood was demonstrated. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials.

  18. ADC as a useful diagnostic tool for differentiating benign and malignant vertebral bone marrow lesions and compression fractures: a systematic review and meta-analysis.

    PubMed

    Suh, Chong Hyun; Yun, Seong Jong; Jin, Wook; Lee, Sun Hwa; Park, So Young; Ryu, Chang-Woo

    2018-07-01

    To assess the sensitivity and specificity of quantitative assessment of the apparent diffusion coefficient (ADC) for differentiating benign and malignant vertebral bone marrow lesions (BMLs) and compression fractures (CFs), an electronic literature search of MEDLINE and EMBASE was conducted. Bivariate modelling and hierarchical summary receiver operating characteristic modelling were performed to evaluate the diagnostic performance of ADC for differentiating vertebral BMLs. Subgroup analysis was performed for differentiating benign and malignant vertebral CFs. Meta-regression analyses according to subject, study and diffusion-weighted imaging (DWI) characteristics were performed. Twelve eligible studies (748 lesions, 661 patients) were included. The ADC exhibited a pooled sensitivity of 0.89 (95% confidence interval [CI] 0.80-0.94) and a pooled specificity of 0.87 (95% CI 0.78-0.93) for differentiating benign and malignant vertebral BMLs. In addition, the pooled sensitivity and specificity for differentiating benign and malignant CFs were 0.92 (95% CI 0.82-0.97) and 0.91 (95% CI 0.87-0.94), respectively. In the meta-regression analysis, the DWI slice thickness was a significant factor affecting heterogeneity (p < 0.01); thinner slices (< 5 mm) showed higher specificity (95%) than thicker slices (81%). Quantitative assessment of ADC is a useful diagnostic tool for differentiating benign and malignant vertebral BMLs and CFs. • Quantitative assessment of ADC is useful in differentiating vertebral BMLs. • Quantitative ADC assessment for BMLs had sensitivity of 89%, specificity of 87%. • Quantitative ADC assessment for CFs had sensitivity of 92%, specificity of 91%. • The specificity is highest (95%) with thinner (< 5 mm) DWI slice thickness.

  19. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the field to develop more powerful analytical methods to obtain more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were applied to the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage us to use these methods for quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.
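    The two calibrations side by side on synthetic spectra: PLS chooses latent variables for covariance with the concentrations, while PCR regresses on principal components of the spectra alone. Component counts and data are placeholders, not the paper's measurements.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(3)
        conc = rng.uniform(5, 50, size=(30, 2))                  # ZID, LAM amounts
        bands = rng.normal(size=(2, 100))                        # pure-component spectra
        spectra = conc @ bands + rng.normal(0, 0.1, (30, 100))   # mixtures + noise

        pls = PLSRegression(n_components=4).fit(spectra, conc)
        scores = PCA(n_components=4).fit_transform(spectra)
        pcr = LinearRegression().fit(scores, conc)
        print("PLS R^2:", pls.score(spectra, conc))
        print("PCR R^2:", pcr.score(scores, conc))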

  20. Application of near infrared spectroscopy to the analysis and fast quality assessment of traditional Chinese medicinal products

    PubMed Central

    Zhang, Chao; Su, Jinghua

    2014-01-01

    Near infrared spectroscopy (NIRS) has been widely applied in both qualitative and quantitative analysis. There is growing interest in its application to traditional Chinese medicine (TCM) and a review of recent developments in the field is timely. To present an overview of recent applications of NIRS to the identification, classification and analysis of TCM products, studies describing the application of NIRS to TCM products are classified into those involving qualitative and quantitative analysis. In addition, the application of NIRS to the detection of illegal additives and the rapid assessment of quality of TCMs by fast inspection are also described. This review covers over 100 studies emphasizing the application of NIRS in different fields. Furthermore, basic analytical principles and specific examples are used to illustrate the feasibility and effectiveness of NIRS in pattern identification. NIRS provides an effective and powerful tool for the qualitative and quantitative analysis of TCM products. PMID:26579382

  1. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data, we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups, as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable, high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  2. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data, we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups, as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable, high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  3. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics

    PubMed Central

    Lavallée-Adam, Mathieu

    2017-01-01

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334
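    As a generic illustration of annotation enrichment (not PSEA-Quant's actual algorithm, which follows a GSEA-like protein set enrichment protocol), a hypergeometric tail test asks whether an annotation is over-represented among high-abundance proteins:

        from scipy.stats import hypergeom

        N_total, K_annot = 5000, 120   # quantified proteins; those carrying the GO term
        n_top, k_hit = 400, 25         # high-abundance set; annotated proteins within it

        p = hypergeom.sf(k_hit - 1, N_total, K_annot, n_top)   # P(X >= k_hit)
        print(f"enrichment p-value: {p:.2e}")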

  4. Noninvasive characterization of the fission yeast cell cycle by monitoring dry mass with digital holographic microscopy.

    PubMed

    Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre

    2009-01-01

    Digital holographic microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
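
    The phase-to-dry-mass conversion underlying such measurements goes back to Barer's relation: the dry mass surface density is proportional to the measured phase shift divided by the specific refractive increment of cellular dry matter (commonly taken as about 0.2 ml/g). A minimal Python sketch, assuming a 632.8 nm source, a 100 nm pixel pitch and an invented phase image:

      # Sketch: DHM phase image to dry mass via Barer's relation (assumed constants).
      import numpy as np

      wavelength_m = 632.8e-9          # assumed laser wavelength
      alpha_m3_per_kg = 2.0e-4         # 0.2 ml/g expressed in SI units
      pixel_area_m2 = (0.1e-6) ** 2    # hypothetical 100 nm pixel pitch

      phase = np.full((50, 50), 1.0)   # invented phase image, radians

      # Dry mass surface density (kg/m^2) and total dry mass (pg).
      sigma = wavelength_m * phase / (2 * np.pi * alpha_m3_per_kg)
      total_pg = sigma.sum() * pixel_area_m2 * 1e15
      print(f"~{sigma.mean() * 1000:.2f} pg/um^2 per pixel; total {total_pg:.1f} pg")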

  5. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
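
    In its simplest form, calibration with pure standards reduces to fitting the detector response against known concentrations and inverting the fit for unknowns; matrix-matched and isotope-dilution schemes add corrections on top of this. A minimal sketch with invented count rates:

      # External calibration for one isotope (invented intensities).
      import numpy as np

      conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0])           # ug/L standards
      counts = np.array([120., 980., 4870., 9700., 48300.])  # counts/s

      slope, intercept = np.polyfit(conc, counts, 1)          # calibration line
      sample_counts = 23150.0
      print(f"sample: {(sample_counts - intercept) / slope:.1f} ug/L")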

  6. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. A quantitative evaluation of the efficiency and performance of the CTSA program carries significant reference value for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  7. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. A quantitative evaluation of the efficiency and performance of the CTSA program carries significant reference value for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  8. Teaching bioinformatics and neuroinformatics by using free web-based tools.

    PubMed

    Grisham, William; Schottler, Natalie A; Valli-Marill, Joanne; Beck, Lisa; Beatty, Jackson

    2010-01-01

    The purpose of this completely computer-based module is to introduce students to bioinformatics resources. We present an easy-to-adopt module that weaves together several important bioinformatic tools so students can grasp how these tools are used in answering research questions. Students integrate information gathered from websites dealing with anatomy (Mouse Brain Library), quantitative trait locus analysis (WebQTL from GeneNetwork), bioinformatics and gene expression analyses (University of California, Santa Cruz Genome Browser, National Center for Biotechnology Information's Entrez Gene, and the Allen Brain Atlas), and information resources (PubMed). Instructors can use these various websites in concert to teach genetics from the phenotypic level to the molecular level, aspects of neuroanatomy and histology, statistics, quantitative trait locus analysis, and molecular biology (including in situ hybridization and microarray analysis), and to introduce bioinformatic resources. Students use these resources to discover 1) the region(s) of chromosome(s) influencing the phenotypic trait, 2) a list of candidate genes, narrowed by expression data, 3) the in situ pattern of a given gene in the region of interest, 4) the nucleotide sequence of the candidate gene, and 5) articles describing the gene. Teaching materials such as a detailed student/instructor's manual, PowerPoints, sample exams, and links to free Web resources can be found at http://mdcune.psych.ucla.edu/modules/bioinformatics.

  9. Visual analysis of variance: a tool for quantitative assessment of fMRI data processing and analysis.

    PubMed

    McNamee, R L; Eddy, W F

    2001-12-01

    Analysis of variance (ANOVA) is widely used for the study of experimental data. Here, the reach of this tool is extended to cover the preprocessing of functional magnetic resonance imaging (fMRI) data. This technique, termed visual ANOVA (VANOVA), provides both numerical and pictorial information to aid the user in understanding the effects of various parts of the data analysis. Unlike a formal ANOVA, this method does not depend on the mathematics of orthogonal projections or strictly additive decompositions. An illustrative example is presented and the application of the method to a large number of fMRI experiments is discussed. Copyright 2001 Wiley-Liss, Inc.

  10. Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis

    PubMed Central

    Razi Naqvi, K.

    2014-01-01

    Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspensions of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens’ theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells. PMID:24761307
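
    Duysens' flattening (sieve-effect) argument can be stated compactly in a simplified monolayer form, given here from the classic treatment with invented numbers: if a fraction f of the beam cross-section is occupied by particles of absorbance A_p, the suspension transmits (1 - f) + f*10^(-A_p), so its measured absorbance falls below the value f*A_p obtained when the same pigment is uniformly dispersed, which is the hypochromism at issue.

      # Duysens-style sieve effect: suspension vs lysed (dispersed) absorbance.
      import numpy as np

      def suspension_absorbance(f, A_p):
          # Transmitted fraction = clear beam + attenuated beam through cells.
          T = (1 - f) + f * 10 ** (-A_p)
          return -np.log10(T)

      f, A_p = 0.4, 1.5   # invented coverage fraction and per-cell absorbance
      print(f"suspension: {suspension_absorbance(f, A_p):.3f}")  # ~0.213
      print(f"lysed:      {f * A_p:.3f}")                        # 0.600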

  11. Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis.

    PubMed

    Razi Naqvi, K

    2014-04-01

    Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspensions of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens' theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells.

  12. Analysis of Student Activity in Web-Supported Courses as a Tool for Predicting Dropout

    ERIC Educational Resources Information Center

    Cohen, Anat

    2017-01-01

    Persistence in learning processes is perceived as a central value; therefore, dropouts from studies are a prime concern for educators. This study focuses on the quantitative analysis of data accumulated on 362 students in three academic course website log files in the disciplines of mathematics and statistics, in order to examine whether student…

  13. Quantitative three-dimensional microtextural analyses of tooth wear as a tool for dietary discrimination in fishes

    PubMed Central

    Purnell, Mark; Seehausen, Ole; Galis, Frietson

    2012-01-01

    Resource polymorphisms and competition for resources are significant factors in speciation. Many examples come from fishes, and cichlids are of particular importance because of their role as model organisms at the interface of ecology, development, genetics and evolution. However, analysis of trophic resource use in fishes can be difficult and time-consuming, and for fossil fish species it is particularly problematic. Here, we present evidence from cichlids that analysis of tooth microwear based on high-resolution (sub-micrometre scale) three-dimensional data and new ISO standards for quantification of surface textures provides a powerful tool for dietary discrimination and investigation of trophic resource exploitation. Our results suggest that three-dimensional approaches to analysis offer significant advantages over two-dimensional operator-scored methods of microwear analysis, including applicability to rough tooth surfaces that lack distinct scratches and pits. Tooth microwear textures develop over a longer period of time than is represented by stomach contents, and analyses based on textures are less prone to biases introduced by opportunistic feeding. They are more sensitive to subtle dietary differences than isotopic analysis. Quantitative textural analysis of tooth microwear has a useful role to play, complementing existing approaches, in trophic analysis of fishes—both extant and extinct. PMID:22491979

  14. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. With the use of soft modelling techniques such as quantitative structure-activity relationship/quantitative structure-property relationship (QSAR/QSPR) after a suitable description of their molecular structure, we have studied a series of phosphonic acids for designing new MRI contrast agents. QSPR studies with multiple linear regression analysis were applied to find correlations between different calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final QSPR models for the phosphonic acid series were: Model 1: log K(ML) = 5.00243(±0.7102) − 0.0263(±0.540)·MR, with n = 12, |r| = 0.942, s = 0.183, F = 99.165; Model 2: log K(ML) = 5.06280(±0.3418) − 0.0252(±0.198)·MR, with n = 12, |r| = 0.956, s = 0.186, F = 99.256.
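
    A one-descriptor model of this form is an ordinary least-squares fit of the stability constant against molar refractivity (MR). A minimal sketch with invented descriptor values (not the paper's data):

      # One-descriptor QSPR fit: log K(ML) ~ MR (invented data).
      import numpy as np

      MR = np.array([30.1, 42.7, 55.3, 61.0, 78.4, 90.2])
      logK = np.array([4.2, 3.9, 3.6, 3.5, 3.0, 2.7])

      slope, intercept = np.polyfit(MR, logK, 1)
      r = np.corrcoef(MR, logK)[0, 1]
      print(f"logK = {intercept:.3f} + ({slope:.4f})*MR, |r| = {abs(r):.3f}")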

  15. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    PubMed

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool, and recent methodological and technological developments have also made possible the extraction of quantitative data on protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and, therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI, namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases, and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and Quality Control. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Semi-quantitative analysis of salivary gland scintigraphy in Sjögren's syndrome diagnosis: a first-line tool.

    PubMed

    Angusti, Tiziana; Pilati, Emanuela; Parente, Antonella; Carignola, Renato; Manfredi, Matteo; Cauda, Simona; Pizzigati, Elena; Dubreuil, Julien; Giammarile, Francesco; Podio, Valerio; Skanjeti, Andrea

    2017-09-01

    The aim of this study was to assess semi-quantitative salivary gland dynamic scintigraphy (SGdS) parameters, both independently and in an integrated way, in order to predict primary Sjögren's syndrome (pSS). Forty-six consecutive patients (41 females; age 61 ± 11 years) with sicca syndrome were studied by SGdS after injection of 200 MBq of pertechnetate. In sixteen patients, pSS was diagnosed according to American-European Consensus Group criteria (AECGc). Semi-quantitative parameters (uptake (UP) and excretion fraction (EF)) were obtained for each gland. ROC curves were used to determine the best cut-off values. The area under the curve (AUC) was used to estimate the accuracy of each semi-quantitative analysis. To assess the correlation between scintigraphic results and disease severity, semi-quantitative parameters were plotted versus the Sjögren's syndrome disease activity index (ESSDAI). A nomogram was built to perform an integrated evaluation of all the scintigraphic semi-quantitative data. Both UP and EF of the salivary glands were significantly lower in pSS patients compared to those in non-pSS patients (p < 0.001). ROC curves showed significantly large AUCs for both parameters (p < 0.05). Parotid UP and submandibular EF, assessed by univariate and multivariate logistic regression, showed a significant and independent correlation with pSS diagnosis (p < 0.05). No correlation was found between SGdS semi-quantitative parameters and ESSDAI. The accuracy of the proposed nomogram was 87%. SGdS is an accurate and reproducible tool for the diagnosis of pSS. ESSDAI was not shown to be correlated with SGdS data. SGdS should be the first-line imaging technique in patients with suspected pSS.

  17. A Pilot Study of the Noninvasive Assessment of the Lung Microbiota as a Potential Tool for the Early Diagnosis of Ventilator-Associated Pneumonia

    PubMed Central

    Brady, Jacob S.; Romano-Keeler, Joann; Drake, Wonder P.; Norris, Patrick R.; Jenkins, Judith M.; Isaacs, Richard J.; Boczko, Erik M.

    2015-01-01

    BACKGROUND: Ventilator-associated pneumonia (VAP) remains a common complication in critically ill surgical patients, and its diagnosis remains problematic. Exhaled breath contains aerosolized droplets that reflect the lung microbiota. We hypothesized that exhaled breath condensate fluid (EBCF) in hygroscopic condenser humidifier/heat and moisture exchanger (HCH/HME) filters would contain bacterial DNA that qualitatively and quantitatively correlate with pathogens isolated from quantitative BAL samples obtained for clinical suspicion of pneumonia. METHODS: Forty-eight adult patients who were mechanically ventilated and undergoing quantitative BAL (n = 51) for suspected pneumonia in the surgical ICU were enrolled. Per protocol, patients fulfilling VAP clinical criteria undergo quantitative BAL bacterial culture. Immediately prior to BAL, time-matched HCH/HME filters were collected for study of EBCF by real-time polymerase chain reaction. Additionally, convenience samples of serially collected filters in patients with BAL-diagnosed VAP were analyzed. RESULTS: Forty-nine of 51 time-matched EBCF/BAL fluid samples were fully concordant (concordance > 95% by κ statistic) relative to identified pathogens and strongly correlated with clinical cultures. Regression analysis of quantitative bacterial DNA in paired samples revealed a statistically significant positive correlation (r = 0.85). In a convenience sample, qualitative and quantitative polymerase chain reaction analysis of serial HCH/HME samples for bacterial DNA demonstrated an increase in load that preceded the suspicion of pneumonia. CONCLUSIONS: Bacterial DNA within EBCF demonstrates a high correlation with BAL fluid and clinical cultures. Bacterial DNA within EBCF increases prior to the suspicion of pneumonia. Further study of this novel approach may allow development of a noninvasive tool for the early diagnosis of VAP. PMID:25474571

  18. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the EPISNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
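
    The flavor of the pairwise test can be conveyed with a crude stand-in: a two-way ANOVA with a genotype × genotype interaction term. This is only a sketch on simulated data; the extended Kempthorne model partitions the interaction further into the four orthogonal epistasis components listed above, which a plain ANOVA does not.

      # Sketch: two-locus interaction as a plain two-way ANOVA (simulated data).
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(1)
      n = 500
      df = pd.DataFrame({
          "snp1": rng.integers(0, 3, n),   # genotypes coded 0/1/2
          "snp2": rng.integers(0, 3, n),
      })
      # Invented trait with a built-in two-locus interaction effect.
      df["y"] = (df.snp1 == 2) * (df.snp2 == 2) * 1.5 + rng.normal(size=n)

      fit = smf.ols("y ~ C(snp1) * C(snp2)", data=df).fit()
      print(anova_lm(fit, typ=2))   # interaction row ~ two-locus epistasis test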

  19. Descriptive approaches to landscape analysis

    Treesearch

    R. Burton Litton Jr.

    1979-01-01

    Descriptive landscape analyses include various procedures used to document visual/scenic resources. Historic and regional examples of landscape description represent desirable insight for contemporary professional inventory work. Routed and areal landscape inventories are discussed as basic tools. From them, qualitative and quantitative evaluations can be developed...

  20. REACTIVE MINERALS IN AQUIFERS: FORMATION PROCESSES AND QUANTITATIVE ANALYSIS

    EPA Science Inventory

    The presentation will focus on the occurrence, form, and characterization of reactive iron minerals in aquifers and soils. The potential for abiotic reductive transformations of contaminants at the mineral-water interface will be discussed along with available tools for site min...

  1. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  2. qSR: a quantitative super-resolution analysis tool reveals the cell-cycle dependent organization of RNA Polymerase I in live human cells.

    PubMed

    Andrews, J O; Conway, W; Cho, W -K; Narayanan, A; Spille, J -H; Jayanth, N; Inoue, T; Mullen, S; Thaler, J; Cissé, I I

    2018-05-09

    We present qSR, an analytical tool for the quantitative analysis of single-molecule-based super-resolution data. The software is created as an open-source platform integrating multiple algorithms for rigorous spatial and temporal characterization of protein clusters in super-resolution data of living cells. First, we illustrate qSR using sample live-cell data of RNA Polymerase II (Pol II) as an example of highly dynamic sub-diffractive clusters. Then we utilize qSR to investigate the organization and dynamics of endogenous RNA Polymerase I (Pol I) in live human cells throughout the cell cycle. Our analysis reveals a previously uncharacterized transient clustering of Pol I. Both stable and transient populations of Pol I clusters co-exist in individual living cells, and their relative fractions vary during the cell cycle, in a manner correlating with global gene expression. Thus, qSR serves to facilitate the study of protein organization and dynamics with very high spatial and temporal resolutions directly in live cells.
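
    Cluster identification in single-molecule localization data is often done with density-based spatial clustering; qSR bundles several such spatial and temporal algorithms of its own. As a generic illustration only (invented localizations, not qSR's actual method), DBSCAN separates a dense cluster from a uniform background:

      # Density-based cluster detection on invented 2-D localizations (nm).
      import numpy as np
      from sklearn.cluster import DBSCAN

      rng = np.random.default_rng(2)
      background = rng.uniform(0, 2000, size=(300, 2))
      cluster = rng.normal(loc=[1000, 1000], scale=30, size=(80, 2))
      points = np.vstack([background, cluster])

      labels = DBSCAN(eps=60, min_samples=10).fit_predict(points)
      n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
      print(f"{n_clusters} cluster(s); {np.sum(labels == -1)} unclustered points")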

  3. Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.

  4. Virtual Planetary Analysis Environment for Remote Science

    NASA Technical Reports Server (NTRS)

    Keely, Leslie; Beyer, Ross; Edwards, Laurence; Lees, David

    2009-01-01

    All of the data for NASA's current planetary missions and most data for field experiments are collected via orbiting spacecraft, aircraft, and robotic explorers. Mission scientists are unable to employ traditional field methods when operating remotely. We have developed a virtual exploration tool for remote sites with data analysis capabilities that extend human perception quantitatively and qualitatively. Scientists and mission engineers can use it to explore a realistic representation of a remote site. It also provides software tools to "touch" and "measure" remote sites with an immediacy that boosts scientific productivity and is essential for mission operations.

  5. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity.
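
    The quantifiers reviewed here are all built from the same ingredients: the energy of the wavelet coefficients at each decomposition level, normalized into a probability distribution from which an entropy is computed. A minimal sketch assuming the PyWavelets package and a synthetic signal (one common normalization is shown; the paper's exact definitions may differ in detail):

      # Relative wavelet energies and a normalized wavelet entropy.
      import numpy as np
      import pywt

      t = np.linspace(0, 4, 1024)
      rng = np.random.default_rng(3)
      signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

      coeffs = pywt.wavedec(signal, "db4", level=5)        # [cA5, cD5, ..., cD1]
      energies = np.array([np.sum(c ** 2) for c in coeffs])
      p = energies / energies.sum()                        # relative wavelet energies
      wavelet_entropy = -np.sum(p * np.log(p)) / np.log(p.size)
      print(p.round(3), f"normalized entropy = {wavelet_entropy:.3f}")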

  6. New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Muratori, Matteo

    This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.

  7. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performances in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (respectively, UPS1 or yeast background proteins found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performances of the data processing workflow. We provide here such a controlled standard dataset and have used it to evaluate the performances of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for the detection of variant proteins with different absolute expression levels and fold-change values. The dataset presented here can be useful for tuning software tool parameters, testing new algorithms for label-free quantitative analysis, or evaluating downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
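
    Stripped of the proteomics machinery, the scoring logic is a set comparison between the proteins a workflow calls differential and the known spike-in list. A toy sketch with hypothetical identifiers:

      # Scoring a label-free workflow against spiked ground truth (toy data).
      ups1 = {f"UPS{i}" for i in range(1, 49)}                   # 48 true variants
      called = {f"UPS{i}" for i in range(1, 41)} | {"YEAST7", "YEAST42"}

      tp = len(called & ups1)          # true positives
      fp = len(called - ups1)          # false positives
      sensitivity = tp / len(ups1)
      fdr = fp / len(called)
      print(f"sensitivity = {sensitivity:.2f}, FDR = {fdr:.3f}")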

  8. Volumetric neuroimage analysis extensions for the MIPAV software package.

    PubMed

    Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L

    2007-09-15

    We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.

  9. Metabolic network flux analysis for engineering plant systems.

    PubMed

    Shachar-Hill, Yair

    2013-04-01

    Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools to plant systems has advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
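
    The flux-balance flavor of NFA reduces to a linear program: choose fluxes v that maximize an objective subject to steady-state mass balance S·v = 0 and capacity bounds. A toy three-metabolite sketch (invented network, scipy's linprog as the solver):

      # Toy flux-balance problem: maximize output flux v3 subject to S @ v = 0.
      import numpy as np
      from scipy.optimize import linprog

      # Reactions: v0: ->A, v1: A->B, v2: B->C, v3: C->
      S = np.array([
          [1, -1,  0,  0],   # metabolite A
          [0,  1, -1,  0],   # metabolite B
          [0,  0,  1, -1],   # metabolite C
      ])
      c = [0, 0, 0, -1]      # linprog minimizes, so negate v3 to maximize it
      res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=[(0, 10)] * 4)
      print("optimal fluxes:", res.x)   # all fluxes hit the uptake limit of 10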

  10. 'Talk to me': a mixed methods study on preferred physician behaviours during end-of-life communication from the patient perspective.

    PubMed

    Abdul-Razzak, Amane; Sherifali, Diana; You, John; Simon, Jessica; Brazil, Kevin

    2016-08-01

    Despite the recognized importance of end-of-life (EOL) communication between patients and physicians, the extent and quality of such communication is lacking. We sought to understand patient perspectives on physician behaviours during EOL communication. In this mixed methods study, we conducted quantitative and qualitative strands and then merged data sets during a mixed methods analysis phase. In the quantitative strand, we used the quality of communication tool (QOC) to measure physician behaviours that predict global rating of satisfaction in EOL communication skills, while in the qualitative strand we conducted semi-structured interviews. During the mixed methods analysis, we compared and contrasted qualitative and quantitative data. Seriously ill inpatients at three tertiary care hospitals in Canada. We found convergence between qualitative and quantitative strands: patients desire candid information from their physician and a sense of familiarity. The quantitative results (n = 132) suggest a paucity of certain EOL communication behaviours in this seriously ill population with a limited prognosis. The qualitative findings (n = 16) suggest that at times, physicians did not engage in EOL communication despite patient readiness, while sometimes this may represent an appropriate deferral after assessment of a patient's lack of readiness. Avoidance of certain EOL topics may not always be a failure if it is a result of an assessment of lack of patient readiness. This has implications for future tool development: a measure could be built in to assess whether physician behaviours align with patient readiness. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.

  11. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    PubMed Central

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the commingled lavas (binary mixtures). PMID:24737994
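
    For a binary mixture, the mass-balance problem has a closed least-squares solution: with end-member compositions measured over several elements, the mixing proportion x that minimizes the misfit of C_mix ~ x*C_dac + (1 - x)*C_and is a one-parameter projection. A sketch with invented compositions:

      # Least-squares mixing proportion for a binary magma mixture (invented data).
      import numpy as np

      C_dac = np.array([65.0, 15.8, 2.1, 4.3])   # dacite end-member (wt%)
      C_and = np.array([58.0, 17.2, 6.5, 7.1])   # andesite end-member (wt%)
      x_true = 0.7
      C_mix = x_true * C_dac + (1 - x_true) * C_and

      # Rearranged: C_mix - C_and = x * (C_dac - C_and), solved by projection.
      num = np.dot(C_mix - C_and, C_dac - C_and)
      den = np.dot(C_dac - C_and, C_dac - C_and)
      print(f"estimated dacite fraction x = {num / den:.2f}")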

  12. The life sciences mass spectrometry research unit.

    PubMed

    Hopfgartner, Gérard; Varesio, Emmanuel

    2012-01-01

    The Life Sciences Mass Spectrometry (LSMS) research unit focuses on the development of novel analytical workflows based on innovative mass spectrometric and software tools for the analysis of low molecular weight compounds, peptides and proteins in complex biological matrices. The present article summarizes some of the recent work of the unit: i) the application of matrix-assisted laser desorption/ionization (MALDI) for mass spectrometry imaging (MSI) of drug of abuse in hair, ii) the use of high resolution mass spectrometry for simultaneous qualitative/quantitative analysis in drug metabolism and metabolomics, and iii) the absolute quantitation of proteins by mass spectrometry using the selected reaction monitoring mode.

  13. Highly sensitive transient absorption imaging of graphene and graphene oxide in living cells and circulating blood

    PubMed Central

    Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N.; Chen, Yong P.; Cheng, Ji-Xin; Yang, Chen

    2015-01-01

    We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Forward and backward imaging of graphene on various substrates was performed under ambient conditions at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO was demonstrated in vitro, with quantitative analysis of intracellular concentration, and ex vivo in circulating blood. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials. PMID:26202216

  14. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  15. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  16. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    ERIC Educational Resources Information Center

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…

  17. Advanced Productivity Analysis Methods for Air Traffic Control Operations

    DTIC Science & Technology

    1976-12-01

    Table-of-contents fragments: routine work; surveillance work; conflict-processing work, including potential-conflict recognition, assessment, and resolution decision making and A/N voice communications. The report discusses enabling decision makers to utilize quantitative and dynamic analysis as a tool for decision making, and surveys types of simulation models.

  18. Investigating the Magnetic Interaction with Geomag and Tracker Video Analysis: Static Equilibrium and Anharmonic Dynamics

    ERIC Educational Resources Information Center

    Onorato, P.; Mascheretti, P.; DeAmbrosis, A.

    2012-01-01

    In this paper, we describe how simple experiments realizable by using easily found and low-cost materials allow students to explore quantitatively the magnetic interaction thanks to the help of an Open Source Physics tool, the Tracker Video Analysis software. The static equilibrium of a "column" of permanent magnets is carefully investigated by…

  19. Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.

    ERIC Educational Resources Information Center

    Peacock, Darren

    This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…

  20. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control of an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.

  1. Exploring Valid Reference Genes for Quantitative Real-time PCR Analysis in Plutella xylostella (Lepidoptera: Plutellidae)

    PubMed Central

    Fu, Wei; Xie, Wen; Zhang, Zhuo; Wang, Shaoli; Wu, Qingjun; Liu, Yong; Zhou, Xiaomao; Zhou, Xuguo; Zhang, Youjun

    2013-01-01

    Quantitative real-time PCR (qRT-PCR), a primary tool in gene expression analysis, requires an appropriate normalization strategy to control for variation among samples. The best option is to compare the mRNA level of a target gene with that of reference gene(s) whose expression level is stable across various experimental conditions. In this study, expression profiles of eight candidate reference genes from the diamondback moth, Plutella xylostella, were evaluated under diverse experimental conditions. RefFinder, a web-based analysis tool, integrates four major computational programs, including geNorm, Normfinder, BestKeeper, and the comparative ΔCt method, to comprehensively rank the tested candidate genes. Elongation factor 1 (EF1) was the best-suited reference gene for the biotic factors (development stage, tissue, and strain). In contrast, although appropriate reference gene(s) do exist for several abiotic factors (temperature, photoperiod, insecticide, and mechanical injury), we were not able to identify a single universal reference gene. Nevertheless, a suite of candidate reference genes was specifically recommended for selected experimental conditions. Our findings are the first step toward establishing a standardized qRT-PCR analysis of this agriculturally important insect pest. PMID:23983612
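
    Once a stable reference gene such as EF1 is in hand, relative expression is commonly computed with the comparative Ct approach, e.g. the widely used 2^(-ΔΔCt) form: normalize the target Ct to the reference within each sample, then compare treated to control. A worked sketch with invented Ct values:

      # Comparative Ct (2^-ddCt) relative quantification, invented Ct values.
      target_treated, ref_treated = 24.1, 18.0
      target_control, ref_control = 26.3, 18.1

      dct_treated = target_treated - ref_treated       # 6.1
      dct_control = target_control - ref_control       # 8.2
      fold_change = 2 ** -(dct_treated - dct_control)  # 2^2.1
      print(f"fold change = {fold_change:.2f}")        # ~4.29-fold up-regulation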

  2. On line biomonitors used as a tool for toxicity reduction evaluation of in situ groundwater remediation techniques.

    PubMed

    Küster, Eberhard; Dorusch, Falk; Vogt, Carsten; Weiss, Holger; Altenburger, Rolf

    2004-07-15

    Success of groundwater remediation is typically controlled via snapshot analysis of selected chemical substances or physical parameters. Biological parameters, i.e. ecotoxicological assays, are rarely employed. Hence the aim of the study was to develop a bioassay tool which allows on line monitoring of contaminated groundwater, as well as a toxicity reduction evaluation (TRE) of different remediation techniques in parallel, and which may furthermore be used as an additional tool for process control to supervise remediation techniques in real time. Parallel testing of groundwater remediation techniques was accomplished for short and long time periods by using the energy-dependent luminescence of the bacterium Vibrio fischeri as the biological monitoring parameter. One data point every hour for each remediation technique was generated by an automated biomonitor. The bacteria proved to be highly sensitive to the contaminated groundwater, and the biomonitor showed a long service life despite the highly corrosive groundwater present in Bitterfeld, Germany. The bacterial biomonitor is demonstrated to be a valuable tool for remediation success evaluation. Dose-response relationships were generated for the six quantitatively dominant groundwater contaminants (2-chlorotoluene, 1,2- and 1,4-dichlorobenzene, monochlorobenzene, ethylbenzene and benzene). The concentrations of individual volatile organic chemicals (VOCs) could not explain the observed effects in the bacteria. An expected mixture toxicity was calculated for the six components using the concept of concentration addition. The calculated EC(50) for the mixture was still one order of magnitude lower than the observed EC(50) of the actual groundwater. The results pointed out that chemical analysis of the six most abundant substances alone was not able to explain the effects observed with the bacteria. Thus chemical analysis alone may not be an adequate tool for remediation success evaluation in terms of toxicity reduction.
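
    Under concentration addition, the predicted EC50 of a mixture with fixed component fractions p_i and single-substance values EC50_i is the harmonic combination EC50_mix = 1 / Σ(p_i / EC50_i). A worked sketch with invented values for a six-component mixture:

      # Predicted mixture EC50 under concentration addition (invented data).
      fractions = [0.30, 0.25, 0.20, 0.10, 0.10, 0.05]   # p_i, sum to 1
      ec50s     = [12.0, 45.0, 30.0, 8.0, 20.0, 60.0]    # single-substance EC50, mg/L

      ec50_mix = 1.0 / sum(p / e for p, e in zip(fractions, ec50s))
      print(f"predicted mixture EC50 = {ec50_mix:.1f} mg/L")   # 18.0 mg/L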

  3. Data Independent Acquisition analysis in ProHits 4.0.

    PubMed

    Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude

    2016-10-21

    Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Anticipatory Understanding of Adversary Intent: A Signature-Based Knowledge System

    DTIC Science & Technology

    2009-06-01

    concept of logical positivism has been applied more recently to all human knowledge and reflected in current data fusion research, information mining...this work has been successfully translated into useful analytical tools that can provide a rigorous and quantitative basis for predictive analysis

  5. A Fan-tastic Quantitative Exploration of Ohm's Law

    NASA Astrophysics Data System (ADS)

    Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William

    2018-02-01

    Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence, which makes them less suitable as a tool for quantitative analysis. Some instructors show that light bulbs do not obey Ohm's law either outright or through inquiry-based laboratory experiments. Others avoid the subject altogether by using bulbs strictly for qualitative purposes and then later switching to resistors for a numerical analysis, or by changing the operating conditions of the bulb so that it is "barely" glowing. It seems incongruous to develop a conceptual basis for the behavior of simple circuits using bulbs only to later reveal that they do not follow Ohm's law. Recently, small computer fans were proposed as a suitable replacement of bulbs for qualitative analysis of simple circuits where the current is related to the rotational speed of the fans. In this contribution, we demonstrate that fans can also be used for quantitative measurements and provide suggestions for successful classroom implementation.

  6. Sensorized toys for measuring manipulation capabilities of infants at home.

    PubMed

    Passetti, Giovanni; Cecchi, Francesca; Baldoli, Ilaria; Sgandurra, Giuseppina; Beani, Elena; Cioni, Giovanni; Laschi, Cecilia; Dario, Paolo

    2015-01-01

    Preterm infants, i.e. babies born after a gestation period shorter than 37 weeks, spend less time exploring objects. The quantitative measurement of grasping actions and forces in infants can give insights into their typical or atypical motor development. The aim of this work was to test a new tool, a kit of sensorized toys, to longitudinally measure, monitor and promote preterm infants' manipulation capabilities with a purposive training in an ecological environment. This study presents a preliminary analysis of grasping activity. Three preterm infants performed 4 weeks of daily training at home. Sensorized toys with embedded pressure sensors were used as part of the training to allow quantitative analysis of grasping (pressure and acceleration applied to the toys while playing). Each toy was placed on the midline while the infant was in a supine position. Preliminary data show differences in the grasping parameters in relation to infant age and the daily training performed. An ongoing clinical trial will allow full validation of this new tool for promoting object exploration in preterm infants.

  7. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  8. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    PubMed

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.
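
    The cortical-flow step rests on particle image velocimetry, whose core operation is locating the cross-correlation peak between patches of successive frames. The Python sketch below is a toy illustration of that idea on synthetic data; it is not PIVlab's MATLAB implementation.

      import numpy as np
      from scipy.signal import fftconvolve

      def estimate_shift(patch_a, patch_b):
          """Displacement of patch_a relative to patch_b from the peak of
          their cross-correlation (the core idea behind PIV)."""
          a = patch_a - patch_a.mean()
          b = patch_b - patch_b.mean()
          corr = fftconvolve(a, b[::-1, ::-1], mode="full")
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          return (peak[0] - (patch_b.shape[0] - 1),
                  peak[1] - (patch_b.shape[1] - 1))

      # Shift a random texture by (3, -2) pixels and recover the shift.
      rng = np.random.default_rng(0)
      frame = rng.random((64, 64))
      moved = np.roll(frame, shift=(3, -2), axis=(0, 1))
      print(estimate_shift(moved, frame))  # -> (3, -2)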

  9. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  10. Food Forensics: Using Mass Spectrometry To Detect Foodborne Protein Contaminants, as Exemplified by Shiga Toxin Variants and Prion Strains.

    PubMed

    Silva, Christopher J

    2018-06-13

    Food forensicists need a variety of tools to detect the many possible food contaminants. As a result of its analytical flexibility, mass spectrometry is one of those tools. The multiple reaction monitoring (MRM) method extends its role from detection to quantitation of infectious proteins (prions) and protein toxins, such as Shiga toxins. The sample processing steps inactivate prions and Shiga toxins; the proteins are digested with proteases to yield peptides suitable for MRM-based analysis. Prions are detected by their distinct physicochemical properties and differential covalent modification. Shiga toxin analysis is based on detecting peptides derived from the five identical binding B subunits comprising the toxin. 15N-labeled internal standards are prepared from cloned proteins. These examples illustrate the power of MRM, in that the same instrument can be used to safely detect and quantitate protein toxins, prions, and small molecules that might contaminate our food.

  11. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of quantitatively measuring RNA abundance in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis, from alignment to downstream pathway analysis. However, using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include recently developed tools that explore immune infiltrate and T-cell CDR (complementarity-determining region) reconstruction. The pipeline is conveniently packaged so that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively, with built-in capacity for customization and expansion.
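
    As an illustration of the workflow pattern that Snakemake pipelines such as VIPER are built from, here is a minimal, hypothetical rule file; it is not VIPER's actual Snakefile, and the shell command is a placeholder.

      SAMPLES = ["sampleA", "sampleB"]

      rule all:
          input:
              expand("analysis/{sample}.counts.txt", sample=SAMPLES)

      rule align_and_count:
          input:
              "raw/{sample}.fastq.gz"
          output:
              "analysis/{sample}.counts.txt"
          # Placeholder: a real pipeline would invoke an aligner and a read
          # counter here, with further rules for QC and downstream steps.
          shell:
              "run_alignment_and_counting {input} > {output}"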

  12. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
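
    The arithmetic behind MRM-with-SIS 'absolute' quantitation is a peak-area ratio scaled by the known standard concentration; a short Python sketch with hypothetical numbers:

      def absolute_concentration(light_area, heavy_area, sis_conc):
          """Endogenous ('light') peptide concentration from its peak-area
          ratio to the spiked stable-isotope-labeled ('heavy') standard of
          known concentration sis_conc."""
          return (light_area / heavy_area) * sis_conc

      # Hypothetical areas and a 50 fmol/uL standard -> 100.0 fmol/uL.
      print(absolute_concentration(2.4e6, 1.2e6, 50.0))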

  13. Label-free and amplified quantitation of proteins in complex mixtures using diffractive optics technology.

    PubMed

    Cleverley, Steve; Chen, Irene; Houle, Jean-François

    2010-01-15

    Immunoaffinity approaches remain invaluable tools for the characterization and quantitation of biopolymers. Their application in separation science is often limited by the challenges of immunoassay development. Typical end-point immunoassays require time-consuming and labor-intensive optimization. Real-time label-free analysis using diffractive optics technology (dot) helps guide a very effective iterative process for rapid immunoassay development. Both label-free and amplified approaches can be used throughout feasibility testing and ultimately in the final assay, providing a robust platform for biopolymer analysis over a very broad dynamic range. We demonstrate the use of dot in rapidly developing assays for quantitating (1) human IgG in complex media, (2) a fusion protein in production media and (3) protein A contamination in purified immunoglobulin preparations. 2009 Elsevier B.V. All rights reserved.

  14. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  15. MIDAS Website. Revised

    NASA Technical Reports Server (NTRS)

    Goodman, Allen; Shively, R. Joy (Technical Monitor)

    1997-01-01

    MIDAS, Man-machine Integration Design and Analysis System, is a unique combination of software tools aimed at reducing design cycle time, supporting quantitative predictions of human-system effectiveness and improving the design of crew stations and their associated operating procedures. This project is supported jointly by the US Army and NASA.

  16. Standardised Library Instruction Assessment: An Institution-Specific Approach

    ERIC Educational Resources Information Center

    Staley, Shannon M.; Branch, Nicole A.; Hewitt, Tom L.

    2010-01-01

    Introduction: We explore the use of a psychometric model for locally-relevant, information literacy assessment, using an online tool for standardised assessment of student learning during discipline-based library instruction sessions. Method: A quantitative approach to data collection and analysis was used, employing standardised multiple-choice…

  17. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  18. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  19. Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Ethene Sites

    DTIC Science & Technology

    2015-12-01

    The objective of project ER-201129 was to develop and validate a framework and management expectation tool used to make bioremediation decisions at chlorinated ethene sites based on site-specific physical and biogeochemical conditions.

  20. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  1. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
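
    A minimal sketch of the PCA shape characterization described above, on a toy snapshot of atomic coordinates (the published analysis may differ in detail): the eigenvalues of the coordinate covariance (gyration) tensor quantify the cluster's principal axes at each time step.

      import numpy as np

      def shape_eigenvalues(positions):
          """Eigenvalues of the gyration (covariance) tensor of atomic
          coordinates; their relative magnitudes distinguish spherical,
          prolate and oblate cluster geometries at one MD time step."""
          centered = positions - positions.mean(axis=0)
          cov = centered.T @ centered / len(positions)
          return np.linalg.eigvalsh(cov)[::-1]  # largest first

      # Toy snapshot: 20 atoms drawn from a slightly prolate distribution.
      rng = np.random.default_rng(1)
      snapshot = rng.normal(loc=0.0, scale=[2.0, 1.0, 1.0], size=(20, 3))
      print(shape_eigenvalues(snapshot))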

  2. Tools for quantifying isotopic niche space and dietary variation at the individual and population level.

    USGS Publications Warehouse

    Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim

    2012-01-01

    Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
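
    For instance, the convex hull area, a common IsoSpace metric for isotopic niche width, can be computed directly from bivariate isotope data; the sketch below uses hypothetical d13C/d15N values and omits the uncertainty treatment the authors add.

      import numpy as np
      from scipy.spatial import ConvexHull

      # Hypothetical (d13C, d15N) measurements for one population.
      rng = np.random.default_rng(2)
      iso = rng.normal(loc=[-14.0, 12.0], scale=[1.0, 0.8], size=(30, 2))

      hull = ConvexHull(iso)
      print(f"Hull area: {hull.volume:.2f}")  # in 2-D, .volume is the area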

  3. A clustering approach to segmenting users of internet-based risk calculators.

    PubMed

    Harle, C A; Downs, J S; Padman, R

    2011-01-01

    Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who will be more likely to accept objective risk estimates. This study aimed to identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news but tended not to incorporate bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing of, and improvements to, risk communication tools on the Internet.
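
    A minimal sketch of the clustering step, with hypothetical perceived-risk and objective-risk features; the study's actual variables and preprocessing are richer.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      X = np.column_stack([
          rng.uniform(0, 100, size=200),  # pre-intervention risk perception (%)
          rng.uniform(0, 60, size=200),   # objective personal risk estimate (%)
      ])

      # Segment consumers; cluster centers then characterize segments such
      # as high-risk overestimators versus moderate-risk underestimators.
      km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
      print(km.cluster_centers_)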

  4. GiA Roots: software for the high throughput analysis of plant root system architecture.

    PubMed

    Galkovskyi, Taras; Mileyko, Yuriy; Bucksch, Alexander; Moore, Brad; Symonova, Olga; Price, Charles A; Topp, Christopher N; Iyer-Pascuzzi, Anjali S; Zurek, Paul R; Fang, Suqin; Harer, John; Benfey, Philip N; Weitz, Joshua S

    2012-07-26

    Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis.

  5. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

    Even though a powerful statistical parametric mapping (SPM) tool exists for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the GLM and makes inference based on the excursion probability of the random field interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
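
    The GLM at the core of this kind of SPM-style analysis is an ordinary least-squares regression of each channel's signal onto task regressors; a self-contained toy example on synthetic data (not NIRS-SPM code):

      import numpy as np

      rng = np.random.default_rng(5)
      n = 200
      task = (np.arange(n) % 40 < 20).astype(float)   # boxcar task regressor
      X = np.column_stack([task, np.ones(n)])         # design matrix + intercept
      y = 0.8 * task + rng.normal(scale=0.5, size=n)  # synthetic channel signal

      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print(beta)  # approximately [0.8, 0.0]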

  6. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost-effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation of operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analyses of ophthalmology resident surgical training tools are needed so residency programs can implement tools that are valid, reliable, objective, and cost-effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  7. Phasegram Analysis of Vocal Fold Vibration Documented With Laryngeal High-speed Video Endoscopy.

    PubMed

    Herbst, Christian T; Unger, Jakob; Herzel, Hanspeter; Švec, Jan G; Lohscheller, Jörg

    2016-11-01

    In a recent publication, the phasegram, a bifurcation diagram over time, was introduced as an intuitive visualization tool for assessing the vibratory states of oscillating systems. Here, this nonlinear dynamics approach is augmented with quantitative analysis parameters and applied to clinical laryngeal high-speed video (HSV) endoscopic recordings of healthy and pathological phonations. HSV data from a total of 73 females, diagnosed as healthy (n = 42), with functional dysphonia (n = 15), or with unilateral vocal fold paralysis (n = 16), were quantitatively analyzed. Glottal area waveforms (GAW) and left and right hemi-GAWs (hGAW) were extracted from the HSV recordings. Based on Poincaré sections through phase space-embedded signals, two novel quantitative parameters were computed: the phasegram entropy (PE) and the phasegram complexity estimate (PCE), inspired by signal entropy and correlation dimension computation, respectively. Both PE and PCE assumed higher average values (suggesting more irregular vibrations) for the pathological as compared with the healthy participants, significantly discriminating the healthy group from the paralysis group (P = 0.02 for both PE and PCE). Comparisons of individual PE or PCE data for the left and the right hGAW within each subject resulted in asymmetry measures for the regularity of vocal fold vibration. The PCE-based asymmetry measure revealed significant differences between the healthy group and the paralysis group (P = 0.03). Quantitative phasegram analysis of GAW and hGAW data is a promising tool for the automated processing of HSV data in research and in clinical practice. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
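
    As a rough illustration of the idea behind the phasegram entropy (heavily simplified; the published parameter is computed from phase-space embeddings and a different estimator), one can section a signal and measure the spread of the section values:

      import numpy as np

      def poincare_section(signal, delay=10, threshold=0.0):
          """Sample the delayed coordinate x(t - delay) whenever x(t)
          crosses the threshold upward: a simple Poincare section through
          a two-dimensional delay embedding."""
          s = np.asarray(signal, dtype=float)
          idx = np.where((s[:-1] < threshold) & (s[1:] >= threshold))[0] + 1
          idx = idx[idx >= delay]
          return s[idx - delay]

      def section_entropy(values, bins=16):
          """Shannon entropy of the section-value histogram; regular
          oscillation concentrates mass in few bins (low entropy)."""
          hist, _ = np.histogram(values, bins=bins)
          p = hist[hist > 0] / hist.sum()
          return float(-(p * np.log2(p)).sum())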

  8. Quantitative imaging assay for NF-κB nuclear translocation in primary human macrophages

    PubMed Central

    Noursadeghi, Mahdad; Tsang, Jhen; Haustein, Thomas; Miller, Robert F.; Chain, Benjamin M.; Katz, David R.

    2008-01-01

    Quantitative measurement of NF-κB nuclear translocation is an important research tool in cellular immunology. Established methodologies have a number of limitations, such as poor sensitivity, high cost or dependence on cell lines. Novel imaging methods to measure nuclear translocation of transcriptionally active components of NF-κB are being used but are also partly limited by the need for specialist imaging equipment or image analysis software. Herein we present a method for quantitative detection of NF-κB rel A nuclear translocation, using immunofluorescence microscopy and the public domain image analysis software ImageJ that can be easily adopted for cellular immunology research without the need for specialist image analysis expertise and at low cost. The method presented here is validated by demonstrating the time course and dose response of NF-κB nuclear translocation in primary human macrophages stimulated with LPS, and by comparison with a commercial NF-κB activation reporter cell line. PMID:18036607
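
    The readout such an assay produces is essentially a nuclear-to-cytoplasmic intensity ratio computed per cell from segmentation masks; a schematic NumPy sketch, not the authors' ImageJ workflow:

      import numpy as np

      def translocation_ratio(image, nucleus_mask, cell_mask):
          """Mean stain intensity in the nucleus divided by the mean in the
          cytoplasm (cell minus nucleus); the ratio rises as NF-kB
          translocates into the nucleus."""
          cytoplasm = cell_mask & ~nucleus_mask
          return image[nucleus_mask].mean() / image[cytoplasm].mean()

      # Toy one-cell example with a three-fold brighter nucleus.
      img = np.ones((50, 50)); img[20:30, 20:30] = 3.0
      nuc = np.zeros((50, 50), bool); nuc[20:30, 20:30] = True
      cell = np.zeros((50, 50), bool); cell[10:40, 10:40] = True
      print(translocation_ratio(img, nuc, cell))  # -> 3.0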

  9. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  10. 3-D interactive visualisation tools for Hi spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.

  11. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique. We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and...

  12. SaaS Platform for Time Series Data Handling

    NASA Astrophysics Data System (ADS)

    Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail

    2018-02-01

    The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.
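
    A minimal example of the first of those methods, a direct Fourier transform of a time series (synthetic data; purely illustrative, since the platform itself is a web service):

      import numpy as np

      fs = 250.0                                   # sampling rate (Hz)
      t = np.arange(0, 2.0, 1.0 / fs)
      rng = np.random.default_rng(4)
      x = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)

      spectrum = np.fft.rfft(x)                    # direct Fourier transform
      freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
      print(freqs[np.argmax(np.abs(spectrum))])    # peak near 10 Hz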

  13. Validating a Lifestyle Physical Activity Measure for People with Serious Mental Illness

    ERIC Educational Resources Information Center

    Bezyak, Jill L.; Chan, Fong; Chiu, Chung-Yi; Kaya, Cahit; Huck, Garrett

    2014-01-01

    Purpose: To evaluate the measurement structure of the "Physical Activity Scale for Individuals With Physical Disabilities" (PASIPD) as an assessment tool of lifestyle physical activities for people with severe mental illness. Method: A quantitative descriptive research design using factor analysis was employed. A sample of 72 individuals…

  14. Imaging and quantitative methods for studying cytoskeletal rearrangements during root development and gravitropism.

    PubMed

    Jacques, Eveline; Wells, Darren M; Bennett, Malcolm J; Vissenberg, Kris

    2015-01-01

    High-resolution imaging of cytoskeletal structures paves the way for standardized methods to quantify cytoskeletal organization. Here we provide a detailed description of the analysis performed to determine the microtubule patterns in gravistimulated roots, using the recently developed software tool MicroFilament Analyzer.

  15. How Linguistic Frames Affect Motivational Profiles and the Roles of Quantitative versus Qualitative Research Strategies

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2005-01-01

    The combined tools of psycholinguistics and systems analysis have produced advances in motivational profiling resulting in numerous applications to behavioral engineering. Knowing the way people frame their motive offers leverage in causing behavior change ranging from persuasive marketing campaigns, forensic profiling, individual psychotherapy,…

  16. Community College Students' Perceptions of Effective Communication in Online Learning

    ERIC Educational Resources Information Center

    Parker, Donna Alice Hill

    2012-01-01

    This quantitative research project analyzed the application of instructional communication tools and techniques used by community college students to determine how they perceive communication in their online classes. Online students from a community college participated in this study by completing an electronic survey. Data analysis revealed that…

  17. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
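
    A sketch of how a preferred orientation can be extracted from an image's 2-D power spectrum (an assumed simplification of the FT-SHG analysis; angles are doubled before averaging because orientation is 180-degree periodic):

      import numpy as np

      def preferred_orientation_deg(image):
          """Power-weighted circular mean of (doubled) angles in the
          frequency plane; aligned fibers concentrate spectral energy
          perpendicular to their axis."""
          power = np.fft.fftshift(np.abs(np.fft.fft2(image)) ** 2)
          cy, cx = power.shape[0] // 2, power.shape[1] // 2
          ys, xs = np.indices(power.shape)
          angles = np.arctan2(ys - cy, xs - cx)
          power[cy, cx] = 0.0  # drop the DC term
          mean_doubled = np.angle(np.sum(power * np.exp(2j * angles)))
          return np.degrees(mean_doubled / 2.0)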

  18. iMet-Q: A User-Friendly Tool for Label-Free Metabolomics Quantitation Using Dynamic Peak-Width Determination

    PubMed Central

    Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi

    2016-01-01

    Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691

  19. How to Combine ChIP with qPCR.

    PubMed

    Asp, Patrik

    2018-01-01

    Chromatin immunoprecipitation (ChIP) coupled with quantitative PCR (qPCR) has in the last 15 years become a basic mainstream tool in genomic research. Numerous commercially available ChIP kits, qPCR kits, and real-time PCR systems allow for quick and easy analysis of virtually anything chromatin-related as long as there is an available antibody. However, the highly accurate quantitative dimension added by using qPCR to analyze ChIP samples significantly raises the bar in terms of experimental accuracy, appropriate controls, data analysis, and data presentation. This chapter will address these potential pitfalls by providing protocols and procedures that address the difficulties inherent in ChIP-qPCR assays.
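
    One such quantitative step, the widely used percent-of-input calculation, adjusts the input Ct for the chromatin fraction saved before comparing it with the IP Ct; a sketch with hypothetical Ct values, assuming roughly 100% amplification efficiency:

      import math

      def percent_input(ct_ip, ct_input, input_fraction=0.01):
          """Percent-of-input for ChIP-qPCR: correct the input Ct as if
          100% of the chromatin had been used, then convert the Ct
          difference to a linear-scale percentage."""
          adjusted_input_ct = ct_input - math.log2(1.0 / input_fraction)
          return 100.0 * 2.0 ** (adjusted_input_ct - ct_ip)

      print(percent_input(ct_ip=28.5, ct_input=24.0))  # -> ~0.044 (%)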

  20. Quantitative petri net model of gene regulated metabolic networks in the cell.

    PubMed

    Chen, Ming; Hofestädt, Ralf

    2011-01-01

    A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene-regulated metabolic networks is demonstrated. A global kinetic modeling strategy and a Petri net modeling algorithm are applied to simulate bioprocess function and perform model analysis. With the model, the interrelations between pathway analysis and metabolic control mechanisms are outlined. Diagrammatic results for the dynamics of metabolites are simulated and observed by implementing an HPN tool, Visual Object Net++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets on modeling and simulation of metabolic networks is discussed.

  1. Platform-independent and label-free quantitation of proteomic data using MS1 extracted ion chromatograms in skyline: application to protein acetylation and phosphorylation.

    PubMed

    Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; MacCoss, Michael J; Gibson, Bradford W

    2012-05-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.

  2. Platform-independent and Label-free Quantitation of Proteomic Data Using MS1 Extracted Ion Chromatograms in Skyline

    PubMed Central

    Schilling, Birgit; Rardin, Matthew J.; MacLean, Brendan X.; Zawadzka, Anna M.; Frewen, Barbara E.; Cusack, Michael P.; Sorensen, Dylan J.; Bereman, Michael S.; Jing, Enxuan; Wu, Christine C.; Verdin, Eric; Kahn, C. Ronald; MacCoss, Michael J.; Gibson, Bradford W.

    2012-01-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models. PMID:22454539

  3. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscopy (SEM) micrographs through the development of a computer program that enables automatic crack analysis. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests of athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of SEM images was achieved with a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques, and combinations of selected techniques, were applied. The introduced quantitative analysis of digital SEM images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed program with that of existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
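
    As an illustration of one building block named above, Otsu thresholding, a short sketch using scikit-image as a stand-in (the authors' program is a custom application):

      import numpy as np
      from skimage.filters import threshold_otsu

      def binarize_micrograph(image):
          """Global Otsu threshold plus a simple stereological readout:
          the fraction of dark (below-threshold) pixels, e.g. as a crude
          crack-area-per-unit-area estimate."""
          t = threshold_otsu(image)
          binary = image > t
          return binary, 1.0 - binary.mean()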

  4. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    PubMed

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of this work was to use quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The results from PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.

  5. Quantitative analysis of the rubric as an assessment tool: an empirical study of student peer-group rating

    NASA Astrophysics Data System (ADS)

    Hafner, John C.; Hafner, Patti M.

    2003-12-01

    Although the rubric has emerged as one of the most popular assessment tools in progressive educational programs, there is an unfortunate dearth of information in the literature quantifying the actual effectiveness of the rubric as an assessment tool in the hands of the students. This study focuses on the validity and reliability of the rubric as an assessment tool for student peer-group evaluation in an effort to further explore the use and effectiveness of the rubric. A total of 1577 peer-group ratings, made with a rubric for an oral presentation, were collected in this 3-year study involving 107 college biology students. A quantitative analysis of the rubric shows that it is applied consistently by both students and the instructor across the study years. Moreover, the rubric appears to be 'gender neutral' and the students' academic strength has no significant bearing on the way that they employ the rubric. A significant, one-to-one relationship (slope = 1.0) between the instructor's assessment and the students' rating is seen across all years using the rubric. A generalizability study yields estimates of inter-rater reliability of moderate values across all years and allows for the estimation of variance components. Taken together, these data indicate that the general form and evaluative criteria of the rubric are clear and that the rubric is a useful assessment tool for peer-group (and self-) assessment by students. To our knowledge, these data provide the first statistical documentation of the validity and reliability of the rubric for student peer-group assessment.

  6. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  7. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics.

    PubMed

    Lavallée-Adam, Mathieu; Yates, John R

    2016-03-24

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. © 2016 by John Wiley & Sons, Inc.

  8. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  9. Agreement between clinical estimation and a new quantitative analysis by Photoshop software in fundus and angiographic image variables.

    PubMed

    Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R

    2009-12-01

    To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparing it with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then by computation using Photoshop 7.0 software. Four variables were selected for comparison: amount of hard exudates (HE) on color pictures, amount of HE on red-free pictures, severity of leakage, and the size of the foveal avascular zone (FAZ). The coefficients of agreement (kappa) between the two methods in the amount of HE on color and red-free photographs were 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). For the evaluation of the FAZ size using the Magic Wand and Magnetic Lasso tools, the agreement was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of the FAZ size with the Magnetic Lasso tool was excellent, and agreement was almost as good in the quantification of HE on color and on red-free images. Considering the agreement of this new technique with clinical evaluation, the method seems to have sufficient validity to be used for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.
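
    The agreement statistic reported here, Cohen's kappa, is straightforward to reproduce; a sketch with hypothetical severity grades (0-3) assigned by clinicians and by the software-based method:

      from sklearn.metrics import cohen_kappa_score

      clinical = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]  # hypothetical gradings
      software = [0, 1, 2, 3, 3, 1, 0, 2, 3, 2]
      print(cohen_kappa_score(clinical, software))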

  10. Visualization techniques to aid in the analysis of multispectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.

  11. Quantitative 4D analyses of epithelial folding during Drosophila gastrulation.

    PubMed

    Khan, Zia; Wang, Yu-Chiun; Wieschaus, Eric F; Kaschube, Matthias

    2014-07-01

    Understanding the cellular and mechanical processes that underlie the shape changes of individual cells and their collective behaviors in a tissue during dynamic and complex morphogenetic events is currently one of the major frontiers in developmental biology. The advent of high-speed time-lapse microscopy and its use in monitoring the cellular events in fluorescently labeled developing organisms show tremendous promise in establishing detailed descriptions of these events and could potentially provide a foundation for subsequent hypothesis-driven research strategies. However, obtaining quantitative measurements of dynamic shapes and behaviors of cells and tissues in a rapidly developing metazoan embryo using time-lapse 3D microscopy remains technically challenging, with the main hurdle being the shortage of robust image processing and analysis tools. We have developed EDGE4D, a software tool for segmenting and tracking membrane-labeled cells using multi-photon microscopy data. Our results demonstrate that EDGE4D enables quantification of the dynamics of cell shape changes, cell interfaces, and neighbor relations at single-cell resolution during a complex epithelial folding event in the early Drosophila embryo. We expect this tool to be broadly useful for the analysis of epithelial cell geometries and movements in a wide variety of developmental contexts. © 2014. Published by The Company of Biologists Ltd.

  12. Manufacturing of hybrid aluminum copper joints by electromagnetic pulse welding - Identification of quantitative process windows

    NASA Astrophysics Data System (ADS)

    Psyk, Verena; Scheffler, Christian; Linnemann, Maik; Landgrebe, Dirk

    2017-10-01

    Compared to conventional joining techniques, electromagnetic pulse welding offers important advantages, especially when it comes to dissimilar-material connections such as copper-aluminum welds. However, owing to a lack of guidelines and tools for process design, the process has not yet been widely implemented in industrial production. In order to contribute to overcoming this obstacle, a combined numerical and experimental process analysis for electromagnetic pulse welding of Cu-DHP and EN AW-1050 was carried out, and the results were consolidated into a quantitative, collision-parameter-based process window.

  13. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives: Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods: Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity) and the proportion of non-outliers correctly identified (specificity). Results: Simulations involving nuclear family data sets with one outlier showed that EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions: The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
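
    The two diagnostics differ mainly in cost: the ECD refits the model once per deleted observation, while the EIF approximates the resulting shift without refitting. Below is a minimal sketch of the exact-deletion idea for a generic estimator; the `fit` callable stands in for the maximum likelihood QTL variance fit in MAPMAKER/SIBS, whose internals the abstract does not describe.

    ```python
    # Exact case deletion: influence of each observation on the estimate.
    import numpy as np

    def exact_case_deletion(data, fit):
        """Shift in the estimate caused by deleting each observation in turn."""
        full = fit(data)
        shifts = []
        for i in range(len(data)):
            reduced = np.delete(data, i, axis=0)   # drop observation i
            shifts.append(fit(reduced) - full)
        return np.array(shifts)

    # e.g., influence of each value on a sample mean:
    x = np.array([1.0, 1.2, 0.9, 1.1, 5.0])        # last value is an outlier
    print(exact_case_deletion(x, fit=np.mean))
    ```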

  14. PeptideDepot: flexible relational database for visual analysis of quantitative proteomic data and integration of existing protein information.

    PubMed

    Yu, Kebing; Salomon, Arthur R

    2009-12-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.

  15. Meeting Report: Tissue-based Image Analysis.

    PubMed

    Saravanan, Chandra; Schumacher, Vanessa; Brown, Danielle; Dunstan, Robert; Galarneau, Jean-Rene; Odin, Marielle; Mishra, Sasmita

    2017-10-01

    Quantitative image analysis (IA) is a rapidly evolving area of digital pathology. Although not a new concept, the quantification of histological features on photomicrographs used to be cumbersome, resource-intensive, and limited to specialists and specialized laboratories. Recent technological advances like highly efficient automated whole slide digitizer (scanner) systems, innovative IA platforms, and the emergence of pathologist-friendly image annotation and analysis systems mean that quantification of features on histological digital images will become increasingly prominent in pathologists' daily professional lives. The added value of quantitative IA in pathology includes confirmation of equivocal findings noted by a pathologist, increasing the sensitivity of feature detection, quantification of signal intensity, and improving efficiency. There is no denying that quantitative IA is part of the future of pathology; however, there are also several potential pitfalls when trying to estimate volumetric features from limited 2-dimensional sections. This continuing education session on quantitative IA offered a broad overview of the field; a hands-on toxicologic pathologist experience with IA principles, tools, and workflows; a discussion on how to apply basic stereology principles in order to minimize bias in IA; and finally, a reflection on the future of IA in the toxicologic pathology field.

  16. Stereological analysis of bacterial load and lung lesions in nonhuman primates (rhesus macaques) experimentally infected with Mycobacterium tuberculosis.

    PubMed

    Luciw, Paul A; Oslund, Karen L; Yang, Xiao-Wei; Adamson, Lourdes; Ravindran, Resmi; Canfield, Don R; Tarara, Ross; Hirst, Linda; Christensen, Miles; Lerche, Nicholas W; Offenstein, Heather; Lewinsohn, David; Ventimiglia, Frank; Brignolo, Laurie; Wisner, Erik R; Hyde, Dallas M

    2011-11-01

    Infection with Mycobacterium tuberculosis primarily produces a multifocal distribution of pulmonary granulomas in which the pathogen resides. Accordingly, quantitative assessment of the bacterial load and pathology is a substantial challenge in tuberculosis. Such assessments are critical for studies of the pathogenesis and for the development of vaccines and drugs in animal models of experimental M. tuberculosis infection. Stereology enables unbiased quantitation of three-dimensional objects from two-dimensional sections and thus is suited to quantify histological lesions. We have developed a protocol for stereological analysis of the lung in rhesus macaques inoculated with a pathogenic clinical strain of M. tuberculosis (Erdman strain). These animals exhibit a pattern of infection and tuberculosis similar to that of naturally infected humans. Conditions were optimized for collecting lung samples in a nonbiased, random manner. Bacterial load in these samples was assessed by a standard plating assay, and granulomas were graded and enumerated microscopically. Stereological analysis provided quantitative data that supported a significant correlation between bacterial load and lung granulomas. Thus this stereological approach enables a quantitative, statistically valid analysis of the impact of M. tuberculosis infection in the lung and will serve as an essential tool for objectively comparing the efficacy of drugs and vaccines.

  17. Quantitative Clinical Diagnostic Analysis of Acetone in Human Blood by HPLC: A Metabolomic Search for Acetone as Indicator

    PubMed Central

    Akgul Kalkan, Esin; Sahiner, Mehtap; Ulker Cakir, Dilek; Alpaslan, Duygu; Yilmaz, Selehattin

    2016-01-01

    Using high-performance liquid chromatography (HPLC) and 2,4-dinitrophenylhydrazine (2,4-DNPH) as a derivatizing reagent, an analytical method was developed for the quantitative determination of acetone in human blood. The determination was carried out at 365 nm using an ultraviolet-visible (UV-Vis) diode array detector (DAD). For acetone as its 2,4-dinitrophenylhydrazone derivative, good separation was achieved with a ThermoAcclaim C18 column (15 cm × 4.6 mm × 3 μm) at a retention time (tR) of 12.10 min and a flow rate of 1 mL min−1, using a (methanol/acetonitrile)-water gradient elution. The methodology is simple, rapid, sensitive, and of low cost, exhibits good reproducibility, and allows the analysis of acetone in biological fluids. A calibration curve was obtained for acetone using its standard solutions in acetonitrile. Quantitative analysis of acetone in human blood was successfully carried out using this calibration graph. The applied method was validated with respect to linearity, limits of detection and quantification, accuracy, and precision. We also present acetone as a useful tool for HPLC-based metabolomic investigation of endogenous metabolism and quantitative clinical diagnostic analysis. PMID:27298750
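
    As an illustration of the calibration and validation steps described (linearity, limits of detection and quantification), here is a minimal sketch with invented standards; the ICH-style factors 3.3 and 10 for LOD and LOQ are an assumption, since the abstract does not state which convention was used.

    ```python
    # Linear calibration of acetone (as its 2,4-DNPH derivative) at 365 nm.
    import numpy as np

    conc = np.array([5, 10, 25, 50, 100.0])        # standard concentrations (invented)
    area = np.array([0.9, 2.1, 5.2, 10.3, 20.8])   # peak areas (invented)

    slope, intercept = np.polyfit(conc, area, 1)
    resid_sd = np.std(area - (slope * conc + intercept), ddof=2)

    lod = 3.3 * resid_sd / slope                   # limit of detection
    loq = 10.0 * resid_sd / slope                  # limit of quantification

    def quantify(sample_area):
        """Concentration of acetone in a sample from its peak area."""
        return (sample_area - intercept) / slope
    ```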

  18. Arabidopsis phenotyping through Geometric Morphometrics.

    PubMed

    Manacorda, Carlos A; Asurmendi, Sebastian

    2018-06-18

    Recently, much technical progress has been achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation now make it possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources makes it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that could be used independently of the platform and segmentation software are still lacking, and shape descriptions still rely on ad hoc or even contradictory descriptors, which can make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics. Based on the location of landmarks (corresponding points) over imaged specimens, and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and measure them in shape-distance units. Here, a particular scheme of landmark placement on Arabidopsis rosette images is proposed to study shape variation during viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic feature extraction and processing in an objective, reproducible manner.
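
    A minimal sketch of the core geometric-morphometrics computation, Procrustes alignment of landmark configurations followed by PCA, is given below. The array shape and the use of the first specimen as the alignment reference are simplifying assumptions (full generalized Procrustes analysis iterates toward a mean shape), and the paper's actual landmark scheme is not reproduced.

    ```python
    # Procrustes-aligned landmarks -> PCA shape space.
    import numpy as np
    from scipy.spatial import procrustes
    from sklearn.decomposition import PCA

    def shape_pca(landmarks):
        """landmarks: (n_plants, n_landmarks, 2) array of rosette landmarks."""
        ref = landmarks[0]
        aligned = []
        for config in landmarks:
            # translate/scale/rotate each configuration onto the reference
            _, fitted, _ = procrustes(ref, config)
            aligned.append(fitted.ravel())
        pca = PCA(n_components=2).fit(np.asarray(aligned))
        return pca.transform(np.asarray(aligned)), pca.explained_variance_ratio_
    ```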

  19. Modeling with Young Students--Quantitative and Qualitative.

    ERIC Educational Resources Information Center

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  20. Application of Fault Management Theory to the Quantitive Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    SHM/FM theory has been successfully applied to the selection of the baseline set of abort triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most "new" work, as this quantitative analysis had never been done before: it required development of the methodology and a tool to mechanize the process, and it established new relationships with the other groups. The process is now an accepted part of the SLS design process and will likely be applied to similar programs in the future at NASA MSFC. Future improvements fall into two areas. The first is improving technical accuracy: differentiating crew survivability due to an abort from survivability even when no immediate abort occurs (e.g., a small explosion with little debris); accounting for the contingent dependence of secondary triggers on primary triggers; and allocating the "Δ LOC benefit" of each trigger when added to the previously selected triggers. The second is reducing future costs through the development of a specialized tool. The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.
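
    As an illustration of what allocating the "Δ LOC benefit" of each trigger when added to previously selected triggers could look like computationally, here is a hypothetical greedy-selection sketch. All trigger names, failure modes, and probabilities are invented; the actual SLS triggers, data, and tool are not described in this abstract.

    ```python
    # Hypothetical greedy abort-trigger selection by marginal ("delta") LOC benefit.
    failure_probs = {"engine": 0.40, "booster": 0.35, "tank": 0.25}  # invented
    coverage = {"T1": {"engine"}, "T2": {"engine", "booster"}, "T3": {"tank"}}

    def loc_benefit(triggers):
        """Toy benefit model: probability mass of failure modes covered."""
        covered = set().union(*(coverage[t] for t in triggers)) if triggers else set()
        return sum(failure_probs[m] for m in covered)

    def greedy_select(k):
        """Pick k triggers, each maximizing the marginal benefit at that step."""
        selected = set()
        for _ in range(k):
            best = max(set(coverage) - selected,
                       key=lambda t: loc_benefit(selected | {t}) - loc_benefit(selected))
            selected.add(best)
        return selected

    print(greedy_select(2))   # {'T2', 'T3'}: covers all modeled failure modes
    ```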

  1. Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters

    NASA Astrophysics Data System (ADS)

    Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit

    2018-06-01

    Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and the tool pin profile is crucial to developing the process for joining HMPM materials. Here we present a quantitative wear study of the H13 steel tool pin profile for FSW of CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that the measured wear depth is small near the pin root and increases significantly towards the tip. Near the pin tip, wear depth increases with increasing tool rotational speed, whereas the change in wear depth near the pin root is minimal. Wear depth also increases with decreasing tool traverse speed. Tool pin wear from the bottom results in pin length reduction, which is greater for higher tool rotational speeds and longer traverse distances. The pin profile changes due to wear, resulting in a root defect at long traverse distances. This quantitative understanding would be helpful for estimating tool wear and optimizing process parameters and tool pin shape during FSW of HMPM materials.

  2. Quantitative analysis of amygdalin and prunasin in Prunus serotina Ehrh. using ¹H-NMR spectroscopy.

    PubMed

    Santos Pimenta, Lúcia P; Schilthuizen, Menno; Verpoorte, Robert; Choi, Young Hae

    2014-01-01

    Prunus serotina is native to North America but has been invasively introduced in Europe since the seventeenth century. This plant contains cyanogenic glycosides that are believed to be related to its success as an invasive plant. For these compounds, chromatographic- or spectrometric-based (targeting HCN hydrolysis) methods of analysis have been employed so far. However, the conventional methods require tedious preparation steps and a long measuring time. To develop a fast and simple method to quantify the cyanogenic glycosides amygdalin and prunasin in dried Prunus serotina leaves without any pre-purification steps using ¹H-NMR spectroscopy. Extracts of Prunus serotina leaves using CH3OH-d4 and KH2PO4 buffer in D2O (1:1) were quantitatively analysed for amygdalin and prunasin using ¹H-NMR spectroscopy. Different internal standards were evaluated for accuracy and stability. The purity of quantitated ¹H-NMR signals was evaluated using several two-dimensional NMR experiments. Trimethylsilylpropionic acid sodium salt-d4 proved most suitable as the internal standard for quantitative ¹H-NMR analysis. Two-dimensional J-resolved NMR was shown to be a useful tool to confirm the structures and to check for possible signal overlap with the target signals for the quantitation. Twenty-two samples of P. serotina were subsequently quantitatively analysed for the cyanogenic glycosides prunasin and amygdalin. The NMR method offers a fast, high-throughput analysis of cyanogenic glycosides in dried leaves, permitting simultaneous quantification and identification of prunasin and amygdalin in Prunus serotina. Copyright © 2013 John Wiley & Sons, Ltd.
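
    The underlying qNMR relation is that concentration scales with the integral per contributing proton, referenced to the internal standard: c_a = (I_a / I_IS) · (N_IS / N_a) · c_IS. A minimal sketch with invented integrals:

    ```python
    def qnmr_concentration(I_analyte, I_std, n_H_analyte, n_H_std, c_std):
        """Analyte concentration from 1H-NMR integrals vs. an internal standard.

        I_*   : integrated signal areas
        n_H_* : number of protons giving rise to each integrated signal
        c_std : known concentration of the internal standard (e.g., TSP-d4)
        """
        return (I_analyte / I_std) * (n_H_std / n_H_analyte) * c_std

    # e.g., a 1-proton amygdalin signal against the 9 equivalent TSP methyl
    # protons (all numbers invented for illustration):
    c_amygdalin = qnmr_concentration(I_analyte=0.35, I_std=1.00,
                                     n_H_analyte=1, n_H_std=9, c_std=1.0)
    ```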

  3. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. The diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters compared to multiple reference standards. PubMed, Web of Science, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard, and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance, with sensitivities of 0.88, 0.82, and 0.83, specificities of 0.72, 0.83, and 0.76, and AUCs of 0.90, 0.84, and 0.87, respectively. In the per-territory analysis, our results show similar diagnostic accuracy for anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. Registration: CRD42016040176.
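
    As a sketch of the pooling step, per-study sensitivities and specificities can be computed from the extracted 2×2 counts and combined. The simple count-summing below is an illustrative simplification (diagnostic meta-analyses typically use a bivariate random-effects model to handle between-study heterogeneity), and all numbers are invented.

    ```python
    import numpy as np

    # Per-study (TP, FP, FN, TN) counts -- invented for illustration.
    studies = np.array([
        [45, 10,  8, 60],
        [30,  6,  5, 40],
        [55, 12, 10, 70],
    ])

    TP, FP, FN, TN = studies.T
    sens = TP / (TP + FN)                  # per-study sensitivity
    spec = TN / (TN + FP)                  # per-study specificity

    pooled_sens = TP.sum() / (TP.sum() + FN.sum())
    pooled_spec = TN.sum() / (TN.sum() + FP.sum())
    print(sens, spec, pooled_sens, pooled_spec)
    ```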

  4. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    PubMed

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as the reference region for semi-quantitative analysis of FP-CIT SPECT, the rationale being that this might reduce the statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECT scans were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for the frontal lobe, the occipital lobe, and the whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area under the receiver operating characteristic curve (AUC) was used as the performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR, 0.937, was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as the reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and is therefore appropriate to support less experienced physicians.
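
    A minimal numpy sketch of the recommended measure, the specific binding ratio with the 75th percentile of whole-brain voxel intensities as the non-displaceable reference, is shown below; the ROI voxel arrays are assumed to come from the predefined MNI-space ROIs.

    ```python
    import numpy as np

    def specific_binding_ratio(putamen_voxels, reference_voxels, stat="p75"):
        """SBR = (putamen uptake - reference uptake) / reference uptake."""
        ref = {"mean": np.mean(reference_voxels),
               "median": np.median(reference_voxels),
               "p75": np.percentile(reference_voxels, 75)}[stat]
        return (np.mean(putamen_voxels) - ref) / ref
    ```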

  5. Nonlinear optical microscopy: use of second harmonic generation and two-photon microscopy for automated quantitative liver fibrosis studies.

    PubMed

    Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry

    2008-01-01

    Liver fibrosis is associated with an abnormal increase in extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity and accuracy and suffer from operator-dependent variation. The fibrillar collagen in sinusoids of normal livers can be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly reveal the hepatocyte morphology. We have systematically optimized the parameters for quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract information on collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and in cell morphology are quantitatively characterized in SHG/TPEF images. Compared with traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.
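
    A heavily simplified sketch of the automated quantification idea, collagen pixels from the SHG channel expressed as a fraction of tissue pixels from the TPEF channel, is given below. Otsu thresholding is an assumption; the paper's fully automated algorithms are more elaborate.

    ```python
    # shg, tpef: co-registered 2-D grayscale numpy arrays from the microscope.
    from skimage.filters import threshold_otsu

    def collagen_area_fraction(shg, tpef):
        collagen = shg > threshold_otsu(shg)    # fibrillar collagen (SHG signal)
        tissue = tpef > threshold_otsu(tpef)    # tissue area (TPEF morphology)
        return collagen.sum() / max(tissue.sum(), 1)
    ```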

  6. Application of image analysis in studies of quantitative disease resistance, exemplified using common bacterial blight-common bean pathosystem.

    PubMed

    Xie, Weilong; Yu, Kangfu; Pauls, K Peter; Navabi, Alireza

    2012-04-01

    The effectiveness of image analysis (IA) compared with an ordinal visual scale, for quantitative measurement of disease severity, its application in quantitative genetic studies, and its effect on the estimates of genetic parameters were investigated. Studies were performed using eight backcross-derived families of common bean (Phaseolus vulgaris) (n = 172) segregating for the molecular marker SU91, known to be associated with a quantitative trait locus (QTL) for resistance to common bacterial blight (CBB), caused by Xanthomonas campestris pv. phaseoli and X. fuscans subsp. fuscans. Even though both IA and visual assessments were highly repeatable, IA was more sensitive in detecting quantitative differences between bean genotypes. The CBB phenotypic difference between the two SU91 genotypic groups was consistently more than fivefold for IA assessments but generally only two- to threefold for visual assessments. Results suggest that the visual assessment results in overestimation of the effect of QTL in genetic studies. This may have been caused by lack of additivity and uneven intervals of the visual scale. Although visual assessment of disease severity is a useful tool for general selection in breeding programs, assessments using IA may be more suitable for phenotypic evaluations in quantitative genetic studies involving CBB resistance as well as other foliar diseases.

  7. Widely-targeted quantitative lipidomics methodology by supercritical fluid chromatography coupled with fast-scanning triple quadrupole mass spectrometry.

    PubMed

    Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi

    2018-05-03

    Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach to provide novel insight into lipid metabolism and to search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we propose a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and a theoretically calculated comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC with a normal-phase diethylamine-bonded silica column with high resolution, high throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information for both lipid classes and individual lipid molecular species within the same lipid class. Additionally, data acquired using this method have advantages including reduced misidentification and faster data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits in response to supplementation with eicosapentaenoic acid was observed for the first time. Our developed SFC/QqQMS method represents a potentially useful tool for in-depth studies focused on complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.

  8. Introducing Graduate Students to High-Resolution Mass Spectrometry (HRMS) Using a Hands-On Approach

    ERIC Educational Resources Information Center

    Stock, Naomi L.

    2017-01-01

    High-resolution mass spectrometry (HRMS) features both high resolution and high mass accuracy and is a powerful tool for the analysis and quantitation of compounds, determination of elemental compositions, and identification of unknowns. A hands-on laboratory experiment for upper-level undergraduate and graduate students to investigate HRMS is…

  9. Thinking Critically in Space: Toward a Mixed-Methods Geospatial Approach to Education Policy Analysis

    ERIC Educational Resources Information Center

    Yoon, Ee-Seul; Lubienski, Christopher

    2018-01-01

    This paper suggests that synergies can be produced by using geospatial analyses as a bridge between traditional qualitative-quantitative distinctions in education research. While mapping tools have been effective for informing education policy studies, especially in terms of educational access and choice, they have also been underutilized and…

  10. New Tools for "New" History: Computers and the Teaching of Quantitative Historical Methods.

    ERIC Educational Resources Information Center

    Burton, Orville Vernon; Finnegan, Terence

    1989-01-01

    Explains the development of an instructional software package and accompanying workbook which teaches students to apply computerized statistical analysis to historical data, improving the study of social history. Concludes that the use of microcomputers and supercomputers to manipulate historical data enhances critical thinking skills and the use…

  11. Monitoring Urban Quality of Life: The Porto Experience

    ERIC Educational Resources Information Center

    Santos, Luis Delfim; Martins, Isabel

    2007-01-01

    This paper describes the monitoring system of the urban quality of life developed by the Porto City Council, a new tool being used to support urban planning and management. The two components of this system--a quantitative approach based on statistical indicators and a qualitative analysis based on the citizens' perceptions of the conditions of…

  12. Semi-quantitative analysis of FT-IR spectra of humic fractions of nine US soils

    USDA-ARS?s Scientific Manuscript database

    Fourier Transform Infrared Spectroscopy (FT-IR) is a simple and fast tool for characterizing soil organic matter. However, most FT-IR spectra are only analyzed qualitatively. In this work, we prepared mobile humic acid (MHA) and recalcitrant calcium humate (CaHA) from nine soils collected from six ...

  13. Using Texas Instruments Emulators as Teaching Tools in Quantitative Chemical Analysis

    ERIC Educational Resources Information Center

    Young, Vaneica Y.

    2011-01-01

    This technology report alerts upper-division undergraduate chemistry faculty and lecturers to the use of Texas Instruments emulators as virtual graphing calculators. These may be used in multimedia lectures to instruct students on the use of their graphing calculators to obtain solutions to complex chemical problems. (Contains 1 figure.)

  14. Tools for understanding landscapes: combining large-scale surveys to characterize change. Chapter 9.

    Treesearch

    W. Keith Moser; Janine Bolliger; Don C. Bragg; Mark H. Hansen; Mark A. Hatfield; Timothy A. Nigh; Lisa A. Schulte

    2008-01-01

    All landscapes change continuously. Since change is perceived and interpreted through measures of scale, any quantitative analysis of landscapes must identify and describe the spatiotemporal mosaics shaped by large-scale structures and processes. This process is controlled by core influences, or "drivers," that shape the change and affect the outcome...

  15. Faster than "g", Revisited with High-Speed Imaging

    ERIC Educational Resources Information Center

    Vollmer, Michael; Mollmann, Klaus-Peter

    2012-01-01

    The introduction of modern high-speed cameras in physics teaching provides a tool not only for easy visualization, but also for quantitative analysis of many simple though fast occurring phenomena. As an example, we present a very well-known demonstration experiment--sometimes also discussed in the context of falling chimneys--which is commonly…

  16. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, cost, and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental solution methodologies, in the form of computational tools, noninvasive optical techniques, and fringe-prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.

  17. Bispectral infrared forest fire detection and analysis using classification techniques

    NASA Astrophysics Data System (ADS)

    Aranda, Jose M.; Melendez, Juan; de Castro, Antonio J.; Lopez, Fernando

    2004-01-01

    Infrared cameras are well established as a useful tool for fire detection, but their use for quantitative forest fire measurements faces difficulties due to the complex spatial and spectral structure of fires. In this work it is shown that some of these difficulties can be overcome by applying classification techniques, a standard tool for the analysis of multispectral satellite images, to bi-spectral images of fires. Images were acquired by two cameras operating in the medium infrared (MIR) and thermal infrared (TIR) bands, providing simultaneous, co-registered images calibrated in brightness temperature. The MIR-TIR scatterplot of these images can be used to classify the scene into different fire regions (background, ashes, and several ember and flame regions). It is shown that classification makes it possible to obtain quantitative measurements of physical fire parameters such as rate of spread, ember temperature, and radiated power in the MIR and TIR bands. An estimate of total radiated power and heat release per unit area is also made and compared with values derived from heat of combustion and fuel consumption.
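
    A hedged sketch of classifying co-registered MIR/TIR brightness-temperature images in their two-band scatterplot space is shown below; unsupervised k-means is used as a stand-in, since the abstract does not name the specific classifier.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def classify_fire_scene(mir, tir, n_classes=4):
        """mir, tir: 2-D brightness-temperature arrays (K) of the same shape."""
        pixels = np.column_stack([mir.ravel(), tir.ravel()])
        labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(pixels)
        return labels.reshape(mir.shape)  # e.g. background/ashes/embers/flames
    ```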

  18. Impact and Penetration Simulations for Composite Wing-like Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing either no damage or only minimal damage and high-speed impact causing severe laminate damage and possible penetration of the structure were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The analytical tools for impact and penetration simulations were assessed with regard to accuracy, modeling capability, and damage modeling, as well as robustness, efficiency, and usability in a wing design environment. Following this qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.

  19. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

    Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as the inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk and protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and applied in new ways, and completely new models have emerged. There was high agreement among respondents on the need to further develop methods for the assessment of inequalities and uncertainty. The contribution of quantitative modelling to health foresight would benefit from building joint strategies for further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  20. Classification and quantitation of milk powder by near-infrared spectroscopy and mutual information-based variable selection and partial least squares

    NASA Astrophysics Data System (ADS)

    Chen, Hui; Tan, Chao; Lin, Zan; Wu, Tong

    2018-01-01

    Milk is among the most popular nutrient sources worldwide and is of great interest due to its beneficial medicinal properties. The feasibility of classifying milk powder samples with respect to their brands, and of determining protein concentration, is investigated by NIR spectroscopy along with chemometrics. Two datasets were prepared for the experiments: one contains 179 samples of four brands for classification, and the other contains 30 samples for quantitative analysis. Principal component analysis (PCA) was used for exploratory analysis. Based on an effective model-independent variable selection method, minimal-redundancy maximal-relevance (MRMR), only 18 variables were selected to construct a partial least-squares discriminant analysis (PLS-DA) model. On the test set, the PLS-DA model based on the selected variable set was compared with the full-spectrum PLS-DA model; both achieved 100% accuracy. In the quantitative analysis, the partial least-squares regression (PLSR) model constructed from a selected subset of 260 variables significantly outperforms the full-spectrum model. The combination of NIR spectroscopy, MRMR, and PLS-DA or PLSR appears to be a powerful tool for classifying different brands of milk and determining the protein content.
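
    A minimal sketch of the variable-selection-plus-PLS-DA pipeline follows, assuming X is an (n_samples, n_wavelengths) spectral matrix and y holds brand labels. sklearn's mutual_info_classif stands in for MRMR, which additionally penalizes redundancy among the selected variables.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.preprocessing import LabelBinarizer

    def plsda_with_mi_selection(X, y, n_vars=18, n_components=5):
        mi = mutual_info_classif(X, y)
        keep = np.argsort(mi)[-n_vars:]                # top mutual-information variables
        Y = LabelBinarizer().fit_transform(y)          # one-hot targets for PLS-DA
        pls = PLSRegression(n_components=n_components).fit(X[:, keep], Y)
        pred = pls.predict(X[:, keep]).argmax(axis=1)  # class with largest response
        return keep, pred
    ```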

  1. Design of a Web-tool for diagnostic clinical trials handling medical imaging research.

    PubMed

    Baltasar Sánchez, Alicia; González-Sistal, Angel

    2011-04-01

    New clinical studies in medicine are based on patients and controls examined using different diagnostic imaging modalities. Medical information systems, however, are not designed for clinical trials employing clinical imaging. Although commercial software and communication systems focus on storage of image data, they are not suitable for storage and mining of new types of quantitative data. We sought to design a Web-tool to support diagnostic clinical trials involving different experts and hospitals or research centres. The image analysis in this project is based on skeletal X-ray imaging. It involves a computerised image-analysis method using quantitative analysis of regions of interest in healthy bone and skeletal metastases. The database is implemented with ASP.NET 3.5 and C# technologies for our Web-based application. For data storage, we chose MySQL v.5.0, one of the most popular open source databases. User logins are required, and access to patient data is logged for auditing. For security, all data transmissions are carried over encrypted connections. This Web-tool is available to users scattered at different locations; it allows an efficient organisation and storage of data (case report forms) and images and allows each user to know precisely what his or her task is. The advantages of our Web-tool are as follows: (1) sustainability is guaranteed; (2) network locations for collection of data are secured; (3) all clinical information is stored together with the original images and the results derived from processed images and statistical analysis, which enables us to perform retrospective studies; (4) changes are easily incorporated because of the modular architecture; and (5) assessment of trial data collected at different sites is centralised to reduce statistical variance.

  2. HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals

    NASA Astrophysics Data System (ADS)

    Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar

    Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize TIM and TCHM. The complexities of TIM and TCHM challenge the current official quality-control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. Chromatographic fingerprinting is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in herbal drugs and to preserve such a "database" for further multifaceted, sustainable studies. Analytical separation techniques, for example high-performance liquid chromatography (HPLC), gas chromatography (GC), and mass spectrometry (MS), are among the most popular methods of choice used for quality control of raw materials and finished herbal products. Fingerprint analysis using high-performance thin-layer chromatography (HPTLC) has become the most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and at the same time create interest among prospective researchers in herbal analysis. The developed method can be used as a quality-control tool for rapid authentication of a wide variety of herbal samples. Some examples demonstrate the role of fingerprinting in quality control and assessment.

  3. Discrimination of surface wear on obsidian tools using LSCM and RelA: pilot study results (area-scale analysis of obsidian tool surfaces).

    PubMed

    Stemp, W James; Chung, Steven

    2011-01-01

    This pilot study tests the reliability of laser scanning confocal microscopy (LSCM) to quantitatively measure wear on experimental obsidian tools. To our knowledge, this is the first use of confocal microscopy to study wear on stone flakes made from an amorphous silicate like obsidian. Three-dimensional surface roughness or texture area scans on three obsidian flakes used on different contact materials (hide, shell, wood) were documented using the LSCM to determine whether the worn surfaces could be discriminated using area-scale analysis, specifically relative area (RelA). When coupled with the F-test, this scale-sensitive fractal analysis could not only discriminate the used from unused surfaces on individual tools, but was also capable of discriminating the wear histories of tools used on different contact materials. Results indicate that such discriminations occur at different scales. Confidence levels for the discriminations at different scales were established using the F-test (mean square ratios or MSRs). In instances where discrimination of surface roughness or texture was not possible above the established confidence level based on MSRs, photomicrographs and RelA assisted in hypothesizing why this was so. Copyright © 2011 Wiley Periodicals, Inc.
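
    Area-scale analysis compares the measured (triangulated) 3-D surface area with the projected area at a series of sampling scales; relative area approaches 1 for smooth surfaces and grows with roughness. A sketch on a gridded height map follows, assuming square pixels of pitch dx; the LSCM export format and the F-test (MSR) machinery are not reproduced here.

    ```python
    import numpy as np

    def relative_area(z, dx, step=1):
        """Relative area of height map `z` at sampling scale step*dx.

        Each cell is split into two triangles; the measured 3-D area is
        divided by the projected 2-D area, so a flat surface returns 1.0.
        """
        zs = z[::step, ::step].astype(float)
        d = dx * step
        a, b = zs[:-1, :-1], zs[:-1, 1:]    # cell corner heights
        c, e = zs[1:, :-1], zs[1:, 1:]
        t1 = 0.5 * d * np.sqrt((b - a)**2 + (c - a)**2 + d**2)
        t2 = 0.5 * d * np.sqrt((e - c)**2 + (e - b)**2 + d**2)
        return (t1 + t2).sum() / (d * d * a.size)

    # evaluate across scales, coarse to fine:
    # rel = [relative_area(z, dx, s) for s in (8, 4, 2, 1)]
    ```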

  4. Open source tools for fluorescent imaging.

    PubMed

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Quantitative Computerized Two-Point Correlation Analysis of Lung CT Scans Correlates With Pulmonary Function in Pulmonary Sarcoidosis

    PubMed Central

    Erdal, Barbaros Selnur; Yildiz, Vedat; King, Mark A.; Patterson, Andrew T.; Knopp, Michael V.; Clymer, Bradley D.

    2012-01-01

    Background: Chest CT scans are commonly used to clinically assess disease severity in patients presenting with pulmonary sarcoidosis. Despite their ability to reliably detect subtle changes in lung disease, the utility of chest CT scans for guiding therapy is limited by the fact that image interpretation by radiologists is qualitative and highly variable. We sought to create a computerized CT image analysis tool that would provide quantitative and clinically relevant information. Methods: We established that a two-point correlation analysis approach reduced the background signal attendant to normal lung structures, such as blood vessels, airways, and lymphatics while highlighting diseased tissue. This approach was applied to multiple lung fields to generate an overall lung texture score (LTS) representing the quantity of diseased lung parenchyma. Using deidentified lung CT scan and pulmonary function test (PFT) data from The Ohio State University Medical Center’s Information Warehouse, we analyzed 71 consecutive CT scans from patients with sarcoidosis for whom simultaneous matching PFTs were available to determine whether the LTS correlated with standard PFT results. Results: We found a high correlation between LTS and FVC, total lung capacity, and diffusing capacity of the lung for carbon monoxide (P < .0001 for all comparisons). Moreover, LTS was equivalent to PFTs for the detection of active lung disease. The image analysis protocol was conducted quickly (< 1 min per study) on a standard laptop computer connected to a publicly available National Institutes of Health ImageJ toolkit. Conclusions: The two-point image analysis tool is highly practical and appears to reliably assess lung disease severity. We predict that this tool will be useful for clinical and research applications. PMID:22628487
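
    The core texture statistic can be sketched as an FFT-based autocorrelation of a binarized slice: the probability that two points at a given separation both fall in the highlighted tissue phase. The binarization step and the aggregation of multiple lung fields into the LTS are assumptions here, as the abstract does not spell out the exact pipeline.

    ```python
    import numpy as np

    def two_point_correlation(mask):
        """S2(dx, dy) for a 2-D boolean mask via FFT autocorrelation.

        Assumes periodic boundaries; S2 at zero separation equals the
        area fraction of the masked phase.
        """
        f = np.fft.rfft2(mask.astype(float))
        corr = np.fft.irfft2(f * np.conj(f), s=mask.shape)
        return corr / mask.size
    ```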

  6. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199

  7. Methodological Variables in the Analysis of Cell-Free DNA.

    PubMed

    Bronkhorst, Abel Jacobus; Aucamp, Janine; Pretorius, Piet J

    2016-01-01

    In recent years, cell-free DNA (cfDNA) analysis has received increasing amounts of attention as a potential non-invasive screening tool for the early detection of genetic aberrations and a wide variety of diseases, especially cancer. However, except for some prenatal tests and BEAMing, a technique used to detect mutations in various genes of cancer patients, cfDNA analysis is not yet routinely applied in clinical practice. Although some confusing biological factors inherent to the in vivo setting play a key part, it is becoming increasingly clear that this struggle is mainly due to the lack of an analytical consensus, especially as regards quantitative analyses of cfDNA. In order to use quantitative analysis of cfDNA with confidence, process optimization and standardization are crucial. In this work we aim to elucidate the most confounding variables of each preanalytical step that must be considered for process optimization and equivalence of procedures.

  8. The FAQUIRE Approach: FAst, QUantitative, hIghly Resolved and sEnsitivity Enhanced 1H, 13C Data.

    PubMed

    Farjon, Jonathan; Milande, Clément; Martineau, Estelle; Akoka, Serge; Giraudeau, Patrick

    2018-02-06

    The targeted analysis of metabolites in complex mixtures is a challenging issue. NMR is one of the major tools in this field, but there is a strong need for more sensitive, better-resolved, and faster quantitative methods. In this framework, we introduce the concept of FAst, QUantitative, hIghly Resolved and sEnsitivity enhanced (FAQUIRE) NMR to push forward the limits of metabolite NMR analysis. Quantitative 2D 1H-13C maps are promising alternatives for enhancing the spectral resolution but are highly time-consuming because of (i) the intrinsic nature of 2D acquisition, (ii) the longer recycling times required for quantitative conditions, and (iii) the higher number of scans needed to lower the limits of detection/quantification and access low-concentration metabolites. To reach this aim, speeding up the recently developed QUantItative Perfected and pUre shifted HSQC (QUIPU HSQC) is an interesting route toward the FAQUIRE concept. Thanks to the combination of spectral aliasing, non-uniform sampling, and variable repetition time, the acquisition time of 2D quantitative maps is reduced by a factor of 6 to 9 while preserving high spectral resolution thanks to a pure-shift approach. The analytical potential of the new Quick QUIPU HSQC (Q QUIPU HSQC) is evaluated on a model metabolite sample, and its potential is shown on breast-cell extracts containing metabolites at millimolar to submillimolar concentrations.

  9. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  10. Quantitative evaluation of DNA damage and mutation rate by atmospheric and room-temperature plasma (ARTP) and conventional mutagenesis.

    PubMed

    Zhang, Xue; Zhang, Chong; Zhou, Qian-Qian; Zhang, Xiao-Fei; Wang, Li-Yan; Chang, Hai-Bo; Li, He-Ping; Oda, Yoshimitsu; Xing, Xin-Hui

    2015-07-01

    DNA damage is the dominant source of mutation, which is the driving force of evolution. Therefore, it is important to quantitatively analyze the DNA damage caused by different mutagenesis methods, the subsequent mutation rates, and their relationship. Atmospheric and room temperature plasma (ARTP) mutagenesis has been used for the mutation breeding of more than 40 microorganisms. However, ARTP mutagenesis has not been quantitatively compared with conventional mutation methods. In this study, the umu test using a flow-cytometric analysis was developed to quantify the DNA damage in individual viable cells using Salmonella typhimurium NM2009 as the model strain and to determine the mutation rate. The newly developed method was used to evaluate four different mutagenesis systems: a new ARTP tool, ultraviolet radiation, 4-nitroquinoline-1-oxide (4-NQO), and N-methyl-N'-nitro-N-nitrosoguanidine (MNNG) mutagenesis. The mutation rate was proportional to the corresponding SOS response induced by DNA damage. ARTP caused greater DNA damage to individual living cells than the other conventional mutagenesis methods, and the mutation rate was also higher. By quantitatively comparing the DNA damage and consequent mutation rate after different types of mutagenesis, we have shown that ARTP is a potentially powerful mutagenesis tool with which to improve the characteristics of microbial cell factories.

  11. Quantitative refractive index distribution of single cell by combining phase-shifting interferometry and AFM imaging.

    PubMed

    Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu

    2017-05-31

    Cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscope (AFM) imaging, we constructed a label-free, non-invasive, quantitative system for measuring the refractive index of single cells, in which the accurate phase map of a single cell is retrieved with the PSI technique and the cell morphology is captured at nanoscale resolution with AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell, respectively. Further, the quantitative change of the refractive index distribution during daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, and from it the content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for the analysis of intracellular biochemical components, which will facilitate its application in revealing cell structure and pathological state from a new perspective.

  12. In silico quantitative structure-toxicity relationship study of aromatic nitro compounds.

    PubMed

    Pasha, Farhan Ahmad; Neaz, Mohammad Morshed; Cho, Seung Joo; Ansari, Mohiuddin; Mishra, Sunil Kumar; Tiwari, Sharvan

    2009-05-01

    Small molecules often have toxicities that are a function of molecular structural features, and minor variations in those features can make a large difference in toxicity. Consequently, in silico techniques may be used to correlate such molecular toxicities with structural features. For nine different sets of aromatic nitro compounds with known observed toxicities against different targets, we developed ligand-based 2D quantitative structure-toxicity relationship models using 20 selected topological descriptors. Topological descriptors have several advantages: they are conformation-independent and fast and easy to compute, yet still yield good results. Multiple linear regression analysis was used to correlate variations in toxicity with molecular properties. The information index on molecular size, the lopping centric index and the Kier flexibility index were identified as fundamental descriptors for different kinds of toxicity, further indicating that molecular size, branching and molecular flexibility might be particularly important factors in quantitative structure-toxicity relationship analysis. This study revealed that topological descriptor-guided quantitative structure-toxicity relationship modeling provides a useful, cost- and time-efficient in silico tool for describing small-molecule toxicities.
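
    As an illustration of the kind of regression used here (a generic descriptor-based least-squares fit on synthetic numbers, not the paper's data or its actual descriptors), a few lines of code suffice to recover coefficients and a squared correlation from a small descriptor table:

```python
import numpy as np

# Hypothetical QSTR-style fit: 9 compounds, 3 topological descriptors
# (e.g., size, centric and flexibility indices) against observed toxicity.
rng = np.random.default_rng(0)
X = rng.random((9, 3))                                # descriptor table
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(9)

X1 = np.column_stack([np.ones(len(X)), X])            # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)         # ordinary least squares
pred = X1 @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(coef.round(2), round(r2, 3))                    # fitted weights and R^2
```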

  13. A comprehensive comparison of tools for differential ChIP-seq analysis

    PubMed Central

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland

    2016-01-01

    ChIP-seq has become a widely adopted genomic assay in recent years for determining binding sites of transcription factors or enrichment of specific histone modifications. Besides detecting enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools was still missing. Here, we have reviewed 14 tools developed to determine differential enrichment between two conditions. They differ in their algorithmic setups and in their range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets, to quantitatively evaluate their performance. Overall, there is great variety in the type of signal detected by these tools, with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273

  14. Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli

    PubMed Central

    van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.

    2016-01-01

    Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438

  15. LOD significance thresholds for QTL analysis in experimental populations of diploid species

    PubMed

    Van Ooijen, J. W.

    1999-11-01

    Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. However, without an easy way to determine which areas of the genome can be designated as statistically significant for containing a gene affecting the quantitative trait of interest, the rate of false positives cannot readily be predicted. In this paper, four tables obtained by large-scale simulations are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations of diploid species with any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.
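
    For intuition only (the paper's tables encode the simulation results; the symbols below are illustrative), genome-wide significance thresholds of this kind typically rest on a Šidák-type conversion between the point-wise and genome-wide false-positive rates, where $M_e$ stands for an effective number of independent test locations along the genome:

```latex
\alpha_{\text{genome}} \;=\; 1 - \left(1 - \alpha_{\text{point}}\right)^{M_e}
\;\approx\; M_e \,\alpha_{\text{point}}
\qquad \text{for small } \alpha_{\text{point}},
```

    so a target genome-wide rate fixes the point-wise significance level, and hence the LOD threshold, once $M_e$ is estimated for a given population type and genome size.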

  16. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  17. Quantitative image analysis for investigating cell-matrix interactions

    NASA Astrophysics Data System (ADS)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
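
    To make the core computation concrete, here is a minimal sketch of the displacement-tracking step behind digital volume correlation: an FFT-based cross-correlation of two image volumes whose peak gives the integer-voxel shift. (Production DVC codes add windowed subvolumes and subvoxel interpolation; the array sizes and synthetic data below are purely illustrative.)

```python
import numpy as np

def dvc_displacement(ref, cur):
    """Estimate the integer-voxel shift between two volumes from the peak
    of their FFT-based cross-correlation (the kernel of digital volume
    correlation)."""
    f_ref = np.fft.fftn(ref - ref.mean())
    f_cur = np.fft.fftn(cur - cur.mean())
    xcorr = np.fft.ifftn(f_ref.conj() * f_cur).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Peaks past the half-size wrap around to negative displacements.
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, xcorr.shape))

# Synthetic check: displace a random volume by a known amount.
rng = np.random.default_rng(0)
ref = rng.random((32, 32, 32))
cur = np.roll(ref, shift=(3, -2, 5), axis=(0, 1, 2))
print(dvc_displacement(ref, cur))  # expected (3, -2, 5)
```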

  18. Diagnosing holographic type dark energy models with the Statefinder hierarchy, composite null diagnostic and $w$-$w'$ pair

    NASA Astrophysics Data System (ADS)

    Zhao, Ze; Wang, Shuang

    2018-03-01

    The main purpose of this work is to distinguish various holographic type dark energy (DE) models, including the ΛHDE, HDE, NADE, and RDE models, by using various diagnostic tools. The first diagnostic tool is the Statefinder hierarchy, in which the evolution of the Statefinder hierarchy parameters $S_3^{(1)}(z)$ and $S_4^{(1)}(z)$ is studied. The second is the composite null diagnostic (CND), in which the trajectories of $\{S_3^{(1)}, \epsilon\}$ and $\{S_4^{(1)}, \epsilon\}$ are investigated, where $\epsilon$ is the fractional growth parameter. The last is the $w$-$w'$ analysis, where $w$ is the equation of state for DE and the prime denotes the derivative with respect to $\ln a$. In the analysis we consider two cases: varying the current fractional DE density $\Omega_{de0}$ and varying the DE model parameter $C$. We find that: (1) both the Statefinder hierarchy and the CND have a qualitative impact on ΛHDE, but only a quantitative impact on HDE. (2) $S_4^{(1)}$ can lead to larger differences than $S_3^{(1)}$, while the CND pair has a stronger ability to distinguish different models than the Statefinder hierarchy. (3) For the case of varying $C$, the $\{w, w'\}$ pair has a qualitative impact on ΛHDE; for the case of varying $\Omega_{de0}$, the $\{w, w'\}$ pair has only a quantitative impact; these results differ from the cases of HDE, RDE, and NADE, in which the $\{w, w'\}$ pair has only a quantitative impact. In conclusion, compared with HDE, RDE, and NADE, the ΛHDE model can be more easily distinguished by using these diagnostic tools.

  19. Quantitative analysis of doped/undoped ZnO nanomaterials using laser assisted atom probe tomography: Influence of the analysis parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amirifar, Nooshin; Lardé, Rodrigue, E-mail: rodrigue.larde@univ-rouen.fr; Talbot, Etienne

    2015-12-07

    In the last decade, atom probe tomography has become a powerful tool to investigate semiconductor and insulator nanomaterials in microelectronics, spintronics, and optoelectronics. In this paper, we report an investigation of zinc oxide nanostructures using atom probe tomography. We observed that the chemical composition of zinc oxide is strongly dependent on the analysis parameters used for atom probe experiments. It was observed that at high laser pulse energies, the electric field at the specimen surface is strongly dependent on the crystallographic directions. This dependence leads to an inhomogeneous field evaporation of the surface atoms, resulting in unreliable measurements. We show that the laser pulse energy has to be well tuned to obtain reliable quantitative chemical composition measurements of undoped and doped ZnO nanomaterials.

  20. The cutting edge - Micro-CT for quantitative toolmark analysis of sharp force trauma to bone.

    PubMed

    Norman, D G; Watson, D G; Burnett, B; Fenne, P M; Williams, M A

    2018-02-01

    Toolmark analysis involves examining marks created on an object to identify the likely tool responsible for creating those marks (e.g., a knife). Although a potentially powerful forensic tool, knife mark analysis is still in its infancy and the validation of imaging techniques as well as quantitative approaches is ongoing. This study builds on previous work by simulating real-world stabbings experimentally and statistically exploring quantitative toolmark properties, such as cut mark angle captured by micro-CT imaging, to predict the knife responsible. In Experiment 1 a mechanical stab rig and two knives were used to create 14 knife cut marks on dry pig ribs. The toolmarks were laser and micro-CT scanned to allow for quantitative measurements of numerous toolmark properties. The findings from Experiment 1 demonstrated that both knives produced statistically different cut mark widths, wall angles and shapes. Experiment 2 examined knife marks created on fleshed pig torsos under conditions designed to better simulate real-world stabbings. Eight knives were used to generate 64 incision cut marks that were also micro-CT scanned. Statistical exploration of these cut marks suggested that knife type, serrated or plain, can be predicted from cut mark width and wall angle. Preliminary results also suggest that knife edge thickness correlates with cut mark width. An additional 16 cut mark walls were imaged for striation marks using scanning electron microscopy, with results suggesting that this approach might not be useful for knife mark analysis. Results also indicated that observer judgements of cut mark shape were more consistent when rated from micro-CT images than from light microscopy images. The potential to combine micro-CT data, medical-grade CT data and photographs to develop highly realistic virtual models for visualisation and 3D printing is also demonstrated. This is the first study to statistically explore simulated real-world knife marks imaged by micro-CT and to demonstrate the potential of quantitative approaches in knife mark analysis. Findings and methods presented in this study are relevant to both forensic toolmark researchers and practitioners. Limitations of the experimental methodologies and imaging techniques are discussed, and further work is recommended. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Quantification of indium in steel using PIXE

    NASA Astrophysics Data System (ADS)

    Oliver, A.; Miranda, J.; Rickards, J.; Cheang, J. C.

    1989-04-01

    The quantitative analysis of steel used in endodontic tools was carried out using low-energy protons (≤ 700 keV). A computer program for thick-target analysis, which includes enhancement due to secondary fluorescence, was used. In this experiment the L-lines of indium are enhanced due to the proximity of other elements' K-lines to the indium absorption edge. The results show that the choice of the ionization cross-section expression employed in the evaluation is important.

  2. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate function with external calibration, the lack of any requirement for identical reference materials, its high precision and accuracy when properly validated, and its ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
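
    As a reminder of the arithmetic underlying such quantitation (the standard textbook relation, stated generically here rather than taken from this review), the purity $P_x$ of an analyte follows from the signal integrals $I$, proton counts $N$, molar masses $M$ and weighed masses $m$ of the analyte ($x$) and a calibrant ($\text{cal}$) of known purity:

```latex
P_x \;=\; \frac{I_x}{I_{\text{cal}}}\cdot\frac{N_{\text{cal}}}{N_x}\cdot
\frac{M_x}{M_{\text{cal}}}\cdot\frac{m_{\text{cal}}}{m_x}\cdot P_{\text{cal}}
```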

  3. Quantitative imaging of heterogeneous dynamics in drying and aging paints

    PubMed Central

    van der Kooij, Hanne M.; Fokkink, Remco; van der Gucht, Jasper; Sprakel, Joris

    2016-01-01

    Drying and aging paint dispersions display a wealth of complex phenomena that make their study fascinating yet challenging. To meet the growing demand for sustainable, high-quality paints, it is essential to unravel the microscopic mechanisms underlying these phenomena. Visualising the governing dynamics is, however, intrinsically difficult because the dynamics are typically heterogeneous and span a wide range of time scales. Moreover, the high turbidity of paints precludes conventional imaging techniques from reaching deep inside the paint. To address these challenges, we apply a scattering technique, Laser Speckle Imaging, as a versatile and quantitative tool to elucidate the internal dynamics, with microscopic resolution and spanning seven decades of time. We present a toolbox of data analysis and image processing methods that allows a tailored investigation of virtually any turbid dispersion, regardless of the geometry and substrate. Using these tools we watch a variety of paints dry and age with unprecedented detail. PMID:27682840
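
    The basic quantity behind laser speckle imaging is the local speckle contrast, the ratio of standard deviation to mean intensity over a small window; below is a minimal sketch on synthetic data (the window size and the random image are illustrative, and the paper's toolbox layers temporal analysis on top of this spatial measure):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(img, w=7):
    """Local speckle contrast K = sigma/mean over w x w windows, the basic
    observable of laser speckle imaging; faster dynamics blur the speckle
    and lower K."""
    win = sliding_window_view(img, (w, w))
    return win.std(axis=(-1, -2)) / win.mean(axis=(-1, -2))

rng = np.random.default_rng(2)
print(speckle_contrast(rng.random((64, 64))).mean())
```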

  4. Focus groups: a useful tool for curriculum evaluation.

    PubMed

    Frasier, P Y; Slatt, L; Kowlowitz, V; Kollisch, D O; Mintzer, M

    1997-01-01

    Focus group interviews have been used extensively in health services program planning, health education, and curriculum planning. However, with the exception of a few reports describing the use of focus groups for a basic science course evaluation and a clerkship's impact on medical students, the potential of focus groups as a tool for curriculum evaluation has not been explored. Focus groups are a valid stand-alone evaluation process, but they are most often used in combination with other quantitative and qualitative methods. Focus groups rely heavily on group interaction, combining elements of individual interviews and participant observation. This article compares the focus group interview with both quantitative and qualitative methods; discusses when to use focus group interviews; outlines a protocol for conducting focus groups, including a comparison of various styles of qualitative data analysis; and offers a case study, in which focus groups evaluated the effectiveness of a pilot preclinical curriculum.

  5. Cognitive Issues in Learning Advanced Physics: An Example from Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Singh, Chandralekha; Zhu, Guangtian

    2009-11-01

    We are investigating cognitive issues in learning quantum mechanics in order to develop effective teaching and learning tools. The analysis of cognitive issues is particularly important for bridging the gap between the quantitative and conceptual aspects of quantum mechanics and for ensuring that the learning tools help students build a robust knowledge structure. We discuss the cognitive aspects of quantum mechanics that are similar or different from those of introductory physics and their implications for developing strategies to help students develop a good grasp of quantum mechanics.

  6. NUclear EVacuation Analysis Code (NUEVAC) : a tool for evaluation of sheltering and evacuation responses following urban nuclear detonations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshimura, Ann S.; Brandt, Larry D.

    2009-11-01

    The NUclear EVacuation Analysis Code (NUEVAC) has been developed by Sandia National Laboratories to support the analysis of shelter-evacuate (S-E) strategies following an urban nuclear detonation. This tool can model a range of behaviors, including complex evacuation timing and path selection, as well as various sheltering or mixed evacuation and sheltering strategies. The calculations are based on externally generated, high resolution fallout deposition and plume data. Scenario setup and calculation outputs make extensive use of graphics and interactive features. This software is designed primarily to produce quantitative evaluations of nuclear detonation response options. However, the outputs have also proven useful in the communication of technical insights concerning shelter-evacuate tradeoffs to urban planning or response personnel.

  7. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*

    PubMed Central

    Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-01-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25% to 45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectral quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  8. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. However, the identification of pathological EEG patterns associated with poor prognosis relies primarily on visual EEG scoring by experts. We introduced a model-based approach to EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independently of visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
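
    A toy illustration of the underlying idea (not the authors' implementation, and with a hypothetical window length and frequency bands): treat each EEG epoch's band-power spectrum as a point in a state space and measure how fast the trajectory moves through that space:

```python
import numpy as np

def state_space_velocity(sig, fs, win_s=2.0):
    """Mean step length of a trajectory through a spectral 'state space':
    log band powers per non-overlapping window serve as coordinates, and
    velocity is the average distance between consecutive points."""
    n = int(win_s * fs)
    epochs = sig[: len(sig) // n * n].reshape(-1, n)
    psd = np.abs(np.fft.rfft(epochs, axis=1)) ** 2
    freqs = np.fft.rfftfreq(n, 1 / fs)
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]   # delta..beta, illustrative
    feats = np.log([psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=1)
                    for lo, hi in bands]).T       # one point per epoch
    return np.linalg.norm(np.diff(feats, axis=0), axis=1).mean()

rng = np.random.default_rng(1)
print(state_space_velocity(rng.standard_normal(30 * 256), fs=256))
```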

  9. Three-dimensional cardiac architecture determined by two-photon microtomy

    NASA Astrophysics Data System (ADS)

    Huang, Hayden; MacGillivray, Catherine; Kwon, Hyuk-Sang; Lammerding, Jan; Robbins, Jeffrey; Lee, Richard T.; So, Peter

    2009-07-01

    Cardiac architecture is inherently three-dimensional, yet most characterizations rely on two-dimensional histological slices or dissociated cells, which remove the native geometry of the heart. We previously developed a method for labeling intact heart sections without dissociation and imaging large volumes while preserving their three-dimensional structure. We further refine this method to permit quantitative analysis of imaged sections. After data acquisition, these sections are assembled using image-processing tools, and qualitative and quantitative information is extracted. By examining the reconstructed cardiac blocks, one can observe end-to-end adjacent cardiac myocytes (cardiac strands) changing cross-sectional geometries, merging and separating from other strands. Quantitatively, representative cross-sectional areas typically used for determining hypertrophy omit the three-dimensional component; we show that taking orientation into account can significantly alter the analysis. Using fast-Fourier transform analysis, we analyze the gross organization of cardiac strands in three dimensions. By characterizing cardiac structure in three dimensions, we are able to determine that the α crystallin mutation leads to hypertrophy with cross-sectional area increases, but not necessarily via changes in fiber orientation distribution.

  10. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements, and the corresponding solutions for the two reference MDO frameworks (a general one and an aircraft-oriented one), were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
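
    To make the AHP step concrete, here is a minimal sketch (with a hypothetical three-criterion comparison matrix, not the paper's actual criteria, and omitting the QFD stage) of how priority weights are obtained from a pairwise-comparison matrix via its principal eigenvector:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix via its
    principal eigenvector. A full AHP also checks the consistency ratio
    against random-index tables, omitted here for brevity."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    w = np.abs(principal)
    return w / w.sum()

# Hypothetical comparison of three framework criteria
# (e.g., usability vs. integration vs. extensibility).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
print(ahp_weights(A).round(3))   # normalized priority weights
```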

  11. The Application of Ultrasonic Inspection to Crimped Electrical Connections

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Perey, Daniel F.; Yost, William T.

    2010-01-01

    The development of a new ultrasonic measurement technique to quantitatively assess wire crimp terminations is discussed. The development of a prototype instrument, based on a modified, commercially available, crimp tool, is demonstrated for applying this technique when wire crimps are installed. The crimp tool has three separate crimping locations that accommodate the three different ferrule diameters. The crimp tool in this study is capable of crimping wire diameters ranging from 12 to 26 American Wire Gauge (AWG). A transducer design is presented that allows for interrogation of each of the three crimp locations on the crimp tool without reconfiguring the device. An analysis methodology, based on transmitted ultrasonic energy and timing of the first received pulse is shown to correlate to both crimp location in the tool and the AWG of the crimp/ferrule combination. The detectability of a number of the crimp failure pathologies, such as missing strands, partially inserted wires and incomplete crimp compression, is discussed. A wave propagation model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process.

  12. Proactive Encouragement of Interdisciplinary Research Teams in a Business School Environment: Strategy and Results

    ERIC Educational Resources Information Center

    Adams, Susan M.; Carter, Nathan C.; Hadlock, Charles R.; Haughton, Dominique M.; Sirbu, George

    2008-01-01

    This case study describes efforts to promote collaborative research across traditional boundaries in a business-oriented university as part of an institutional transformation. We model this activity within the framework of social network analysis and use quantitative tools from that field to characterize resulting impacts. (Contains 4 tables and 2…

  13. Pedagogical Beliefs in Work-Based Learning: An Analysis and Implications of Teachers' Belief Orientations

    ERIC Educational Resources Information Center

    Abukari, Abdulai

    2014-01-01

    This paper presents findings of a research project that aimed to critically examine the pedagogical beliefs of work-based learning teachers, and their potential implication on practice and expectations of learners and employers. Based on an online tool, qualitative and quantitative (descriptive) data were generated from a purposive sample of some…

  14. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    NASA Technical Reports Server (NTRS)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  15. The Concept of Happiness as Conveyed in Visual Representations: Analysis of the Work of Early Childhood Educators

    ERIC Educational Resources Information Center

    Russo-Zimet, Gila; Segel, Sarit

    2014-01-01

    This research was designed to examine how early-childhood educators pursuing their graduate degrees perceive the concept of happiness, as conveyed in visual representations. The research methodology combines qualitative and quantitative paradigms using the metaphoric collage, a tool used to analyze visual and verbal aspects. The research…

  16. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  17. Making a Game out of It: Using Web-Based Competitive Quizzes for Quantitative Analysis Content Review

    ERIC Educational Resources Information Center

    Grinias, James P.

    2017-01-01

    Online student-response systems provide instructors with an easy-to-use tool to instantly evaluate student comprehension. For comprehensive content review, turning this evaluation into a competitive game where students can compete against each other was found to be helpful and enjoyable for participating students. One specific online resource,…

  18. Environmental Inquiry by College Students: Original Research and Peer Review Using Web-Based Collaborative Tools. Preliminary Quantitative Data Analysis.

    ERIC Educational Resources Information Center

    Cakir, Mustafa; Carlsen, William S.

    The Environmental Inquiry (EI) program (Cornell University and Pennsylvania State University) supports inquiry-based, student-centered science teaching on selected topics in the environmental sciences. Texts to support high school student research are published by the National Science Teachers Association (NSTA) in the domains of environmental…

  19. Interactive Book Reading in Early Education: A Tool to Stimulate Print Knowledge as Well as Oral Language

    ERIC Educational Resources Information Center

    Mol, Suzanne E.; Bus, Adriana G.; de Jong, Maria T.

    2009-01-01

    This meta-analysis examines to what extent interactive storybook reading stimulates two pillars of learning to read: vocabulary and print knowledge. The authors quantitatively reviewed 31 (quasi) experiments (n = 2,049 children) in which educators were trained to encourage children to be actively involved before, during, and after joint book…

  20. Effect Size Measures for Mediation Models: Quantitative Strategies for Communicating Indirect Effects

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Kelley, Ken

    2011-01-01

    The statistical analysis of mediation effects has become an indispensable tool for helping scientists investigate processes thought to be causal. Yet, in spite of many recent advances in the estimation and testing of mediation effects, little attention has been given to methods for communicating effect size and the practical importance of those…

  1. Advancing effects analysis for integrated, large-scale wildfire risk assessment

    Treesearch

    Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager

    2011-01-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...

  2. Price Analysis and the Serials Situation: Trying to Solve an Age-Old Problem.

    ERIC Educational Resources Information Center

    Meyers, Barbara; Fleming, Janice L.

    1991-01-01

    Discussion of journal pricing and its effects on academic libraries focuses on data from the Optical Society of America's pricing study that used price per 1,000 words as a quantitative evaluative tool. Data collection methodology is described, and implications of the results for library collection development are suggested. (eight references)…

  3. Elementary Analysis of the Special Relativistic Combination of Velocities, Wigner Rotation and Thomas Precession

    ERIC Educational Resources Information Center

    O'Donnell, Kane; Visser, Matt

    2011-01-01

    The purpose of this paper is to provide an elementary introduction to the qualitative and quantitative results of velocity combination in special relativity, including the Wigner rotation and Thomas precession. We utilize only the most familiar tools of special relativity, in arguments presented at three differing levels: (1) utterly elementary,…

  4. An optimized method to calculate error correction capability of tool influence function in frequency domain

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Hou, Xi; Wan, Yongjian; Shi, Chunyan

    2017-10-01

    An optimized method for calculating the error correction capability of a tool influence function (TIF) under given polishing conditions is proposed, based on a smoothing spectral function. The basic mathematical model for this method is established theoretically. A set of polishing experimental data obtained with a rigid conformal tool is used to validate the optimized method. The calculated results quantitatively indicate the error correction capability of the TIF for errors at different spatial frequencies under given polishing conditions. A comparative analysis with the previous method shows that the optimized method is simpler in form and achieves results of the same accuracy in less computing time.
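
    As a frequency-domain illustration only (a generic low-pass picture with assumed, hypothetical parameters, not the paper's optimized metric), the Fourier magnitude of a TIF already indicates how strongly errors at each spatial frequency can be corrected:

```python
import numpy as np

# Sketch: read the relative correction capability of a Gaussian TIF at a
# few spatial frequencies from its Fourier transfer magnitude.
n, dx = 1024, 1e-3                              # samples, sampling step [m]
x = (np.arange(n) - n // 2) * dx
tif = np.exp(-x**2 / (2 * (5e-3) ** 2))         # 5 mm Gaussian footprint
tif /= tif.sum()                                # normalize removal profile
H = np.abs(np.fft.rfft(np.fft.ifftshift(tif)))  # transfer magnitude
f = np.fft.rfftfreq(n, dx)                      # spatial frequency [1/m]
for fi in (10, 50, 200):
    k = np.argmin(np.abs(f - fi))
    print(f"{f[k]:6.1f} cycles/m: relative correction ~ {H[k]:.2f}")
```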

  5. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    Fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during an experimental immune response in vivo, as it provides a simple and very useful tool for this purpose. The motivation comes from the need for robust, simple quantification and presentation of inflammation data based on vascular permeability. Tracking changes in fluorescence intensity as a function of time is a widely accepted method for assessing vascular permeability during inflammation related to the immune response. In the present study we bring a new dimension by applying a more sophisticated, quantitative approach to the analysis of the vascular reaction, based on methods derived from astronomical observations: space-time Fourier filtering followed by a decomposition into orthogonal polynomial modes. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach makes it possible to determine the regions of permeability and to monitor both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with its extravasation. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.

  6. Test-Analysis Correlation for Space Shuttle External Tank Foam Impacting RCC Wing Leading Edge Component Panels

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2008-01-01

    The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.

  7. Developing a postal screening tool for frailty in primary care: a secondary data analysis.

    PubMed

    Kydd, Lauren

    2016-07-01

    The purpose of this secondary data analysis (SDA) was to review a subset of quantitative and qualitative paired data sets from a returned postal screening tool (PST) completed by patients and compare them to the clinical letters composed by elderly care community nurses (ECCN) following patient assessment to ascertain the tool's reliability and validity. The aim was to understand to what extent the problems identified by patients in PSTs aligned with actual or potential problems identified by the ECCNs. The researcher examined this connection to establish whether the PST was a valid, reliable approach to proactive care. The findings of this SDA indicated that patients did understand the PST. Many appropriate referrals were made as a result of the ECCN visit that would not have occurred if the PST had not been sent. This article focuses specifically upon the physiotherapy section as this was the area where the most red flags were identified.

  8. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework.

    PubMed

    Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.

  9. Adaptation of a Simple Microfluidic Platform for High-Dimensional Quantitative Morphological Analysis of Human Mesenchymal Stromal Cells on Polystyrene-Based Substrates.

    PubMed

    Lam, Johnny; Marklein, Ross A; Jimenez-Torres, Jose A; Beebe, David J; Bauer, Steven R; Sung, Kyung E

    2017-12-01

    Multipotent stromal cells (MSCs, often called mesenchymal stem cells) have garnered significant attention within the field of regenerative medicine because of their purported ability to differentiate down musculoskeletal lineages. Given the inherent heterogeneity of MSC populations, recent studies have suggested that cell morphology may be indicative of MSC differentiation potential. Toward improving current methods and developing simple yet effective approaches for the morphological evaluation of MSCs, we combined passive pumping microfluidic technology with high-dimensional morphological characterization to produce robust tools for standardized high-throughput analysis. Using ultraviolet (UV) light as a modality for reproducible polystyrene substrate modification, we show that MSCs seeded on microfluidic straight channel devices incorporating UV-exposed substrates exhibited morphological changes that responded accordingly to the degree of substrate modification. Substrate modification also effected greater morphological changes in MSCs seeded at a lower rather than higher density within microfluidic channels. Despite largely comparable trends in morphology, MSCs seeded in microscale as opposed to traditional macroscale platforms displayed much higher sensitivity to changes in substrate properties. In summary, we adapted and qualified microfluidic cell culture platforms comprising simple straight channel arrays as a viable and robust tool for high-throughput quantitative morphological analysis to study cell-material interactions.

  10. PeptideDepot: Flexible Relational Database for Visual Analysis of Quantitative Proteomic Data and Integration of Existing Protein Information

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2010-01-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895

  11. Eliciting women's cervical screening preferences: a mixed methods systematic review protocol.

    PubMed

    Wood, Brianne; Van Katwyk, Susan Rogers; El-Khatib, Ziad; McFaul, Susan; Taljaard, Monica; Wright, Erica; Graham, Ian D; Little, Julian

    2016-08-11

    With the accumulation of evidence regarding potential harms of cancer screening in recent years, researchers, policy-makers, and the public are becoming more critical of population-based cancer screening. Consequently, a high-quality cancer screening program should consider individuals' values and preferences when determining recommendations. In cervical cancer screening, offering women autonomy is considered a "person-centered" approach to health care services; however, it may impact the effectiveness of the program should women choose to not participate. As part of a larger project to investigate women's cervical screening preferences and correlates of these preferences, this systematic review will capture quantitative and qualitative investigations of women's cervical screening preferences and the methods used to elicit them. This mixed methods synthesis will use a thematic analysis approach to synthesize qualitative, quantitative, and mixed methods evidence. This protocol describes the methods that will be used in this investigation. A search strategy has been developed with a health librarian and peer reviewed using PRESS. Based on this strategy, five databases and the gray literature will be searched for studies that meet the inclusion criteria. The quality of the included individual studies will be examined using the Mixed Methods Appraisal Tool. Three reviewers will extract data from the primary studies on the tools or instruments used to elicit women's preferences regarding cervical cancer screening, theoretical frameworks used, outcomes measured, the outstanding themes from quantitative and qualitative evidence, and the identified preferences for cervical cancer screening. We will describe the relationships between study results and the study population, "intervention" (e.g., tool or instrument), and context. We will follow the PRISMA reporting guideline. We will compare findings across studies and between study methods (e.g., qualitative versus quantitative study designs). The strength of the synthesized findings will be assessed using the validated GRADE and CERQual tools. This review will inform the development of a tool to elicit women's cervical screening preferences. Understanding the methods used to elicit women's preferences and what is known about women's cervical screening preferences will be useful for guideline developers who wish to incorporate a woman-centered approach specifically for cervical screening guidelines. PROSPERO CRD42016035737.

  12. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management should embrace and apply in its decision framework.

  13. chipPCR: an R package to pre-process raw data of amplification curves.

    PubMed

    Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter

    2015-09-01

    Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. stefan.roediger@b-tu.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
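
    The 5-point stencil mentioned above is a standard finite-difference scheme; re-sketched here in Python (chipPCR itself is an R package, and the sigmoid curve below is an idealized stand-in for real amplification data), the derivative's peak marks where the curve rises fastest:

```python
import numpy as np

def five_point_derivative(y, h=1.0):
    """Central 5-point-stencil first derivative:
    f'(x) ~ (-f(x+2h) + 8 f(x+h) - 8 f(x-h) + f(x-2h)) / (12 h)."""
    y = np.asarray(y, dtype=float)
    d = np.empty_like(y)
    d[2:-2] = (-y[4:] + 8 * y[3:-1] - 8 * y[1:-3] + y[:-4]) / (12 * h)
    d[:2] = d[2]    # crude edge handling, enough for a demo
    d[-2:] = d[-3]
    return d

cycles = np.arange(40.0)
curve = 1 / (1 + np.exp(-(cycles - 20) / 2))     # idealized sigmoid curve
print(np.argmax(five_point_derivative(curve)))   # steepest rise near cycle 20
```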

  14. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
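
    For reference, the arithmetic at the heart of the standard curve approach is simple to sketch (the slopes, intercepts and Cq values below are hypothetical; quantGenius adds its QC-based decision logic on top of this calculation):

```python
import numpy as np

def standard_curve_quantity(cq, slope, intercept):
    """Back-calculate quantity from a Cq value using a standard curve
    Cq = slope * log10(quantity) + intercept."""
    return 10 ** ((np.asarray(cq) - intercept) / slope)

# Hypothetical assays: a perfectly efficient assay has slope ~ -3.32.
target = standard_curve_quantity(24.1, slope=-3.35, intercept=38.0)
reference = standard_curve_quantity(21.7, slope=-3.30, intercept=36.5)
print(target / reference)   # expression normalized to one reference gene
```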

  15. Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.

    PubMed

    Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola

    2016-07-01

    Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and to the quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.

  16. Improving Attachments of Non-Invasive (Type III) Electronic Data Loggers to Cetaceans

    DTIC Science & Technology

    2015-09-30

    animals in human care will be performed to test and validate this approach. The cadaver trials will enable controlled testing to failure or with both...quantitative metrics and analysis tools to assess the impact of a tag on the animal. Here we will present: 1) the characterization of the mechanical...fine scale motion analysis for swimming animals. 2 APPROACH Our approach is divided into four subtasks: Task 1: Forces and failure modes

  17. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    support of the fleet. The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd...inventory level of 1K Polysiloxane in support of the fleet...Chapter I. C. CONCLUSIONS As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane

  18. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built, ultimately resulting in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is proposed here for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), in which the participants drew cognitive maps depicting their views. The analysis of the cognitive maps revealed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perceptions and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about best practices is not yet apparent. In the work described here, we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards, and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and is straightforward in its application.

  20. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructures to maintain its function, and disruption of one infrastructure system may have severe consequences. Norwegian municipalities have, according to legislation, a duty to carry out risk and vulnerability analyses and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to screen the natural hazard scenarios threatening the infrastructures identified in the level 1 analysis and to investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure and the importance of the infrastructure, as well as interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators with quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented, in which the risk to primary roads, water supply and power networks threatened by storms and landslides is assessed. The application examples show that the proposed model provides a useful screening tool for undesirable events, with the ultimate goal of reducing societal vulnerability.
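
    A minimal sketch of the level 2 aggregation described above follows: indicators ranked 1-5 are combined with a hazard frequency and the number of users. The indicator names and the multiplicative weighting are illustrative assumptions, not the authors' exact model.

    ```python
    # Semi-quantitative screening score for one hazard/infrastructure scenario.
    robustness = {"physical_vulnerability": 4, "redundancy": 3}    # 1 = best, 5 = worst
    importance = {"societal_dependency": 5, "cascading_potential": 2}

    vulnerability_score = sum(robustness.values()) / len(robustness)
    importance_score = sum(importance.values()) / len(importance)

    hazard_frequency = 0.05     # expected events per year (quantitative input)
    users = 20_000              # number of users of the infrastructure

    # Aggregated relative risk: frequency x vulnerability x societal importance.
    risk = hazard_frequency * vulnerability_score * importance_score * users
    print(f"screening score: {risk:,.0f} (for ranking scenarios, not an absolute risk)")
    ```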

  1. The effects of GeoGebra software on pre-service mathematics teachers' attitudes and views toward proof and proving

    NASA Astrophysics Data System (ADS)

    Zengin, Yılmaz

    2017-11-01

    The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving, and to determine the pre-service teachers' pre- and post-intervention views regarding proof. The study lasted nine weeks and involved 24 pre-service mathematics teachers. The study used the 'Attitude Scale Towards Proof and Proving' and an open-ended questionnaire, administered before and after the intervention, as data collection tools. Paired-samples t-test analysis was used for the analysis of the quantitative data, and content and descriptive analyses were utilized for the analysis of the qualitative data. As a result of the data analysis, it was determined that GeoGebra software was an effective tool in improving pre-service teachers' attitudes towards proof and proving.
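
    The quantitative step above is a paired-samples t-test; a minimal sketch on hypothetical pre/post attitude-scale scores for 24 participants follows (the score distributions are invented, not the study's data).

    ```python
    # Paired-samples t-test with a paired-data effect size (toy scores).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    pre = rng.normal(60, 10, 24)                # hypothetical pre-test scores
    post = pre + rng.normal(5, 8, 24)           # hypothetical post-test scores

    t, p = stats.ttest_rel(post, pre)           # paired (dependent) samples t-test
    d = (post - pre).mean() / (post - pre).std(ddof=1)   # Cohen's d for paired data
    print(f"t(23) = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
    ```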

  2. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge systems are prone to foaming occurrences, causing the sludge to rise in the reactor and affecting wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of foaming-event prediction tools, one that may be filled by quantitative monitoring of AS systems' biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs by quantitative analysis of protozoa, metazoa, filamentous bacteria and sludge characteristics, further used to elucidate the inner relationships between these parameters. In the current study, a conventional activated sludge (CAS) system and an oxidation ditch (OD) were surveyed for periods of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data were then processed by principal components analysis (PCA), cross-correlation analysis and decision trees to assess the foaming occurrences and elucidate the inner relationships. It was found that such events were best assessed by the combined use of the relative abundance of testate amoeba and the nocardioform filamentous index, presenting a 92.9% success rate for overall foaming events, and 87.5% and 100%, respectively, for persistent and mild events.

  3. Quantitative analysis of crystalline pharmaceuticals in tablets by pattern-fitting procedure using X-ray diffraction pattern.

    PubMed

    Takehira, Rieko; Momose, Yasunori; Yamamura, Shigeo

    2010-10-15

    A pattern-fitting procedure using X-ray diffraction patterns was applied to the quantitative analysis of a binary system of crystalline pharmaceuticals in tablets. Orthorhombic crystals of isoniazid (INH) and mannitol (MAN) were used for the analysis. Tablets were prepared under various compression pressures using a direct compression method with various compositions of INH and MAN. Assuming that the X-ray diffraction pattern of the INH-MAN system consists of diffraction intensities from the respective crystals, the observed diffraction intensities were fitted to an analytic expression based on X-ray diffraction theory and separated into two intensities from the INH and MAN crystals by a nonlinear least-squares procedure. After separation, the INH contents were determined using the optimized normalization constants for INH and MAN. A correction parameter encompassing all the factors beyond experimental control was required for quantitative analysis without a calibration curve. The pattern-fitting procedure made it possible to determine crystalline phases over the range of 10-90% (w/w) INH content. Further, certain characteristics of the crystals in the tablets, such as preferred orientation, crystallite size, and lattice disorder, were determined simultaneously. This method can be adopted to analyze compounds whose crystal structures are known. It is a potentially powerful tool for quantitative phase analysis and the characterization of crystals in tablets and powders using X-ray diffraction patterns. Copyright 2010 Elsevier B.V. All rights reserved.
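
    The separation step can be illustrated with a much-simplified linear version: the observed pattern is decomposed onto two reference patterns by non-negative least squares. Real pattern fitting also refines peak shapes, preferred orientation and lattice parameters; the Gaussian "phases" below are hypothetical, not INH/MAN data.

    ```python
    # Two-phase decomposition of a diffraction pattern (toy example).
    import numpy as np
    from scipy.optimize import nnls

    two_theta = np.linspace(5, 40, 700)
    def gauss(center, width):
        return np.exp(-((two_theta - center) / width) ** 2)

    pattern_a = gauss(12, 0.3) + 0.6 * gauss(25, 0.4)   # stand-in for phase A
    pattern_b = gauss(17, 0.3) + 0.8 * gauss(31, 0.4)   # stand-in for phase B
    noise = np.random.default_rng(2).normal(0, 0.01, two_theta.size)
    observed = 0.7 * pattern_a + 0.3 * pattern_b + noise

    # Solve observed ~= w_a * pattern_a + w_b * pattern_b with non-negative weights.
    weights, _ = nnls(np.column_stack([pattern_a, pattern_b]), observed)
    fractions = weights / weights.sum()
    print(f"estimated phase fractions: A = {fractions[0]:.2f}, B = {fractions[1]:.2f}")
    ```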

  4. Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.

    PubMed

    Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A

    2018-01-08

    For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.

  5. A review of presented mathematical models in Parkinson's disease: black- and gray-box models.

    PubMed

    Sarbaz, Yashar; Pourakbari, Hakimeh

    2016-06-01

    Parkinson's disease (PD), one of the most common movement disorders, is caused by damage to the central nervous system. Despite all of the studies on PD, the mechanism behind the formation of its symptoms remains unknown. It is still not obvious why damage only to the substantia nigra pars compacta, a small part of the brain, causes a wide range of symptoms. Moreover, the causes of the brain damage remain to be fully elucidated. An exact understanding of brain function seems to be out of reach at present. On the other hand, some engineering tools attempt to describe the behavior and performance of complex systems, and modeling is one of the most important tools in this regard. The development of quantitative models for this disease began in recent decades. Such models are very effective not only for a better understanding of the disease, for offering new therapies, and for its prediction and control, but also for its early diagnosis. Modeling studies fall into two main groups: black-box models and gray-box models. Generally, in black-box modeling, the symptom is considered only as the output, regardless of the system's internal structure. Such models, besides supporting quantitative analysis studies, increase our knowledge of the disorder's behavior and the disease symptoms. Gray-box models consider the structures involved in the appearance of symptoms as well as the final disease symptoms. These models can effectively save time, be cost-effective for researchers, and help them select appropriate treatment mechanisms among all possible options. In this review paper, efforts are first made to investigate some studies on PD quantitative analysis. Then, PD quantitative models are reviewed. Finally, the results of using such models are presented to some extent.

  6. An integrated microfluidic sensor for real-time detection of RNA in seawater using preserved reagents

    NASA Astrophysics Data System (ADS)

    Tsaloglou, M.-N.; Loukas, C. M.; Ruano-López, J. M.; Morgan, H.; Mowlem, M. C.

    2012-04-01

    Quantitation of RNA sequences coding either for key metabolic proteins or for highly conserved ribosomal subunits can provide insight into cell abundance, speciation and viability. Nucleic acid sequence-based amplification (NASBA) is an isothermal alternative to traditional nucleic acid amplification methods such as quantitative PCR. We present here an integrated microfluidic sensor for cell concentration and lysis, RNA extraction/purification and quantitative RNA detection for environmental applications. The portable system uses pre-loaded reagents, stored as a gel on a disposable microfluidic cartridge, which is manufactured using low-cost injection moulding. The NASBA reaction is monitored in real time using a bespoke control unit which includes: an external fluorescence detector, three peristaltic micro-pumps, two heaters and temperature sensors, a battery, seven pin-actuated micro-motors (valve actuators), and an automatic cartridge insertion mechanism. The system has USB connectivity, and none of the expensive components require replacing between reactions. Long-term storage of reagents is critically important for any diagnostic tool that will be used in the field, whether for medical or environmental analysis, and has not previously been demonstrated for NASBA reagents on-chip. We have shown effective amplification for as few as 500 cells of the toxic microalga Karenia brevis using reagents that had been preserved as a gel for 45 days. This is the first reported real-time isothermal RNA amplification with on-chip reagent preservation. Annealing of primers, amplification at 41 °C and real-time fluorescence detection using, also for the first time, an internal control and sequence-specific molecular beacons were all performed on our microfluidic sensor. Our results show excellent promise for a future quantitative tool for in situ phytoplankton analysis and other environmental applications, where long-term reagent storage and low power consumption are essential.

  7. DIGE Analysis of Human Tissues.

    PubMed

    Gelfi, Cecilia; Capitanio, Daniele

    2018-01-01

    Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed not to interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, provided a proper labeling protocol is followed. Two-dye or three-dye systems can be adopted, with the choice depending on the specific application. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generate robust quantitative results, making data validation by independent technologies successful.

  8. Potential use of combining the diffusion equation with the free Schrödinger equation to improve the Optical Coherence Tomography image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.

    2006-03-01

    Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolution than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of OCT images, obtained by combining the diffusion equation with the free Schrödinger equation. In this formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach performs well in speckle noise removal, enhancement and segmentation of the various cellular layers of the retina.
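
    To make the real-to-complex idea concrete, here is a minimal linear complex diffusion sketch of the kind used in the image-processing literature: with a small phase angle, the real part of the evolving image is smoothed while the imaginary part behaves like a scaled edge map. The parameters are illustrative assumptions, not the authors' scheme.

    ```python
    # Linear complex diffusion on a noisy image (toy example).
    import numpy as np

    def laplacian(img):
        """5-point Laplacian with periodic (wrap-around) borders."""
        return (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)

    def complex_diffusion(img, theta=np.pi / 30, dt=0.1, steps=50):
        u = img.astype(complex)
        c = np.exp(1j * theta)      # complex diffusion coefficient
        for _ in range(steps):
            u = u + dt * c * laplacian(u)
        return u.real, u.imag       # real: denoised image; imag: edge-like map

    noisy = 1.0 + np.random.default_rng(3).normal(0, 0.1, (64, 64))
    denoised, edges = complex_diffusion(noisy)
    ```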

  9. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  10. Morphological analysis of dendrites and spines by hybridization of ridge detection with twin support vector machine.

    PubMed

    Wang, Shuihua; Chen, Mengmeng; Li, Yang; Shao, Ying; Zhang, Yudong; Du, Sidan; Wu, Jane

    2016-01-01

    Dendritic spines are small neuronal protrusions. The morphology of dendrites and dendritic spines is strongly related to their function and plays an important role in understanding brain function. Quantitative analysis of dendrites and dendritic spines is therefore essential to an understanding of the formation and function of the nervous system. However, highly efficient tools for such quantitative analysis are still lacking. In this paper we propose a novel three-step cascaded algorithm, RTSVM, which is composed of ridge detection as the curvature-structure identifier for backbone extraction, boundary location based on differences in density, Hu moments as features, and Twin Support Vector Machine (TSVM) classifiers for spine classification. Our data demonstrate that this newly developed algorithm performs better than other available techniques in terms of detection accuracy and false alarm rate. This algorithm will be useful in neuroscience research.
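
    The feature/classifier stage can be sketched as follows: Hu moment invariants of candidate regions feed a support vector classifier. scikit-learn has no twin SVM, so a standard SVC stands in for the TSVM here, and the binary masks are random placeholders rather than real microscopy data.

    ```python
    # Hu-moment features + SVM classification of candidate spine regions (toy data).
    import cv2
    import numpy as np
    from sklearn.svm import SVC

    def hu_features(mask):
        """Log-scaled Hu moment invariants of a binary region mask."""
        hu = cv2.HuMoments(cv2.moments(mask.astype(np.uint8))).ravel()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

    rng = np.random.default_rng(4)
    masks = rng.integers(0, 2, (100, 32, 32))    # hypothetical candidate regions
    labels = rng.integers(0, 2, 100)             # 1 = spine, 0 = not a spine

    X = np.array([hu_features(m) for m in masks])
    clf = SVC(kernel="rbf").fit(X, labels)
    print("predicted:", clf.predict(X[:5]))
    ```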

  11. Simultaneous quantitative analysis of olmesartan, amlodipine and hydrochlorothiazide in their combined dosage form utilizing classical and alternating least squares based chemometric methods.

    PubMed

    Darwish, Hany W; Bakheit, Ahmed H; Abdelhameed, Ali S

    2016-03-01

    Simultaneous spectrophotometric analysis of a multi-component dosage form of olmesartan, amlodipine and hydrochlorothiazide used for the treatment of hypertension has been carried out using various chemometric methods. The multivariate calibration methods include classical least squares (CLS) executed by net analyte processing (NAP-CLS), orthogonal signal correction (OSC-CLS) and direct orthogonal signal correction (DOSC-CLS), in addition to multivariate curve resolution-alternating least squares (MCR-ALS). Results demonstrated the efficiency of the proposed methods as quantitative analysis tools, as well as their qualitative capability. The three analytes were determined precisely using the aforementioned methods in an external data set and in a dosage form after optimization of experimental conditions. Finally, the efficiency of the models was validated via comparison with the partial least squares (PLS) method in terms of accuracy and precision.
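
    The plain CLS step underlying these methods is a linear least-squares problem: mixture spectrum = concentrations x pure-component spectra. The Gaussian bands below are hypothetical placeholders, not measured spectra of the three drugs.

    ```python
    # Classical least squares (CLS) estimation of a three-component mixture (toy data).
    import numpy as np

    wavelengths = np.linspace(220, 320, 200)
    def band(center, width):
        return np.exp(-((wavelengths - center) / width) ** 2)

    K = np.vstack([band(240, 12), band(265, 10), band(290, 14)])  # pure spectra, 3 x 200
    true_conc = np.array([0.5, 1.2, 0.8])
    mixture = true_conc @ K + np.random.default_rng(5).normal(0, 0.002, wavelengths.size)

    # Estimate concentrations: solve mixture ~= conc @ K for conc.
    conc_hat, *_ = np.linalg.lstsq(K.T, mixture, rcond=None)
    print("estimated concentrations:", np.round(conc_hat, 3))
    ```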

  12. The use of exergetic indicators in the food industry - A review.

    PubMed

    Zisopoulos, Filippos K; Rossier-Miranda, Francisco J; van der Goot, Atze Jan; Boom, Remko M

    2017-01-02

    Assessment of sustainability will become more relevant for the food industry in the years to come. Analysis based on exergy, including the use of exergetic indicators and Grassmann diagrams, is a useful tool for the quantitative and qualitative assessment of the efficiency of industrial food chains. In this paper, we review the methodology of exergy analysis and the exergetic indicators that are most appropriate for use in the food industry. The challenges of applying exergy analysis in industrial food chains and the specific features of food processes are also discussed.

  13. Mycotoxin analysis: an update.

    PubMed

    Krska, Rudolf; Schubert-Ullrich, Patricia; Molinelli, Alexandra; Sulyok, Michael; MacDonald, Susan; Crews, Colin

    2008-02-01

    Mycotoxin contamination of cereals and related products used for feed can cause intoxication, especially in farm animals. Therefore, efficient analytical tools for the qualitative and quantitative analysis of toxic fungal metabolites in feed are required. Current methods usually include an extraction step, a clean-up step to reduce or eliminate unwanted co-extracted matrix components and a separation step with suitably specific detection ability. Quantitative methods of analysis for most mycotoxins use immunoaffinity clean-up with high-performance liquid chromatography (HPLC) separation in combination with UV and/or fluorescence detection. Screening of samples contaminated with mycotoxins is frequently performed by thin layer chromatography (TLC), which yields qualitative or semi-quantitative results. Nowadays, enzyme-linked immunosorbent assays (ELISA) are often used for rapid screening. A number of promising methods, such as fluorescence polarization immunoassays, dipsticks, and even newer methods such as biosensors and non-invasive techniques based on infrared spectroscopy, have shown great potential for mycotoxin analysis. Currently, there is a strong trend towards the use of multi-mycotoxin methods for the simultaneous analysis of several of the important Fusarium mycotoxins, which is best achieved by LC-MS/MS (liquid chromatography with tandem mass spectrometry). This review focuses on recent developments in the determination of mycotoxins with a special emphasis on LC-MS/MS and emerging rapid methods.

  14. Accurate ECG diagnosis of atrial tachyarrhythmias using quantitative analysis: a prospective diagnostic and cost-effectiveness study.

    PubMed

    Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M

    2010-11-01

    Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found that 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity; these were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but diagnosed correctly by quantitative analysis in 83% of cases (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost saving of $1,303 and a gain of 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.
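
    A two-strategy Markov cohort model of the kind used above can be sketched in a few lines; the states, transition probabilities, costs and utilities below are invented for illustration and are not the study's model inputs.

    ```python
    # Markov cohort model comparing two diagnostic strategies (toy numbers).
    import numpy as np

    # Transition matrices per annual cycle: rows = from-state, cols = to-state,
    # over the states (well, complication, dead).
    P_algorithm = np.array([[0.96, 0.03, 0.01],
                            [0.00, 0.90, 0.10],
                            [0.00, 0.00, 1.00]])
    P_visual = np.array([[0.93, 0.05, 0.02],
                         [0.00, 0.90, 0.10],
                         [0.00, 0.00, 1.00]])
    cost = np.array([500.0, 5000.0, 0.0])       # annual cost per state
    utility = np.array([0.85, 0.60, 0.0])       # QALY weight per state

    def run(P, cycles=10):
        dist = np.array([1.0, 0.0, 0.0])        # whole cohort starts in "well"
        total_cost = total_qaly = 0.0
        for _ in range(cycles):
            dist = dist @ P
            total_cost += dist @ cost
            total_qaly += dist @ utility
        return total_cost, total_qaly

    (c_a, q_a), (c_v, q_v) = run(P_algorithm), run(P_visual)
    print(f"savings per patient: ${c_v - c_a:,.0f}, QALYs gained: {q_a - q_v:.3f}")
    ```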

  15. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
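
    The modeling step above (a Cox model on CA19-9 plus texture features, scored by concordance index on held-out data) can be sketched as follows; the data frame is synthetic and the feature names are illustrative, not the paper's.

    ```python
    # Cox survival model with a held-out concordance index (synthetic data).
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.utils import concordance_index

    rng = np.random.default_rng(6)
    n = 161
    df = pd.DataFrame({
        "ca19_9": rng.lognormal(4, 1, n),
        "texture_entropy": rng.normal(5, 1, n),   # hypothetical CT texture feature
        "months": rng.exponential(20, n),
        "event": rng.integers(0, 2, n),
    })
    train, test = df.iloc[:113], df.iloc[113:]    # ~70/30 split as in the study

    cph = CoxPHFitter().fit(train, duration_col="months", event_col="event")
    risk = cph.predict_partial_hazard(test)       # higher = worse predicted outcome
    c = concordance_index(test["months"], -risk, test["event"])
    print(f"test c-index: {c:.2f}")
    ```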

  16. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, rib movement is assessed by fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images and assembled into vector maps. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and accurately distinguish movements of ribs from those of other lung structures. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: limited rib movements appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
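
    Local velocity vectors between consecutive frames can be sketched with dense optical flow; OpenCV's Farneback method is used here as a stand-in for the paper's local-area velocity measurement, and the frames are synthetic rather than dynamic FPD images.

    ```python
    # Dense optical flow as a velocity-vector map between two frames (toy frames).
    import cv2
    import numpy as np

    rng = np.random.default_rng(7)
    frame0 = (rng.random((256, 256)) * 255).astype(np.uint8)
    frame1 = np.roll(frame0, 2, axis=0)           # simulate 2-pixel downward motion

    flow = cv2.calcOpticalFlowFarneback(frame0, frame1, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # Average velocity vector in one local region (e.g., a rib segment).
    roi = flow[100:140, 80:160]
    vx, vy = roi[..., 0].mean(), roi[..., 1].mean()
    print(f"mean ROI velocity: ({vx:.2f}, {vy:.2f}) pixels/frame")
    ```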

  17. Computational tool for the early screening of monoclonal antibodies for their viscosities

    PubMed Central

    Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L

    2016-01-01

    Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600

  18. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  19. Semi-quantitative methods yield greater inter- and intraobserver agreement than subjective methods for interpreting 99m technetium-hydroxymethylene-diphosphonate uptake in equine thoracic processi spinosi.

    PubMed

    van Zadelhoff, Claudia; Ehrle, Anna; Merle, Roswitha; Jahn, Werner; Lischer, Christoph

    2018-05-09

    Scintigraphy is a standard diagnostic method for evaluating horses with back pain due to suspected thoracic processus spinosus pathology. Lesion detection is based on subjective or semi-quantitative assessments of increased uptake. This retrospective, analytical study aimed to compare semi-quantitative and subjective methods in the evaluation of scintigraphic images of the processi spinosi in the equine thoracic spine. Scintigraphic images of 20 Warmblood horses, presented for assessment of orthopedic conditions between 2014 and 2016, were included in the study. Randomized, blinded image evaluation was performed by 11 veterinarians using subjective and semi-quantitative methods. Subjective grading was performed for the analysis of red-green-blue and grayscale scintigraphic images, which were presented in full size or as masked images. For the semi-quantitative assessment, observers placed regions of interest over each processus spinosus. The uptake ratio of each processus spinosus relative to a reference region of interest was determined. Subsequently, a modified semi-quantitative calculation was developed whereby only the highest counts-per-pixel values for a specified number of pixels were processed. Inter- and intraobserver agreement was calculated using intraclass correlation coefficients. Inter- and intraobserver intraclass correlation coefficients were 41.65% and 71.39%, respectively, for the subjective image assessment. Additionally, a correlation between intraobserver agreement, experience, and grayscale images was identified. The inter- and intraobserver agreement increased significantly when using semi-quantitative analysis (97.35% and 98.36%, respectively) or the modified semi-quantitative calculation (98.61% and 98.82%, respectively). The proposed modified semi-quantitative technique showed higher inter- and intraobserver agreement than the other methods, which makes it a useful tool for the analysis of scintigraphic images. The association of the findings from this study with clinical and radiological examinations requires further investigation. © 2018 American College of Veterinary Radiology.
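
    The modified calculation described above can be sketched as a top-N pixel ratio: only the N highest counts-per-pixel values in each region contribute to the uptake ratio. N and the count arrays are illustrative choices, not the study's parameters.

    ```python
    # Uptake ratio from the N hottest pixels of each region of interest (toy counts).
    import numpy as np

    def uptake_ratio(roi_counts, reference_counts, n_top=25):
        top = np.sort(roi_counts.ravel())[-n_top:]          # N hottest ROI pixels
        ref = np.sort(reference_counts.ravel())[-n_top:]
        return top.mean() / ref.mean()

    rng = np.random.default_rng(8)
    processus_roi = rng.poisson(40, (20, 20))     # counts over one processus spinosus
    reference_roi = rng.poisson(30, (20, 20))     # reference region
    print(f"uptake ratio: {uptake_ratio(processus_roi, reference_roi):.2f}")
    ```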

  20. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality in an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, relies on data, and is fact-based. The methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, in which 11 samples were repetitively analysed and Certified Reference Material (CRM) was included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, representing both a management discipline and a standardised approach to problem solving and process optimisation.

  1. Systematic review of methods for quantifying teamwork in the operating theatre

    PubMed Central

    Marshall, D.; Sykes, M.; McCulloch, P.; Shalhoub, J.; Maruthappu, M.

    2018-01-01

    Background Teamwork in the operating theatre is becoming increasingly recognized as a major factor in clinical outcomes. Many tools have been developed to measure teamwork. Most fall into two categories: self-assessment by theatre staff and assessment by observers. A critical and comparative analysis of the validity and reliability of these tools is lacking. Methods MEDLINE and Embase databases were searched following PRISMA guidelines. Content validity was assessed using measurements of inter-rater agreement, predictive validity and multisite reliability, and interobserver reliability using statistical measures of inter-rater agreement and reliability. Quantitative meta-analysis was deemed unsuitable. Results Forty-eight articles were selected for final inclusion; self-assessment tools were used in 18 and observational tools in 28, and there were two qualitative studies. Self-assessment of teamwork varied with the profession of the assessor. The most robust self-assessment tool was the Safety Attitudes Questionnaire (SAQ), although this failed to demonstrate multisite reliability. The most robust observational tool was the Non-Technical Skills (NOTECHS) system, which demonstrated both test-retest reliability (P > 0.09) and interobserver reliability (Rwg = 0.96). Conclusion Self-assessment of teamwork by the theatre team was influenced by professional differences. Observational tools, when used by trained observers, circumvented this.

  2. Micro-computed tomography of false starts produced on bone by different hand-saws.

    PubMed

    Pelletti, Guido; Viel, Guido; Fais, Paolo; Viero, Alessia; Visentin, Sindi; Miotto, Diego; Montisci, Massimo; Cecchetto, Giovanni; Giraudo, Chiara

    2017-05-01

    The analysis of macro- and microscopic characteristics of saw marks on bones can provide useful information about the class of the tool utilized to produce the injury. The aim of the present study was to test micro-computed tomography (micro-CT) for the analysis of false starts experimentally produced on 32 human bone sections using 4 different hand-saws in order to verify the potential utility of micro-CT for distinguishing false starts produced by different saws and to correlate the morphology of the tool with that of the bone mark. Each sample was analysed through stereomicroscopy and micro-CT. Stereomicroscopic analysis allowed the identification of the false starts and the detection of the number of tool marks left by each saw. Micro-CT scans, through the integration of 3D renders and multiplanar reconstructions (MPR), allowed the identification of the shape of each false start correlating it to the injuring tool. Our results suggest that micro-CT could be a useful technique for assessing false starts produced by different classes of saws, providing accurate morphological profiles of the bone marks with all the advantages of high resolution 3D imaging (e.g., high accuracy, non-destructive analysis, preservation and documentation of evidence). However, further studies are necessary to integrate qualitative data with quantitative metrical analysis in order to further characterize the false start and the related injuring tool. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. A writer's guide to education scholarship: Quantitative methodologies for medical education research (part 1).

    PubMed

    Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan

    2018-01-01

    Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.

  4. Kinematics of mechanical and adhesional micromanipulation under a scanning electron microscope

    NASA Astrophysics Data System (ADS)

    Saito, Shigeki; Miyazaki, Hideki T.; Sato, Tomomasa; Takahashi, Kunio

    2002-11-01

    In this paper, the kinematics of mechanical and adhesional micromanipulation using a needle-shaped tool under a scanning electron microscope is analyzed. A mode diagram is derived to indicate the possible micro-object behavior for the specified operational conditions. Based on the diagram, a reasonable method for pick and place operation is proposed. The keys to successful analysis are to introduce adhesional and rolling-resistance factors into the kinematic system consisting of a sphere, a needle-shaped tool, and a substrate, and to consider the time dependence of these factors due to the electron-beam (EB) irradiation. Adhesional force and the lower limit of maximum rolling resistance are evaluated quantitatively in theoretical and experimental ways. This analysis shows that it is possible to control the fracture of either the tool-sphere or substrate-sphere interface of the system selectively by the tool-loading angle and that such a selective fracture of the interfaces enables reliable pick or place operation even under EB irradiation. Although the conventional micromanipulation was not repeatable because the technique was based on an empirically effective method, this analysis should provide us with a guideline to reliable micromanipulation.

  5. MaGelLAn 1.0: a software to facilitate quantitative and population genetic analysis of maternal inheritance by combination of molecular and pedigree information.

    PubMed

    Ristov, Strahil; Brajkovic, Vladimir; Cubric-Curik, Vlatka; Michieli, Ivan; Curik, Ino

    2016-09-10

    Identification of genes or even nucleotides that are responsible for quantitative and adaptive trait variation is a difficult task due to the complex interdependence between a large number of genetic and environmental factors. The polymorphism of the mitogenome is one of the factors that can contribute to quantitative trait variation. However, the effects of the mitogenome have not been comprehensively studied, since large numbers of mitogenome sequences and recorded phenotypes are required to reach adequate power of analysis. Current research in our group focuses on acquiring the necessary mitogenome sequence information and analysing its influence on the phenotype of a quantitative trait. To facilitate these tasks we have produced software for processing pedigrees that is optimised for maternal lineage analysis. We present MaGelLAn 1.0 (maternal genealogy lineage analyser), a suite of four Python scripts (modules) designed to facilitate the analysis of the impact of mitogenome polymorphism on quantitative trait variation by combining molecular and pedigree information. MaGelLAn 1.0 is primarily used to: (1) optimise the sampling strategy for molecular analyses; (2) identify and correct pedigree inconsistencies; and (3) identify maternal lineages and assign the corresponding mitogenome sequences to all individuals in the pedigree, this information being used as input to any of the standard software for quantitative genetic (association) analysis. In addition, MaGelLAn 1.0 allows computing the mitogenome (maternal) effective population sizes and the probability of mitogenome (maternal) identity, which are useful for the conservation management of small populations. MaGelLAn is the first pedigree analysis tool that focuses on quantitative genetic analyses of mitogenome data. It is conceived to significantly reduce the effort of handling and preparing large pedigrees for processing the information linked to maternal lines. The software source code, along with the manual and example files, can be downloaded at http://lissp.irb.hr/software/magellan-1-0/ and https://github.com/sristov/magellan .
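
    Since MaGelLAn is itself a set of Python scripts, the core pedigree operation it automates can be illustrated in the same language: walk dam links until a founder is reached, so that a mitogenome sequenced in any one member can be assigned to the whole maternal line. The toy pedigree below is hypothetical, and MaGelLAn's actual input format differs.

    ```python
    # Tracing maternal (mitogenome) lineages through a pedigree (toy example).
    def maternal_founder(individual, dam_of):
        """Follow mother-to-mother links until a founder (unknown dam) is reached."""
        while dam_of.get(individual) is not None:
            individual = dam_of[individual]
        return individual

    # Toy pedigree: individual -> dam (None = dam unknown, i.e. founder).
    dam_of = {"calf1": "cowA", "calf2": "cowB", "cowA": "cowC",
              "cowB": "cowC", "cowC": None}

    lineages = {ind: maternal_founder(ind, dam_of) for ind in dam_of}
    print(lineages)   # every individual maps to the founder "cowC" here
    ```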

  6. How can my research paper be useful for future meta-analyses on forest restoration practices?

    Treesearch

    Enrique Andivia; Pedro Villar‑Salvador; Juan A. Oliet; Jaime Puertolas; R. Kasten Dumroese

    2018-01-01

    Statistical meta-analysis is a powerful and useful tool to quantitatively synthesize the information conveyed in published studies on a particular topic. It allows identifying and quantifying overall patterns and exploring causes of variation. The inclusion of published works in meta-analyses requires, however, a minimum quality standard of the reported data and...

  7. Symposium II: Mechanochemistry in Materials Science, MRS Fall Meeting, Nov 30-Dec 4, 2009, Boston, MA

    DTIC Science & Technology

    2010-09-02

    Dynamic Mechanical Analysis (DMA). The fracture behavior of the mechanophore-linked polymer is also examined through the Double Cleavage Drilled ...multinary complex structures. Structural, microstructural, and chemical characterizations were explored by metrological tools to support this...simple hydrocarbons in order to quantitatively define structure-property relationships for reacting materials under shock compression. Embedded gauge

  8. A Quantitative Experimental Study of the Effectiveness of Systems to Identify Network Attackers

    ERIC Educational Resources Information Center

    Handorf, C. Russell

    2016-01-01

    This study analyzed the meta-data collected from a honeypot that was run by the Federal Bureau of Investigation for a period of 5 years. This analysis compared the use of existing industry methods and tools, such as Intrusion Detection System alerts, network traffic flow and system log traffic, within the Open Source Security Information Manager…

  9. A Primer on Value-Added Models: Towards a Better Understanding of the Quantitative Analysis of Student Achievement

    ERIC Educational Resources Information Center

    Nakamura, Yugo

    2013-01-01

    Value-added models (VAMs) have received considerable attention as a tool to transform our public education system. However, as VAMs are studied by researchers from a broad range of academic disciplines who remain divided over the best methods in analyzing the models and stakeholders without the extensive statistical background have been excluded…

  10. Model Analysis and Model Creation: Capturing the Task-Model Structure of Quantitative Item Domains. Research Report. ETS RR-06-11

    ERIC Educational Resources Information Center

    Deane, Paul; Graf, Edith Aurora; Higgins, Derrick; Futagi, Yoko; Lawless, René

    2006-01-01

    This study focuses on the relationship between item modeling and evidence-centered design (ECD); it considers how an appropriately generalized item modeling software tool can support systematic identification and exploitation of task-model variables, and then examines the feasibility of this goal, using linear-equation items as a test case. The…

  11. INFRARED SPECTROSCOPY: A TOOL FOR DETERMINATION OF THE DEGREE OF CONVERSION IN DENTAL COMPOSITES

    PubMed Central

    Moraes, Luciene Gonçalves Palmeira; Rocha, Renata Sanches Ferreira; Menegazzo, Lívia Maluf; de AraÚjo, Eudes Borges; Yukimitu, Keizo; Moraes, João Carlos Silos

    2008-01-01

    Infrared spectroscopy is one of the most widely used techniques for measuring the degree of conversion in dental composites. However, to obtain good-quality spectra and quantitative analysis from spectral data, appropriate expertise and knowledge of the technique are mandatory. This paper presents important details on the use of infrared spectroscopy for determining the degree of conversion. PMID:19089207
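
    A typical formulation, given here as background rather than this paper's exact protocol: the degree of conversion (DC) is computed from the ratio of the aliphatic C=C absorbance band (near 1638 cm⁻¹) to an internal reference band (the aromatic ring band near 1608 cm⁻¹ in Bis-GMA-based composites), measured before and after curing. The band assignments are common literature values and should be treated as an assumption:

    ```latex
    \[
      \mathrm{DC}\,(\%) = \left( 1 -
        \frac{\left( A_{1638}/A_{1608} \right)_{\mathrm{cured}}}
             {\left( A_{1638}/A_{1608} \right)_{\mathrm{uncured}}}
      \right) \times 100
    \]
    ```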

  12. Harry S. Truman Dam and Reservoir, Missouri, Holocene Adaptations Within the Lower Pomme de Terre River Valley, Missouri. Volume 2.

    DTIC Science & Technology

    1982-06-01

    B. 1966 Quantitative analysis of Upper Paleolithic stone tools. American Antiquity 68(2-2): 356-394. Scully, E. G. 1951 Some central Mississippi...Programming for the Social Sciences. Holt, Rinehart and Winston, New York. White, A. M. 1973 Le Malpas Rockshelter: a study of late Paleolithic technology in

  13. Dual-isotope Cryo-imaging Quantitative Autoradiography (CIQA): Investigating Antibody-Drug Conjugate Distribution and Payload Delivery Through Imaging.

    PubMed

    Ilovich, Ohad; Qutaish, Mohammed; Hesterman, Jacob; Orcutt, Kelly; Hoppin, Jack; Polyak, Ildiko; Seaman, Marc; Abu-Yousif, Adnan; Cvet, Donna; Bradley, Daniel

    2018-05-04

    In vitro properties of antibody-drug conjugates (ADCs), such as binding, internalization, and cytotoxicity, are often well characterized prior to in vivo studies. Interpretation of in vivo studies could be significantly enhanced by molecular imaging tools. We present here a dual-isotope cryo-imaging quantitative autoradiography (CIQA) methodology, combined with advanced 3D imaging and analysis, allowing for the simultaneous study of both antibody and payload distribution in tissues of interest in a preclinical setting. Methods: TAK-264, an investigational anti-guanylyl cyclase C (GCC) targeting ADC, was synthesized utilizing tritiated monomethyl auristatin E (MMAE). The tritiated ADC was then conjugated to DTPA, labeled with indium-111 and evaluated in vivo in GCC-positive and GCC-negative tumor-bearing animals. Results: CIQA reveals the time course of drug release from the ADC and its distribution into tumor regions that are less accessible to the antibody. For GCC-positive tumors, a representative section obtained 96 hours post tracer injection showed only 0.8% of voxels with co-localized signal, versus over 15% of voxels for a GCC-negative tumor section, suggesting successful and specific cleaving of the toxin in the antigen-positive lesions. Conclusion: The combination of an established autoradiography technology with advanced image analysis methodologies affords an experimental tool that can support detailed characterization of ADC tumor penetration and pharmacokinetics. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  14. Remote operated vehicle with CO{sub 2} blasting (ROVCO{sub 2}): Volume 1. Final report, September 1993--July 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-06-01

    This report documents the second phase of the Remote Operated Vehicle with CO{sub 2} Blasting (ROVCO{sub 2}) Program. The ROVCO{sub 2} Program's goal is to develop and demonstrate a tool to improve the productivity of concrete floor decontamination. The second phase integrated non-developmental subsystems onto the ROVCO{sub 2} system and performed quantitative decontamination effectiveness, productivity, and reliability testing. The report documents these development activities and the analysis of cost and performance. The results show that the ROVCO{sub 2} system is an efficient decontamination tool.

  15. Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)

    PubMed Central

    Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; M. Belitsky, Jason; Umbanhowar, Charles; Overvoorde, Paul J.

    2017-01-01

    Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative skills of undergraduate students within a biological context. The instrument was developed by an interdisciplinary team of educators and aligns with skills included in national reports such as BIO2010, Scientific Foundations for Future Physicians, and Vision and Change. Undergraduate biology educators also confirmed the importance of items included in the instrument. The current version of the BioSQuaRE was developed through an iterative process using data from students at 12 postsecondary institutions. A psychometric analysis of these data provides multiple lines of evidence for the validity of inferences made using the instrument. Our results suggest that the BioSQuaRE will prove useful to faculty and departments interested in helping students acquire the quantitative competencies they need to successfully pursue biology, and useful to biology students by communicating the importance of quantitative skills. We invite educators to use the BioSQuaRE at their own institutions. PMID:29196427

  16. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    NASA Astrophysics Data System (ADS)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
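
    The event-tree arithmetic underlying such an analysis is simple: the probability of each outcome path is the product of the conditional probabilities along its branches, and the resulting risk estimates are binned into linguistic classes. The sketch below illustrates this with invented branch probabilities and cutoff values; neither reflects the actual numbers of the Arenal study.

    ```python
    # Minimal event-tree sketch; all probabilities and class cutoffs are
    # illustrative placeholders, not data from the Arenal analysis.
    from functools import reduce

    def path_probability(branches):
        """Product of conditional probabilities along one event-tree path."""
        return reduce(lambda a, b: a * b, branches, 1.0)

    def risk_class(p):
        """Map a raw annual probability onto a linguistic scale
        (cutoffs are hypothetical stand-ins for a published scale)."""
        if p >= 1e-3: return "VERY HIGH"
        if p >= 1e-4: return "HIGH"
        if p >= 1e-5: return "MODERATE"
        if p >= 1e-6: return "LOW"
        return "VERY LOW"

    # P(collapse) * P(ATPF | collapse) * P(flow reaches zone) * P(death | exposed)
    p = path_probability([0.02, 0.5, 0.1, 0.3])
    print(p, risk_class(p))  # 0.0003 -> HIGH
    ```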

  17. Combination of nano-material enrichment and dead-end filtration for uniform and rapid sample preparation in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Wu, Zengnan; Khan, Mashooq; Mao, Sifeng; Lin, Ling; Lin, Jin-Ming

    2018-05-01

    Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a fast analysis tool for the detection of a wide range of analytes. However, the heterogeneous distribution of matrix/analyte cocrystals, variation in signal intensity, and poor experimental reproducibility at different locations of the same spot make quantitative analysis difficult. In this work, carbon nanotubes (CNTs) were employed as an adsorbent for both analyte and matrix on a conductive porous membrane serving as a novel mass target plate. Sample pretreatment was achieved by enrichment and dead-end filtration, followed by drying via solid-liquid separation. This approach enables homogeneous distribution of the analyte in the matrix, good shot-to-shot reproducibility in signals, and quantitative detection of peptide and protein at different concentrations, with correlation coefficients (R²) of 0.9920 and 0.9909, respectively. The simple and rapid sample preparation, uniform distribution of analyte, easy quantitative detection, and high reproducibility make this technique useful and may diversify the application of MALDI-MS for quantitative detection of a variety of proteins. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    PubMed Central

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through a RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615
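
    Because the results are exposed as a light-weight relational database, user-driven queries can be run with standard tooling. The snippet below is a minimal sketch assuming a hypothetical SQLite file and hypothetical table/column names; the actual PIQMIe schema may differ.

    ```python
    import sqlite3

    # Hypothetical schema for illustration only; consult the PIQMIe
    # documentation for the real table and column names.
    con = sqlite3.connect("piqmie_results.sqlite")
    cur = con.execute(
        """
        SELECT prot_acc, ratio
        FROM protein_quant
        WHERE ABS(ratio) > ?
        ORDER BY ABS(ratio) DESC
        """,
        (1.5,),  # e.g. keep proteins with |log2 ratio| above a cutoff
    )
    for accession, log2_ratio in cur.fetchall():
        print(accession, log2_ratio)
    con.close()
    ```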

  19. Approaching human language with complex networks

    NASA Astrophysics Data System (ADS)

    Cong, Jin; Liu, Haitao

    2014-12-01

    The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
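
    To make the network approach concrete, the sketch below builds a toy word co-occurrence network and computes the kind of global topological measures used in linguistic typology (average degree, clustering, path length). The two-sentence corpus is purely illustrative.

    ```python
    import networkx as nx

    # Toy word co-occurrence network: adjacent words in a sentence are linked.
    sentences = [
        "the cat sat on the mat".split(),
        "the dog sat on the rug".split(),
    ]
    G = nx.Graph()
    for sent in sentences:
        G.add_edges_from(zip(sent, sent[1:]))

    # Global measures commonly reported for linguistic networks:
    print("average degree:", 2 * G.number_of_edges() / G.number_of_nodes())
    print("clustering coefficient:", nx.average_clustering(G))
    print("average shortest path:", nx.average_shortest_path_length(G))
    ```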

  20. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    PubMed

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through a RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Detection and Characterization of Boundary-Layer Transition in Flight at Supersonic Conditions Using Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a powerful tool for investigating fluid mechanics on flight vehicles; it can be used to visualize and characterize transition, shock impingement, separation, and related phenomena. An updated onboard F-15-based system was used to visualize boundary-layer transition on a supersonic test article with Tollmien-Schlichting- and cross-flow-dominated flow fields. Digital recording improves image quality and analysis capability: it allows accurate quantitative (temperature) measurements, and greater enhancement through image processing permits analysis of smaller-scale phenomena.

  2. Development and single-laboratory validation of a UHPLC-MS/MS method for quantitation of microcystins and nodularin in natural water, cyanobacteria, shellfish and algal supplement tablet powders.

    PubMed

    Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda

    2018-02-01

    A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. Microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid, sub-5 min chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as water and cyanobacterial bloom samples. Validation studies were undertaken on each matrix-analyte combination, assessing the full set of method performance characteristics following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity in terms of limits of detection, quantitation and reporting was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early-warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt. The new method, now accredited to the ISO17025 standard, is simple, quick, applicable to multiple matrices and is highly suitable for use as a routine, high-throughput, fast-turnaround regulatory monitoring tool. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. 2L-PCA: a two-level principal component analyzer for quantitative drug design and its applications.

    PubMed

    Du, Qi-Shi; Wang, Shu-Qing; Xie, Neng-Zhong; Wang, Qing-Yan; Huang, Ri-Bo; Chou, Kuo-Chen

    2017-09-19

    A two-level principal component predictor (2L-PCA) was proposed based on the principal component analysis (PCA) approach. It can be used to quantitatively analyze various compounds and peptides with respect to their functions or their potential to become useful drugs. One level deals with the physicochemical properties of drug molecules, while the other deals with their structural fragments. The predictor has self-learning and feedback features to automatically improve its accuracy. It is anticipated that 2L-PCA will become a very useful tool for providing timely and useful clues during the process of drug development.
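
    A minimal sketch of a two-block scheme in the spirit of 2L-PCA is shown below: PCA is run separately on a physicochemical-property block and a structural-fragment block, and the two score sets are concatenated as features for a downstream predictor. This illustrates the general idea only, not the authors' exact algorithm, and all data are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_compounds = 100
    X_props = rng.normal(size=(n_compounds, 20))                        # physicochemical descriptors
    X_frags = rng.integers(0, 2, size=(n_compounds, 50)).astype(float)  # fragment fingerprints

    # Level 1: compress the property block; level 2: compress the fragment block.
    level1_scores = PCA(n_components=5).fit_transform(X_props)
    level2_scores = PCA(n_components=10).fit_transform(X_frags)

    # Concatenated scores feed a regressor/classifier of the user's choice.
    features = np.hstack([level1_scores, level2_scores])
    print(features.shape)  # (100, 15)
    ```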

  4. Translational PK/PD of Anti-Infective Therapeutics

    PubMed Central

    Rathi, Chetan; Lee, Richard E.; Meibohm, Bernd

    2016-01-01

    Translational PK/PD modeling has emerged as a critical technique for quantitative analysis of the relationship between dose, exposure and response of antibiotics. By combining model components for pharmacokinetics, bacterial growth kinetics and concentration-dependent drug effects, these models are able to quantitatively capture and simulate the complex interplay between antibiotic, bacterium and host organism. Fine-tuning of these basic model structures makes it possible to account for complicating factors such as resistance development, combination therapy, or host responses. With this tool set at hand, mechanism-based PK/PD modeling and simulation allow the development of optimal dosing regimens for novel and established antibiotics, maximizing efficacy while minimizing resistance development. PMID:27978987
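
    A minimal sketch of such a mechanism-based model is shown below: one-compartment pharmacokinetics drives an Emax kill term acting on logistic bacterial growth. All parameter values are illustrative placeholders, not values from the review.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    ke, V, dose = 0.3, 10.0, 500.0           # elimination 1/h, volume L, IV bolus mg
    kg, kmax, EC50, Nmax = 0.8, 1.5, 2.0, 1e9  # growth, max kill, potency, capacity

    def conc(t):
        """One-compartment PK: plasma concentration (mg/L) after an IV bolus."""
        return (dose / V) * np.exp(-ke * t)

    def dN_dt(t, N):
        C = conc(t)
        growth = kg * N * (1 - N / Nmax)     # logistic bacterial growth
        kill = kmax * C / (C + EC50) * N     # concentration-dependent Emax kill
        return growth - kill

    sol = solve_ivp(dN_dt, (0, 24), [1e6], t_eval=np.linspace(0, 24, 25))
    print(sol.y[0][-1])                      # bacterial burden after 24 h
    ```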

  5. Two-dimensional grayscale ultrasound and spectral Doppler waveform evaluation of dogs with chronic enteropathies.

    PubMed

    Gaschen, Lorrie; Kircher, Patrick

    2007-08-01

    Sonography is an important diagnostic tool for examining the gastrointestinal tract of dogs with chronic diarrhea. Two-dimensional grayscale ultrasound parameters used to assess various enteropathies focus primarily on wall thickness and layering. Mild, generalized thickening of the intestinal wall with maintenance of the wall layering is common in inflammatory bowel disease. Quantitative and semi-quantitative spectral Doppler arterial waveform analysis can be utilized for various enteropathies, including inflammatory bowel disease and food allergies. Dogs with inflammatory bowel disease have inadequate hemodynamic responses during digestion of food. Dogs with food allergies have prolonged vasodilation and lower resistive and pulsatility indices after eating allergen-inducing foods.

  6. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…

  7. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    ERIC Educational Resources Information Center

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  8. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  9. Combining qualitative and quantitative methods to analyze serious games outcomes: A pilot study for a new cognitive screening tool.

    PubMed

    Vallejo, Vanessa; Mitache, Andrei V; Tarnanas, Ioannis; Muri, Rene; Mosimann, Urs P; Nef, Tobias

    2015-08-01

    Computer games for a serious purpose, so-called serious games, can provide additional information for the screening and diagnosis of cognitive impairment. Moreover, they have the advantage of being an ecological tool, as they involve tasks of daily living. However, there is a need for better comprehensive designs regarding the acceptance of this technology, as the target population is older adults who are not used to interacting with novel technologies. Moreover, given the complexity of the diagnosis and the need for precise assessment, an evaluation of the best approach to analyzing the performance data is required. The present study examines the usability of a new screening tool and proposes several new outlines for data analysis.

  10. Environmental impact assessment for alternative-energy power plants in México.

    PubMed

    González-Avila, María E; Beltrán-Morales, Luis Felipe; Braker, Elizabeth; Ortega-Rubio, Alfredo

    2006-07-01

    Ten Environmental Impact Assessment Reports (EIAR) were reviewed for projects involving alternative power plants in Mexico developed during the last twelve years. Our analysis focused on the methods used to assess the impacts produced by hydroelectric and geothermal power projects. The methods used to assess impacts in the EIARs ranged from simple descriptive criteria to quantitative models. These methods are not concordant with the level of the EIAR required by the environmental authority, or even with the kind of project developed. It is concluded that there is no correlation between the tools used to assess impacts and the assigned type of the EIAR. Because the methods used to assess the impacts produced by these power projects have not changed over the twelve-year period reviewed, we propose a quantitative method, based on ecological criteria and tools, to assess the impacts produced by hydroelectric and geothermal plants according to the specific characteristics of the project. The proposed method is supported by environmental norms, and can assist environmental authorities in assigning the correct level and tools to be applied to hydroelectric and geothermal projects. The proposed method can be adapted to other production activities in Mexico and to other countries.

  11. GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.

    PubMed

    Subhash, Santhilal; Kanduri, Chandrasekhar

    2016-09-13

    High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate huge volumes of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries in the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we designed a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information from more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need to install additional dependencies. GeneSCF is more reliable than other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is an easy-to-integrate tool with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for the users. Since the tool is designed to be ready-to-use, there is no need for any complex compilation and installation procedures.
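
    For context, most functional enrichment tools of this kind rest on a hypergeometric (or equivalent Fisher exact) test of the overlap between a user's gene list and the genes annotated to a functional term. The sketch below shows that generic test, not GeneSCF's internal implementation; the counts are invented.

    ```python
    from scipy.stats import hypergeom

    # M: genes in the annotation universe, n: genes annotated to one term,
    # N: genes in the user's list, k: overlap between list and term.
    M, n, N, k = 20000, 150, 300, 12

    # P(X >= k): chance of seeing at least k annotated genes in the list.
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(f"enrichment p-value: {p_value:.3e}")
    ```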

  12. New Methods for Analysis of Spatial Distribution and Coaggregation of Microbial Populations in Complex Biofilms

    PubMed Central

    Almstrand, Robert; Daims, Holger; Persson, Frank; Sörensson, Fred

    2013-01-01

    In biofilms, microbial activities form gradients of substrates and electron acceptors, creating a complex landscape of microhabitats, often resulting in structured localization of the microbial populations present. To understand the dynamic interplay between and within these populations, quantitative measurements and statistical analysis of their localization patterns within the biofilms are necessary, and adequate automated tools for such analyses are needed. We have designed and applied new methods for fluorescence in situ hybridization (FISH) and digital image analysis of directionally dependent (anisotropic) multispecies biofilms. A sequential-FISH approach allowed multiple populations to be detected in a biofilm sample. This was combined with an automated tool for vertical-distribution analysis by generating in silico biofilm slices and the recently developed Inflate algorithm for coaggregation analysis of microbial populations in anisotropic biofilms. As a proof of principle, we show distinct stratification patterns of the ammonia oxidizers Nitrosomonas oligotropha subclusters I and II and the nitrite oxidizer Nitrospira sublineage I in three different types of wastewater biofilms, suggesting niche differentiation between the N. oligotropha subclusters, which could explain their coexistence in the same biofilms. Coaggregation analysis showed that N. oligotropha subcluster II aggregated closer to Nitrospira than did N. oligotropha subcluster I in a pilot plant nitrifying trickling filter (NTF) and a moving-bed biofilm reactor (MBBR), but not in a full-scale NTF, indicating important ecophysiological differences between these phylogenetically closely related subclusters. By using high-resolution quantitative methods applicable to any multispecies biofilm in general, the ecological interactions of these complex ecosystems can be understood in more detail. PMID:23892743
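
    A vertical-distribution analysis of the sort described can be sketched as follows: a segmented FISH image is split into horizontal in silico slices, and each population's share of the biomass is computed per slice. The boolean masks below are synthetic stand-ins for segmented micrographs, and the code is a simplified illustration, not the published daime/Inflate implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    height, width, n_slices = 300, 400, 10
    nitrosomonas = rng.random((height, width)) > 0.7  # synthetic biomass mask
    nitrospira = rng.random((height, width)) > 0.8    # synthetic biomass mask

    # Split both masks into horizontal slices along the biofilm depth axis
    # and report each population's fraction of the biomass per slice.
    for i, (a, b) in enumerate(zip(np.array_split(nitrosomonas, n_slices),
                                   np.array_split(nitrospira, n_slices))):
        total = a.sum() + b.sum()
        print(f"slice {i}: Nitrosomonas {a.sum()/total:.2f}, "
              f"Nitrospira {b.sum()/total:.2f}")
    ```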

  13. Patient-Centered Communication and Health Assessment with Youth

    PubMed Central

    Munro, Michelle L.; Darling-Fisher, Cynthia S.; Ronis, David L.; Villarruel, Antonia M.; Pardee, Michelle; Faleer, Hannah; Fava, Nicole M.

    2014-01-01

    Background Patient-centered communication is the hallmark of care that incorporates the perspective of patients to provide tailored care that meets their needs and desires. However, at this time there has been limited evaluation of patient-provider communication involving youth. Objectives This manuscript reports results from a secondary analysis of data obtained during a participatory research-based randomized controlled trial designed to test a sexual risk event history calendar intervention with youth, addressing the following research questions: (a) Based on the event history calendar's (EHC) inclusion of contextual factors, does the EHC demonstrate improved communication outcomes (i.e., amount, satisfaction, mutuality, client involvement, client satisfaction, patient-provider interaction, and patient-centeredness) when compared to the Guidelines for Adolescent Preventive Services (GAPS) tool? and (b) How do patients and providers describe the characteristics of each tool in regards to patient-centered communication? Method This report utilizes a sequential explanatory mixed methods approach to evaluate communication. A split-plot design with one between factor (i.e., communication structure, EHC versus GAPS) and one within factor (i.e., time, pretest versus posttest) was used for analyses of data collected from male and female youth (n=186) and providers (n=9). Quantitative analysis of survey data evaluated changes in communication from pre-test to post-test. Qualitative data collected from open-ended questions, audio-taped visits, and exit interviews were employed to enhance interpretation of quantitative findings. Results Patient-centered communication using assessment tools (EHC and GAPS) with youth demonstrated improved communication outcomes both quantitatively and qualitatively. Additional analyses with subgroups of males and Arab-Americans demonstrated better post-intervention scores among the EHC group in certain aspects of communication. Qualitative results revealed that the EHC demonstrated improved outcomes in the four components of patient-centered communication: validation of the patient's perspective; viewing the patient within context; reaching a shared understanding of needs and preferences; and helping the patient share power in the healthcare interaction. Discussion Though both tools provided a framework from which to conduct a clinical visit, the integrated time-linked assessment captured by the EHC enhanced patient-centered communication in select groups compared to GAPS. PMID:24165214

  14. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    NASA Astrophysics Data System (ADS)

    Esposito, Alessandro

    2006-05-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson's and Alzheimer's disease, respectively. The high inter-disciplinarity of this project required the merging of the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was conducted also with the support and the collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson's Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.

  15. A Unique Digital Electrocardiographic Repository for the Development of Quantitative Electrocardiography and Cardiac Safety: The Telemetric and Holter ECG Warehouse (THEW)

    PubMed Central

    Couderc, Jean-Philippe

    2010-01-01

    The sharing of scientific data reinforces open scientific inquiry; it encourages diversity of analysis and opinion while promoting new research and facilitating the education of next generations of scientists. In this article, we present an initiative for the development of a repository containing continuous electrocardiographic recordings and their associated clinical information. This information is shared with the worldwide scientific community in order to improve quantitative electrocardiology and cardiac safety. First, we present the objectives of the initiative and its mission. Then, we describe the resources available in this initiative along three components: data, expertise and tools. The data available in the Telemetric and Holter ECG Warehouse (THEW) include continuous ECG signals and associated clinical information. The initiative attracted various academic and private partners whose expertise covers a large list of research arenas related to quantitative electrocardiography; their contribution to the THEW promotes cross-fertilization of scientific knowledge, resources, and ideas that will advance the field of quantitative electrocardiography. Finally, the tools of the THEW include software and servers to access and review the data available in the repository. To conclude, the THEW is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. It is a new repository designed to complement existing ones such as Physionet, the AHA-BIH Arrhythmia Database, and the CSE database. The THEW hosts unique datasets from clinical trials and drug safety studies that, so far, were not available to the worldwide scientific community. PMID:20863512

  16. SNPassoc: an R package to perform whole genome association studies.

    PubMed

    González, Juan R; Armengol, Lluís; Solé, Xavier; Guinó, Elisabet; Mercader, Josep M; Estivill, Xavier; Moreno, Víctor

    2007-03-01

    The popularization of large-scale genotyping projects has led to the widespread adoption of genetic association studies as the tool of choice in the search for single nucleotide polymorphisms (SNPs) underlying susceptibility to complex diseases. Although the analysis of individual SNPs is a relatively trivial task, when the number of SNPs is large and multiple genetic models need to be explored, a tool to automate the analyses becomes necessary. In order to address this issue, we developed SNPassoc, an R package to carry out the most common analyses in whole genome association studies. These analyses include descriptive statistics and exploratory analysis of missing values, calculation of Hardy-Weinberg equilibrium, analysis of association based on generalized linear models (either for quantitative or binary traits), and analysis of multiple SNPs (haplotype and epistasis analysis). Package SNPassoc is available at CRAN from http://cran.r-project.org. A tutorial is available on Bioinformatics online and at http://davinci.crg.es/estivill_lab/snpassoc.

  17. Visual Aggregate Analysis of Eligibility Features of Clinical Trials

    PubMed Central

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-01-01

    Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940

  18. Visual aggregate analysis of eligibility features of clinical trials.

    PubMed

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-04-01

    To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions "hypertension" and "Type 2 diabetes", respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selected or multiple reaction monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. PMID:23702368

  20. Time-varying surface electromyography topography as a prognostic tool for chronic low back pain rehabilitation.

    PubMed

    Hu, Yong; Kwok, Jerry Weilun; Tse, Jessica Yuk-Hang; Luk, Keith Dip-Kei

    2014-06-01

    Nonsurgical rehabilitation therapy is a commonly used strategy to treat chronic low back pain (LBP). The selection of the most appropriate therapeutic options is still a big challenge in clinical practice. Surface electromyography (sEMG) topography has been proposed as an objective assessment of LBP rehabilitation. The quantitative analysis of dynamic sEMG would provide an objective prognostic tool for LBP rehabilitation. To evaluate the prognostic value of quantitative sEMG topographic analysis and to verify the performance of the proposed time-varying topographic parameters for identifying the patients who respond better to the rehabilitation program. A retrospective study of consecutive patients. Thirty-eight patients with chronic nonspecific LBP and 43 healthy subjects. The accuracy of the time-varying quantitative sEMG topographic analysis for monitoring LBP rehabilitation progress was determined by calculating the corresponding receiver-operating characteristic (ROC) curves. The physiologic measure was sEMG recorded during lumbar flexion and extension. Patients who suffered from chronic nonspecific LBP, without a history of back surgery or any medical conditions causing acute exacerbation of LBP during the clinical test, were enlisted to perform the clinical test during the 12-week physiotherapy (PT) treatment. LBP patients were classified into two groups, "responding" and "nonresponding", based on the clinical assessment. The responding group refers to the LBP patients who began to recover after the PT treatment, whereas the nonresponding group refers to those LBP patients who did not recover or got worse after the treatment. The results of the time-varying analysis in the responding group were compared with those in the nonresponding group. In addition, the accuracy of the analysis was analyzed through ROC curves. The time-varying analysis showed discrepancies in the root-mean-square difference (RMSD) parameters between the responding and nonresponding groups. The relative area (RA) and relative width (RW) of RMSD at flexion and extension in the responding group were significantly lower than those in the nonresponding group (p<.05). The areas under the ROC curves of RA and RW of RMSD at flexion and extension were greater than 0.7 and statistically significant. The quantitative time-varying analysis of sEMG topography showed significant differences between the healthy and LBP groups. The discrepancies in quantitative dynamic sEMG topography between the LBP group and the normal group, in terms of RA and RW of RMSD at flexion and extension, were able to identify those LBP subjects who would respond to a conservative rehabilitation program focused on functional restoration of the lumbar muscles. Copyright © 2014 Elsevier Inc. All rights reserved.
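
    The ROC analysis used to judge such a prognostic parameter can be sketched as below, where a synthetic RMSD-derived measure (standing in for RA or RW) is tested for its ability to separate responding from nonresponding patients; all values are invented for illustration.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(3)
    responding = rng.normal(0.20, 0.05, 20)       # lower RA-like values in responders
    nonresponding = rng.normal(0.35, 0.08, 18)

    values = np.concatenate([responding, nonresponding])
    labels = np.concatenate([np.zeros(20), np.ones(18)])  # 1 = nonresponding

    auc = roc_auc_score(labels, values)           # higher value predicts nonresponse
    fpr, tpr, thresholds = roc_curve(labels, values)
    print(f"AUC = {auc:.2f}")                     # > 0.7 would mirror the reported result
    ```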

  1. PHOXTRACK-a tool for interpreting comprehensive datasets of post-translational modifications of proteins.

    PubMed

    Weidner, Christopher; Fischer, Cornelius; Sauer, Sascha

    2014-12-01

    We introduce PHOXTRACK (PHOsphosite-X-TRacing Analysis of Causal Kinases), a user-friendly, freely available software tool for analyzing large datasets of post-translational modifications of proteins, such as phosphorylation, which are commonly obtained by mass spectrometry. In contrast to other currently applied data analysis approaches, PHOXTRACK uses full sets of quantitative proteomics data and applies non-parametric statistics to calculate whether defined kinase-specific sets of phosphosite sequences indicate statistically significant concordant differences between various biological conditions. PHOXTRACK is an efficient tool for extracting post-translational information from comprehensive proteomics datasets to decipher key regulatory proteins and to infer biologically relevant molecular pathways. PHOXTRACK will be maintained over the next years and is freely available as an online tool for non-commercial use at http://phoxtrack.molgen.mpg.de. Users will also find a tutorial at this Web site and can additionally give feedback at https://groups.google.com/d/forum/phoxtrack-discuss. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
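
    A minimal sketch in the spirit of PHOXTRACK's non-parametric approach (not its exact statistic) is to rank-test whether the phosphosites attributed to one kinase shift in abundance relative to all other quantified sites; the fold-change data below are synthetic.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(4)
    all_sites_log2fc = rng.normal(0.0, 1.0, 5000)   # background phosphosites
    kinase_set_log2fc = rng.normal(0.8, 1.0, 40)    # sites matching one kinase's motif

    # Rank-sum test: does the kinase set shift concordantly vs. background?
    stat, p = mannwhitneyu(kinase_set_log2fc, all_sites_log2fc,
                           alternative="two-sided")
    print(f"rank-sum p-value: {p:.2e}")
    ```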

  2. Emerging Tools to Estimate and to Predict Exposures to ...

    EPA Pesticide Factsheets

    The timely assessment of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge facing EPA in its mission to protect public health and the environment. The US EPA has been conducting research to enhance methods used to estimate and forecast exposures for tens of thousands of chemicals. This research is aimed at both assessing risks and supporting life cycle analysis, by developing new models and tools for high throughput exposure screening and prioritization, as well as databases that support these and other tools, especially regarding consumer products. The models and data address usage, and take advantage of quantitative structure-activity relationships (QSARs) for both inherent chemical properties and function (why the chemical is a product ingredient). To make them more useful and widely available, the new tools, data and models are designed to be: • Flexible • Interoperable • Modular (useful to more than one, stand-alone application) • Open (publicly available software) Presented at the Society for Risk Analysis Forum: Risk Governance for Key Enabling Technologies, Venice, Italy, March 1-3, 2017

  3. Influence of export control policy on the competitiveness of machine tool producing organizations

    NASA Astrophysics Data System (ADS)

    Ahrstrom, Jeffrey D.

    The possible influence of export control policies on producers of export-controlled machine tools is examined in this quantitative study. International market competitiveness theories hold that market-controlling policies such as export control regulations may influence an organization's ability to compete (Burris, 2010). Differences in the domestic application of export control policy on machine tool exports may impose throttling effects on the competitiveness of participating firms (Freedenberg, 2010). Commodity shipments from Japan, Germany, and the United States to the Russian market will be examined using descriptive statistics; gravity modeling of these specific markets provides a foundation for comparison to actual shipment data; and industry participant responses to a user-developed survey will provide additional data for analysis using a Kruskal-Wallis one-way analysis of variance. Academic research data on the effects of export controls within the machine tool industry are scarce. Research results may be of interest to industry leadership in market participation decisions, advocacy arguments, and strategic planning. Industry advocates and export policy decision makers could find the data of interest in supporting positions for or against modifications of export control policies.

  4. Practical applications of the bioinformatics toolbox for narrowing quantitative trait loci.

    PubMed

    Burgess-Herbert, Sarah L; Cox, Allison; Tsaih, Shirng-Wern; Paigen, Beverly

    2008-12-01

    Dissecting the genes involved in complex traits can be confounded by multiple factors, including extensive epistatic interactions among genes, the involvement of epigenetic regulators, and the variable expressivity of traits. Although quantitative trait locus (QTL) analysis has been a powerful tool for localizing the chromosomal regions underlying complex traits, systematically identifying the causal genes remains challenging. Here, through its application to plasma levels of high-density lipoprotein cholesterol (HDL) in mice, we demonstrate a strategy for narrowing QTL that utilizes comparative genomics and bioinformatics techniques. We show how QTL detected in multiple crosses are subjected to both combined cross analysis and haplotype block analysis; how QTL from one species are mapped to the concordant regions in another species; and how genomewide scans associating haplotype groups with their phenotypes can be used to prioritize the narrowed regions. Then we illustrate how these individual methods for narrowing QTL can be systematically integrated for mouse chromosomes 12 and 15, resulting in a significantly reduced number of candidate genes, often from hundreds to <10. Finally, we give an example of how additional bioinformatics resources can be combined with experiments to determine the most likely quantitative trait genes.

  5. Tropical Pacific moisture variability: Its detection, synoptic structure and consequences in the general circulation

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1990-01-01

    Satellite data analysis tools are developed and implemented for the diagnosis of atmospheric circulation systems over the tropical Pacific Ocean. The tools include statistical multi-variate procedures, a multi-spectral radiative transfer model, and the global spectral forecast model at NMC. Data include in-situ observations; satellite observations from VAS (moisture, infrared and visible), NOAA polar orbiters (including TIROS Operational Vertical Sounder (TOVS) multi-channel sounding data and OLR grids) and the scanning multichannel microwave radiometer (SMMR); and European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. A primary goal is a better understanding of the relation between synoptic structures of the area, particularly tropical plumes, and the general circulation, especially the Hadley circulation. A second goal is the definition of the quantitative structure and behavior of all Pacific tropical synoptic systems. Finally, strategies are examined for extracting new and additional information from existing satellite observations. Although moisture structure is emphasized, thermal patterns are also analyzed. Both horizontal and vertical structures are studied and objective quantitative results are emphasized.

  6. Infrared spectroscopy as a tool to characterise starch ordered structure--a joint FTIR-ATR, NMR, XRD and DSC study.

    PubMed

    Warren, Frederick J; Gidley, Michael J; Flanagan, Bernadine M

    2016-03-30

    Starch has a heterogeneous, semi-crystalline granular structure, and the degree of ordered structure can affect its behaviour in foods and bioplastics. A range of methodologies are employed to study starch structure: differential scanning calorimetry, (13)C nuclear magnetic resonance, X-ray diffraction and Fourier transform infrared spectroscopy (FTIR). Despite the appeal of FTIR as a rapid, non-destructive methodology, there is currently no systematically defined quantitative relationship between FTIR spectral features and other starch structural measures. Here, we subject 61 starch samples to structural analysis and systematically correlate FTIR spectra with other measures of starch structure. A hydration-dependent peak-position shift in the FTIR spectra of starch is observed, resulting from increased molecular order, but with complex, non-linear behaviour. We demonstrate that FTIR is a tool that can quantitatively probe short-range interactions in starch structure. However, the assumption of linear relationships between starch ordered structure and peak ratios is overly simplistic. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2008-01-01

    An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented into the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the numerical model and include measured parameters, such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  8. Harnessing cell-to-cell variations to probe bacterial structure and biophysics

    NASA Astrophysics Data System (ADS)

    Cass, Julie A.

    Advances in microscopy and biotechnology have given us novel insights into cellular biology and physics. While bacteria were long considered to be relatively unstructured, the development of fluorescence microscopy techniques, and spatially and temporally resolved high-throughput quantitative studies, have uncovered that the bacterial cell is highly organized, and its structure rigorously maintained. In this thesis I will describe our gateTool software, designed to harness cell-to-cell variations to probe bacterial structure, and discuss two exciting aspects of structure that we have employed gateTool to investigate: (i) chromosome organization and the cellular mechanisms for controlling DNA dynamics, and (ii) the study of cell wall synthesis, and how the genes in the synthesis pathway impact cellular shape. In the first project, we develop a spatial and temporal mapping of cell-cycle-dependent chromosomal organization, and use this quantitative map to discover that chromosomal loci segregate from midcell with universal dynamics. In the second project, I describe preliminary time-lapse and snapshot imaging analysis suggesting phenotypic coherence across peptidoglycan synthesis pathways.

  9. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

    As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high level architecture decisions. Once these key components have been identified, two main approaches to improving a system using these components exist: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two approaches, in varying orders of magnitude for each component. Therefore, this research tries to answer the question of how to divide funds between adding redundancy and improving the reliability of components so as to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
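
    The core trade studied here can be illustrated with elementary reliability arithmetic: for n identical components in parallel, R = 1 - (1 - r)^n, so a fixed budget can be spent either on an extra redundant unit or on a more reliable single unit. The costs and reliabilities below are illustrative only, not values from the thesis.

    ```python
    def parallel_reliability(r: float, n: int) -> float:
        """Reliability of n identical components in parallel: 1 - (1 - r)^n."""
        return 1.0 - (1.0 - r) ** n

    r0 = 0.90
    # Option A: add one redundant unit at the baseline reliability.
    option_a = parallel_reliability(r0, 2)
    # Option B: spend the same (hypothetical) budget improving one unit to 0.97.
    option_b = parallel_reliability(0.97, 1)
    print(f"redundancy: {option_a:.4f}, improved component: {option_b:.4f}")
    # A full analysis would weigh such choices per component across the
    # architecture, e.g. with simulated annealing or a genetic algorithm.
    ```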

  10. Systematic review and meta-analysis: tools for the information age.

    PubMed

    Weatherall, Mark

    2017-11-01

    The amount of available biomedical information is vast and growing. Natural limitations in the way clinicians and researchers approach this treasure trove of information include difficulty locating the information and, once it is located, cognitive biases that may lead to its inappropriate use. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews take a census approach to identifying literature in order to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, is likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
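
    The arithmetical core of a fixed-effect meta-analysis is the inverse-variance weighted average: the pooled effect is sum(w_i * y_i) / sum(w_i) with weights w_i = 1/se_i^2. The sketch below applies this to three invented study results.

    ```python
    import numpy as np

    # Per-study effect estimates (e.g. log odds ratios) and standard errors;
    # the three studies are invented for illustration.
    effects = np.array([0.30, 0.10, 0.25])
    ses = np.array([0.12, 0.08, 0.15])

    weights = 1.0 / ses**2                     # inverse-variance weights
    pooled = np.sum(weights * effects) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    print(f"pooled effect {pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
    ```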

  11. Bioinformatics tools for quantitative and functional metagenome and metatranscriptome data analysis in microbes.

    PubMed

    Niu, Sheng-Yong; Yang, Jinyu; McDermaid, Adam; Zhao, Jing; Kang, Yu; Ma, Qin

    2017-05-08

    Metagenomic and metatranscriptomic sequencing approaches are increasingly being used to link microbiota to important diseases and ecological changes. Many analyses have been used to compare the taxonomic and functional profiles of microbiota across habitats or individuals. While a large portion of metagenomic analyses focus on species-level profiling, some studies use strain-level metagenomic analyses to investigate the relationship between specific strains and certain circumstances. Metatranscriptomic analysis provides another important insight into the activities of genes by examining the gene expression levels of microbiota. Hence, combining metagenomic and metatranscriptomic analyses helps in understanding the activity or enrichment of a given gene set, such as drug-resistance genes, among microbiome samples. Here, we summarize existing bioinformatics tools for metagenomic and metatranscriptomic data analysis, with the aim of assisting researchers in choosing the appropriate tools for their microbiome studies. Additionally, we propose an Integrated Meta-Function mapping pipeline to incorporate various reference databases and accelerate functional gene mapping procedures for both metagenomic and metatranscriptomic analyses. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. State of the art in bile analysis in forensic toxicology.

    PubMed

    Bévalot, F; Cartiser, N; Bottinelli, C; Guitton, J; Fanton, L

    2016-02-01

    In forensic toxicology, alternative matrices to blood are useful in cases of limited, unavailable or unusable blood samples, suspected postmortem redistribution or a long drug intake-to-sampling interval. The present article provides an update on the state of knowledge for the use of bile in forensic toxicology, through a review of the Medline literature from 1970 to May 2015. Bile physiology and technical aspects of analysis (sampling, storage, sample preparation and analytical methods) are reported to highlight specificities and consequences from an analytical and interpretative point of view. A table summarizes the cause of death and the quantification in bile and blood of 133 compounds from more than 200 case reports, providing a useful tool for forensic physicians and toxicologists involved in interpreting bile analysis. Qualitative and quantitative interpretation is discussed. As bile/blood concentration ratios are high for numerous molecules or metabolites, bile is a matrix of choice for screening when blood concentrations are low or non-detectable: e.g., cases of weak exposure or a long intake-to-death interval. Quantitative applications have been little investigated, but small molecules with low bile/blood concentration ratios seem to be good candidates for quantitative bile-based interpretation. Further experimental data on the mechanism and properties of biliary extraction of xenobiotics of forensic interest are required to improve quantitative interpretation. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Quantitative contrast enhanced magnetic resonance imaging for the evaluation of peripheral arterial disease: a comparative study versus standard digital angiography.

    PubMed

    Pavlovic, Chris; Futamatsu, Hideki; Angiolillo, Dominick J; Guzman, Luis A; Wilke, Norbert; Siragusa, Daniel; Wludyka, Peter; Percy, Robert; Northrup, Martin; Bass, Theodore A; Costa, Marco A

    2007-04-01

    The purpose of this study is to evaluate the accuracy of semiautomated analysis of contrast enhanced magnetic resonance angiography (MRA) in patients who have undergone standard angiographic evaluation for peripheral vascular disease (PVD). Magnetic resonance angiography is an important tool for evaluating PVD. Although this technique is both safe and noninvasive, the accuracy and reproducibility of quantitative measurements of disease severity using MRA in the clinical setting have not been fully investigated. 43 lesions in 13 patients who underwent both MRA and digital subtraction angiography (DSA) of iliac and common femoral arteries within 6 months were analyzed using quantitative magnetic resonance angiography (QMRA) and quantitative vascular analysis (QVA). Analysis was repeated by a second operator, and by the same operator approximately 1 month later. QMRA underestimated percent diameter stenosis (%DS) compared to measurements made with QVA by 2.47%. Limits of agreement between the two methods were +/- 9.14%. Interobserver variability in measurements of %DS was +/- 12.58% for QMRA and +/- 10.04% for QVA. Intraobserver variability of %DS was +/- 4.6% for QMRA and +/- 8.46% for QVA. QMRA displays a high level of agreement with QVA when used to determine stenosis severity in iliac and common femoral arteries. Similar levels of interobserver and intraobserver variability are present with each method. Overall, QMRA represents a useful method to quantify severity of PVD.
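
    The bias and limits of agreement quoted above are the output of a standard Bland-Altman analysis, which is simple to reproduce. A minimal sketch (numpy assumed; the paired %DS measurements are hypothetical):

        import numpy as np

        def bland_altman(a, b):
            """Mean bias and 95% limits of agreement between two methods."""
            diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, (bias - half_width, bias + half_width)

        qmra = [48.2, 61.5, 30.1, 75.4, 52.7]   # %DS by QMRA (hypothetical)
        qva = [50.0, 63.8, 33.5, 77.0, 54.1]    # %DS by QVA (hypothetical)
        bias, limits = bland_altman(qmra, qva)
        print(bias, limits)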

  14. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts.
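
    Two widely used correction schemes are total-sum scaling and probabilistic quotient normalization (PQN). A minimal sketch of both (numpy assumed; X is a samples-by-features matrix of strictly positive intensities):

        import numpy as np

        def total_sum_normalize(X):
            """Scale each sample (row) to unit total signal."""
            X = np.asarray(X, dtype=float)
            return X / X.sum(axis=1, keepdims=True)

        def pqn_normalize(X):
            """Probabilistic quotient normalization against a median reference."""
            X = total_sum_normalize(X)
            reference = np.median(X, axis=0)         # median reference spectrum
            quotients = X / reference                # per-feature quotients
            dilution = np.median(quotients, axis=1)  # most probable dilution factor
            return X / dilution[:, None]

    PQN is often preferred when a few dominant metabolites would otherwise distort a simple total-sum correction.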

  15. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  16. Analysis of DNA interactions using single-molecule force spectroscopy.

    PubMed

    Ritzefeld, Markus; Walhorn, Volker; Anselmetti, Dario; Sewald, Norbert

    2013-06-01

    Protein-DNA interactions are involved in many biochemical pathways and determine the fate of the corresponding cell. Qualitative and quantitative investigations of these recognition and binding processes are of key importance for an improved understanding of biochemical processes and also for systems biology. This review article focuses on atomic force microscopy (AFM)-based single-molecule force spectroscopy and its application to the quantification of forces and binding mechanisms that lead to the formation of protein-DNA complexes. AFM and dynamic force spectroscopy are exciting tools that allow for quantitative analysis of biomolecular interactions. Besides an overview of the method and the most important immobilization approaches, the physical basics of the data evaluation are described. Recent applications of AFM-based force spectroscopy to investigate DNA intercalation, complexes involving DNA aptamers, and peptide- and protein-DNA interactions are given.
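
    In dynamic force spectroscopy, the standard evaluation follows the Bell-Evans picture, in which the most probable rupture force rises linearly with the logarithm of the loading rate, F* = (kBT/x_b) ln(r x_b / (k_off kBT)). A minimal fitting sketch (numpy assumed; the rupture forces and loading rates are hypothetical):

        import numpy as np

        kBT = 4.114  # thermal energy at 25 C, in pN*nm

        loading_rates = np.array([1e2, 1e3, 1e4, 1e5])       # pN/s (hypothetical)
        rupture_forces = np.array([35.0, 48.0, 61.0, 74.0])  # pN (hypothetical)

        # F* is linear in ln(r): the slope gives x_b, the intercept gives k_off
        slope, intercept = np.polyfit(np.log(loading_rates), rupture_forces, 1)
        x_b = kBT / slope                                   # barrier width, nm
        k_off = (1.0 / slope) * np.exp(-intercept / slope)  # zero-force off-rate,
        print(x_b, k_off)                                   # 1/s for r in pN/s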

  17. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized.

  18. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
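
    The rank product statistic extends naturally to incomplete data by taking the geometric mean over whichever replicate ranks are actually observed. The sketch below illustrates that idea in Python (numpy assumed; this is not the authors' improved R implementation, and significance would normally be assessed by permutation):

        import numpy as np

        def rank_product(logfc):
            """Rank product per feature (rows) across replicates (columns).
            Ranks are computed within each replicate on observed values only,
            and each feature is scored by the geometric mean of its available
            ranks; features missing in every replicate come out as NaN."""
            X = np.asarray(logfc, dtype=float)
            ranks = np.full(X.shape, np.nan)
            for k in range(X.shape[1]):
                col = X[:, k]
                valid = ~np.isnan(col)
                order = np.argsort(-col[valid])           # descending fold change
                r = np.empty(valid.sum())
                r[order] = np.arange(1, valid.sum() + 1)  # rank 1 = strongest increase
                ranks[valid, k] = r
            return np.exp(np.nanmean(np.log(ranks), axis=1))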

  19. Ultrafast Screening and Quantitation of Pesticides in Food and Environmental Matrices by Solid-Phase Microextraction-Transmission Mode (SPME-TM) and Direct Analysis in Real Time (DART).

    PubMed

    Gómez-Ríos, Germán Augusto; Gionfriddo, Emanuela; Poole, Justen; Pawliszyn, Janusz

    2017-07-05

    The direct interface of microextraction technologies to mass spectrometry (MS) has unquestionably revolutionized the speed and efficacy at which complex matrices are analyzed. Solid-phase microextraction-transmission mode (SPME-TM) is a technology conceived as an effective synergy between sample preparation and ambient ionization. Succinctly, the device consists of a mesh coated with polymeric particles that extracts analytes of interest present in a given sample matrix. This coated mesh acts as a transmission-mode substrate for Direct Analysis in Real Time (DART), allowing for rapid and efficient thermal desorption/ionization of analytes previously concentrated on the coating, and dramatically lowering the limits of detection attained by DART analysis alone. In this study, we present SPME-TM as a novel tool for the ultrafast enrichment of pesticides present in food and environmental matrices and their quantitative determination by MS via DART ionization. Limits of quantitation in the subnanogram per milliliter range can be attained, while total analysis time does not exceed 2 min per sample. In addition to target information obtained via tandem MS, retrospective studies of the same sample via high-resolution mass spectrometry (HRMS) were accomplished by thermally desorbing a different segment of the microextraction device.

  20. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for identification of single and combinatorial histone modifications. MS has now surpassed antibody-based strategies owing to its automation, high resolution, and accurate quantitation. Moreover, multiple approaches to analysis have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the state-of-the-art workflow for histone analysis is highly reliable in terms of identification and quantitation accuracy, and that it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets.

  1. Investigating the quality of mental models deployed by undergraduate engineering students in creating explanations: The case of thermally activated phenomena

    NASA Astrophysics Data System (ADS)

    Fazio, Claudio; Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2013-12-01

    This paper describes a method aimed at pointing out the quality of the mental models undergraduate engineering students deploy when asked to create explanations for phenomena or processes and/or use a given model in the same context. Student responses to a specially designed written questionnaire are quantitatively analyzed using researcher-generated categories of reasoning, based on the physics education research literature on student understanding of the relevant physics content. The use of statistical implicative analysis tools allows us to successfully identify clusters of students with respect to the similarity to the reasoning categories, defined as “practical or everyday,” “descriptive,” or “explicative.” Through the use of similarity and implication indexes our method also enables us to study the consistency in students’ deployment of mental models. A qualitative analysis of interviews conducted with students after they had completed the questionnaire is used to clarify some aspects which emerged from the quantitative analysis and validate the results obtained. Some implications of this joint use of quantitative and qualitative analysis for the design of a learning environment focused on the understanding of some aspects of the world at the level of causation and mechanisms of functioning are discussed.

  2. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, and will also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  3. Automated quantitative gait analysis during overground locomotion in the rat: its application to spinal cord contusion and transection injuries.

    PubMed

    Hamers, F P; Lankhorst, A J; van Laar, T J; Veldhuis, W B; Gispen, W H

    2001-02-01

    Analysis of locomotion is an important tool in the study of peripheral and central nervous system damage. Most locomotor scoring systems in rodents are based either upon open-field locomotion assessment (for example, the BBB score) or upon footprint analysis. The former yields a semiquantitative description of locomotion as a whole, whereas the latter generates quantitative data on several selected gait parameters. In this paper, we describe the use of a newly developed gait analysis method that allows easy quantitation of a large number of locomotion parameters during walkway crossing. We were able to extract data on interlimb coordination, swing duration, paw print areas (total over stance, and at 20-msec time resolution), stride length, and base of support. Similar data cannot be gathered by any single previously described method. We compare changes in gait parameters induced by two different models of spinal cord injury in rats: transection of the dorsal half of the spinal cord, and spinal cord contusion injury induced by the NYU or MASCIS device. Although we applied this method to rats with spinal cord injury, its usefulness is not limited to rats or to the investigation of spinal cord injuries alone.

  4. High and low frequency unfolded partial least squares regression based on empirical mode decomposition for quantitative analysis of fuel oil samples.

    PubMed

    Bian, Xihui; Li, Shujuan; Lin, Ligang; Tan, Xiaoyao; Fan, Qingjie; Li, Ming

    2016-06-21

    Accurate prediction of the model is fundamental to the successful analysis of complex samples. To exploit the abundant information embedded in both the frequency and time domains, a novel regression model is presented for quantitative analysis of hydrocarbon contents in fuel oil samples. The proposed method, named high and low frequency unfolded PLSR (HLUPLSR), integrates empirical mode decomposition (EMD) and an unfolding strategy with partial least squares regression (PLSR). In the proposed method, the original signals are first decomposed by EMD into a finite number of intrinsic mode functions (IMFs) and a residue. Second, the former high-frequency IMFs are summed into a high-frequency matrix, and the latter IMFs and the residue are summed into a low-frequency matrix. Finally, the two matrices are unfolded into an extended matrix along the variable dimension, and the PLSR model is built between the extended matrix and the target values. Coupled with ultraviolet (UV) spectroscopy, HLUPLSR has been applied to determine hydrocarbon contents of light gas oil and diesel fuel samples. Compared with single PLSR and other signal processing techniques, the proposed method shows superior prediction ability and better model interpretation. HLUPLSR therefore provides a promising tool for quantitative analysis of complex samples.
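
    The pipeline is straightforward to prototype with off-the-shelf components. The sketch below follows the decompose, split, unfold, regress recipe described above (hypothetical variable names; assumes numpy, scikit-learn, and the third-party PyEMD package, distributed on PyPI as EMD-signal; this is an illustration, not the authors' code):

        import numpy as np
        from PyEMD import EMD
        from sklearn.cross_decomposition import PLSRegression

        def hluplsr_features(spectra, n_high=2):
            """EMD-decompose each spectrum, sum the first n_high IMFs into a
            high-frequency part and the remaining IMFs (plus residue) into a
            low-frequency part, then unfold the two parts side by side."""
            emd = EMD()
            rows = []
            for x in spectra:
                imfs = emd(np.asarray(x, dtype=float))  # highest frequency first
                high = imfs[:n_high].sum(axis=0)
                low = imfs[n_high:].sum(axis=0)         # includes the residue/trend
                rows.append(np.concatenate([high, low]))
            return np.array(rows)

        # model = PLSRegression(n_components=5).fit(hluplsr_features(X_cal), y_cal)
        # y_hat = model.predict(hluplsr_features(X_test))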

  5. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    PubMed

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach.

  6. A software tool for automatic classification and segmentation of 2D/3D medical images

    NASA Astrophysics Data System (ADS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-02-01

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.

  7. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
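
    In practice, the quantitation step amounts to converting measured centroid masses into a deuterium uptake relative to the undeuterated and fully deuterated controls. A minimal sketch (plain Python; the centroid values are hypothetical):

        def relative_uptake(m_t, m_undeuterated, m_fully_deuterated):
            """Relative deuterium uptake: 0 at the undeuterated control,
            1 at the fully deuterated (back-exchange) control."""
            return (m_t - m_undeuterated) / (m_fully_deuterated - m_undeuterated)

        # Hypothetical peptide centroid masses (Da) at one labeling time point
        print(relative_uptake(1254.93, 1252.61, 1258.42))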

  8. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  9. Quantitative and qualitative approaches in educational research — problems and examples of controlled understanding through interpretive methods

    NASA Astrophysics Data System (ADS)

    Neumann, Karl

    1987-06-01

    In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, research generally has to start out from complex reciprocal social interactions rather than unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages of, and the necessity for, using qualitative research tools.

  10. Quantitative evaluation of contrast-enhanced ultrasound after intravenous administration of a microbubble contrast agent for differentiation of benign and malignant thyroid nodules: assessment of diagnostic accuracy.

    PubMed

    Nemec, Ursula; Nemec, Stefan F; Novotny, Clemens; Weber, Michael; Czerny, Christian; Krestan, Christian R

    2012-06-01

    To investigate the diagnostic accuracy, through quantitative analysis, of contrast-enhanced ultrasound (CEUS), using a microbubble contrast agent, in the differentiation of thyroid nodules. This prospective study enrolled 46 patients with solitary, scintigraphically non-functional thyroid nodules. These patients were scheduled for surgery and underwent preoperative CEUS with pulse-inversion harmonic imaging after intravenous microbubble contrast medium administration. Using histology as a standard of reference, time-intensity curves of benign and malignant nodules were compared by means of peak enhancement and wash-out enhancement relative to the baseline intensity using a mixed model ANOVA. ROC analysis was performed to assess the diagnostic accuracy in the differentiation of benign and malignant nodules on CEUS. The complete CEUS data of 42 patients (31/42 [73.8%] benign and 11/42 [26.2%] malignant nodules) revealed a significant difference (P < 0.001) in enhancement between benign and malignant nodules. Furthermore, based on ROC analysis, CEUS demonstrated sensitivity of 76.9%, specificity of 84.8% and accuracy of 82.6%. Quantitative analysis of CEUS using a microbubble contrast agent allows the differentiation of benign and malignant thyroid nodules and may potentially serve, in addition to grey-scale and Doppler ultrasound, as an adjunctive tool in the assessment of patients with thyroid nodules. • Contrast-enhanced ultrasound (CEUS) helps differentiate between benign and malignant thyroid nodules. • Quantitative CEUS analysis yields sensitivity of 76.9% and specificity of 84.8%. • CEUS may be a potentially useful adjunct in assessing thyroid nodules.
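
    Figures such as the sensitivity, specificity and accuracy above fall out of a standard ROC analysis of the quantitative enhancement values against the histological reference. A minimal sketch (scikit-learn assumed; the values, and the assumed direction that malignant nodules enhance less, are hypothetical):

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        y_true = np.array([0, 0, 0, 1, 1, 0, 1, 0])  # 1 = malignant at histology
        enhancement = np.array([5.1, 7.9, 6.2, 2.0, 1.4, 8.8, 3.1, 6.9])
        score = -enhancement  # assumed: lower enhancement -> more suspicious

        auc = roc_auc_score(y_true, score)
        fpr, tpr, thresholds = roc_curve(y_true, score)
        best = np.argmax(tpr - fpr)                  # Youden's J operating point
        sensitivity, specificity = tpr[best], 1 - fpr[best]
        print(auc, sensitivity, specificity)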

  11. Fabrication of shape-memory thin films and the effects of ion irradiation

    NASA Astrophysics Data System (ADS)

    Goldberg, Florent

    1998-09-01

    Nickel and titanium when combined in the right stoichiometric proportion (1:1) can form alloys showing the shape memory effect. Within the scope of this thesis, thin films of such alloys have been successfully produced by sputtering. Precise control of composition is crucial in order to obtain the shape memory effect. A combination of analytical tools which can accurately determine the behavior of such materials is also required (calorimetric analysis, crystallography, composition analysis, etc.). Rutherford backscattering spectrometry has been used for quantitative composition analysis. Thereafter irradiation of films with light ions (He+) of few MeV was shown to allow lowering of the characteristic premartensitic transformation temperatures while preserving the shape memory effect. Those results open the door to a new field of research, particularly for ion irradiation and its potential use as a tool to modify the thermomechanical behavior of shape memory thin film actuators.

  12. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
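
    A single frame of such an analysis reduces to segmenting the spheroid and measuring its morphology; tracking the measurements over frames yields the growth and invasion dynamics. A minimal per-frame sketch (scikit-image assumed; not the TASI code itself, and it presumes a bright spheroid on a dark background):

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.measure import label, regionprops

        def spheroid_area(frame):
            """Otsu-threshold the frame and return the largest object's area."""
            mask = frame > threshold_otsu(frame)
            regions = regionprops(label(mask))
            return max((r.area for r in regions), default=0)

        # areas = [spheroid_area(f) for f in frames]  # growth curve over time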

  13. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.
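
    The Python side of the API is exposed through the pyopenms package. A small sketch of a typical first step, loading an mzML file and inspecting its spectra (the file name is a placeholder):

        import pyopenms as oms

        exp = oms.MSExperiment()
        oms.MzMLFile().load("sample.mzML", exp)   # placeholder path

        spectra = exp.getSpectra()
        n_ms1 = sum(1 for s in spectra if s.getMSLevel() == 1)
        n_ms2 = sum(1 for s in spectra if s.getMSLevel() == 2)
        mz, intensity = spectra[0].get_peaks()    # numpy peak arrays
        print(n_ms1, n_ms2, len(mz))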

  14. Mass Spectrometry-Based Metabolomics

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites that define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  15. Advances in the quantification of mitochondrial function in primary human immune cells through extracellular flux analysis.

    PubMed

    Nicholas, Dequina; Proctor, Elizabeth A; Raval, Forum M; Ip, Blanche C; Habib, Chloe; Ritou, Eleni; Grammatopoulos, Tom N; Steenkamp, Devin; Dooms, Hans; Apovian, Caroline M; Lauffenburger, Douglas A; Nikolajczyk, Barbara S

    2017-01-01

    Numerous studies show that mitochondrial energy generation determines the effectiveness of immune responses. Furthermore, changes in mitochondrial function may regulate lymphocyte function in inflammatory diseases like type 2 diabetes. Analysis of lymphocyte mitochondrial function has been facilitated by introduction of 96-well format extracellular flux (XF96) analyzers, but the technology remains imperfect for analysis of human lymphocytes. Limitations in XF technology include the lack of practical protocols for analysis of archived human cells, and inadequate data analysis tools that require manual quality checks. Current analysis tools for XF outcomes are also unable to automatically assess data quality and delete untenable data from the relatively high number of biological replicates needed to power complex human cell studies. The objectives of work presented herein are to test the impact of common cellular manipulations on XF outcomes, and to develop and validate a new automated tool that objectively analyzes a virtually unlimited number of samples to quantitate mitochondrial function in immune cells. We present significant improvements on previous XF analyses of primary human cells that will be absolutely essential to test the prediction that changes in immune cell mitochondrial function and fuel sources support immune dysfunction in chronic inflammatory diseases like type 2 diabetes.

  16. Challenges in Higher Education Research: The Use of Quantitative Tools in Comparative Analyses

    ERIC Educational Resources Information Center

    Reale, Emanuela

    2014-01-01

    Although the value of the comparative perspective for the study of higher education is widely recognised, there is little consensus about the specific methodological approaches. Quantitative tools are relevant for addressing comparative analyses since they are supposed to reduce the complexity, finding out and graduating similarities…

  17. Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study

    ERIC Educational Resources Information Center

    Clark, Robert Lynn

    2014-01-01

    The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small south-central, family-owned pharmaceutical organization, with multiple dispersed locations located in the United States. The results of the current research study could assist leaders to develop a communication…

  18. Simulations of Carnival Rides and Rube Goldberg Machines for the Visualization of Concepts of Statics and Dynamics

    ERIC Educational Resources Information Center

    Howard, William; Williams, Richard; Yao, Jason

    2010-01-01

    Solid modeling is widely used as a teaching tool in summer activities with high school students. The addition of motion analysis allows concepts from statics and dynamics to be introduced to students in both qualitative and quantitative ways. Two sets of solid modeling projects--carnival rides and Rube Goldberg machines--are shown to allow the…

  19. Software for Automated Image-to-Image Co-registration

    NASA Technical Reports Server (NTRS)

    Benkelman, Cody A.; Hughes, Heidi

    2007-01-01

    The project objectives are: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.

  20. FT. Sam 91 Whiskey Combat Medic Medical Simulation Training Quantitative Integration Enhancement Program

    DTIC Science & Technology

    2011-07-01

    …joined the project team in the statistical and research coordination role; Dr. Collin is an employee of the University of Pittsburgh. A successful… 3. Submit to Ft. Detrick. Completed milestone: statistical analysis planning. 1. Review planned data metrics and data gathering tools… approach to performance assessment for continuous quality improvement. Analyzing data with modern statistical techniques to determine the…

  1. Web-based automation of green building rating index and life cycle cost analysis

    NASA Astrophysics Data System (ADS)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed the adoption of green-certified buildings and lowered investors' interest in them because of their higher initial costs. It is therefore essential to draw investors' attention toward further development of green buildings through automated tools for construction projects. However, there is a historical dearth of automation for green building rating tools, an essential gap that motivates the development of an automated, computerized tool. This paper presents proposed research aimed at developing an integrated, web-based, automated program that combines a green building rating assessment tool, green technology and life cycle cost (LCC) analysis. The research also aims to identify the variables of MyCrest and LCC to be integrated and developed into a framework, then transformed into an automated program. A mixed qualitative and quantitative survey methodology describes the plan for carrying the MyCrest-LCC integration to an automated level. A preliminary literature review enriches the understanding of Green Building Rating Tool (GBRT) integration with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute toward green buildings and future agendas.
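
    The LCC half of the integration is, at its core, a discounted cash-flow calculation. A minimal present-value sketch (plain Python; the discount rate, horizon and cost figures are illustrative only):

        def life_cycle_cost(initial, annual_costs, discount_rate):
            """Initial cost plus discounted future annual costs (net present value)."""
            return initial + sum(
                cost / (1.0 + discount_rate) ** year
                for year, cost in enumerate(annual_costs, start=1)
            )

        lcc = life_cycle_cost(initial=1_500_000,
                              annual_costs=[40_000] * 25,  # 25-year service life
                              discount_rate=0.05)
        print(round(lcc))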

  2. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis, which, for example, gives new insight into the signalling of cellular water permeability and into cell morphology changes due to toxins and infections. Quantitative DHM phase contrast also opens prospective application fields in the analysis of dissected tissues, through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and on common staining agents. As the reconstruction in DHM is performed numerically, multi-focus imaging is achieved from a single digital hologram. We therefore evaluated the automated refocusing feature of DHM on different types of dissected tissues and found that highly reproducible holographic autofocusing can be achieved on moderately stained samples. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the use of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.

  3. C-reactive protein estimation: a quantitative analysis for three nonsteroidal anti-inflammatory drugs: a randomized control trial.

    PubMed

    Salgia, Gaurav; Kulkarni, Deepak G; Shetty, Lakshmi

    2015-01-01

    C-reactive protein (CRP) estimation was used as a quantitative measure of the anti-inflammatory action of nonsteroidal anti-inflammatory drugs (NSAIDs) after maxillofacial surgery. The aim of this study was to evaluate the efficacy of CRP as an objective, quantitative measure of the efficacy of three NSAIDs in postoperative inflammation and pain control. A randomized, parallel study group design was used. Sixty patients were divided into three groups. CRP was evaluated at baseline and postoperatively (immediately and at 72 h) after surgical removal of an impacted lower third molar. Each group received its drug, assigned by random coding, postoperatively. Pain control and inflammation after surgical removal of the impacted lower third molar were assessed qualitatively and quantitatively with CRP levels. Blood samples were assessed immediately postoperatively and after 72 h. The visual analog scale (VAS) was used for assessment of pain and its correlation with CRP levels. The difference between immediate postoperative and baseline CRP levels was significant (P < 0.05). The association of CRP levels with duration of surgery was nonsignificant (P = 0.425). The pain score increased with mefenamic acid (P = 0.003), which was significant on the VAS. Diclofenac had the best anti-inflammatory action. There was a significant increase in CRP levels immediately postoperatively and at 72 h. The CRP test proved to be a useful quantitative indicator for monitoring postsurgical inflammation and the therapeutic effects of various anti-inflammatory drugs, and for the comparative evaluation of NSAIDs.

  4. A Web-Based Decision Tool to Improve Contraceptive Counseling for Women With Chronic Medical Conditions: Protocol For a Mixed Methods Implementation Study

    PubMed Central

    Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin IV, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W

    2018-01-01

    Background: Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling.

    Objective: The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool.

    Methods: This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics, and relationships between participant characteristics and contraceptive recommendations (for PCPs) and current contraceptive use (for patients) will be examined using the Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention.

    Results: We are currently enrolling practices and anticipate study completion in 15 months.

    Conclusions: This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection.

    Trial Registration: ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8) PMID:29669707

  5. Precise quantitation of 136 urinary proteins by LC/MRM-MS using stable isotope labeled peptides as internal standards for biomarker discovery and/or verification studies.

    PubMed

    Percy, Andrew J; Yang, Juncong; Hardie, Darryl B; Chambers, Andrew G; Tamura-Wells, Jessica; Borchers, Christoph H

    2015-06-15

    Spurred on by the growing demand for panels of validated disease biomarkers, increasing efforts have focused on advancing qualitative and quantitative tools for more highly multiplexed and sensitive analyses of a multitude of analytes in various human biofluids. In quantitative proteomics, evolving strategies involve the use of the targeted multiple reaction monitoring (MRM) mode of mass spectrometry (MS) with stable isotope-labeled standards (SIS) used for internal normalization. Using that preferred approach with non-invasive urine samples, we have systematically advanced and rigorously assessed the methodology toward the precise quantitation of the largest, multiplexed panel of candidate protein biomarkers in human urine to date. The concentrations of the 136 proteins span >5 orders of magnitude (from 8.6 μg/mL to 25 pg/mL), with average CVs of 8.6% over process triplicate. Detailed here is our quantitative method, the analysis strategy, a feasibility application to prostate cancer samples, and a discussion of the utility of this method in translational studies.
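
    Quantitation in such MRM assays typically proceeds through a calibration curve of the analyte-to-SIS peak-area ratio against known concentrations, followed by interpolation of the unknowns. A minimal sketch (numpy assumed; all numbers are hypothetical):

        import numpy as np

        cal_conc = np.array([0.025, 0.1, 0.5, 2.0, 10.0])      # ng/mL standards
        cal_ratio = np.array([0.012, 0.05, 0.24, 1.01, 4.92])  # analyte/SIS areas

        slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)  # assumes linearity

        def quantify(area_analyte, area_sis):
            """Back-calculate concentration (ng/mL) from a measured area ratio."""
            return (area_analyte / area_sis - intercept) / slope

        print(quantify(5321.0, 11876.0))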

  6. Combining U.S.-based prioritization tools to improve screening level accountability for environmental impact: the case of the chemical manufacturing industry.

    PubMed

    Zhou, Xiaoying; Schoenung, Julie M

    2009-12-15

    There are two quantitative indicators that are most widely used to assess the extent of compliance of industrial facilities with environmental regulations: the quantity of hazardous waste generated and the amount of toxics released. These indicators, albeit useful in terms of some environmental monitoring, fail to account for direct or indirect effects on human and environmental health, especially when aggregating total quantity of releases for a facility or industry sector. Thus, there is a need for a more comprehensive approach that can prioritize a particular chemical (or industry sector) on the basis of its relevant environmental performance and impact on human health. Accordingly, the objective of the present study is to formulate an aggregation of tools that can simultaneously capture multiple effects and several environmental impact categories. This approach allows us to compare and combine results generated with the aid of select U.S.-based quantitative impact assessment tools, thereby supplementing compliance-based metrics such as data from the U.S. Toxic Release Inventory. A case study, which presents findings for the U.S. chemical manufacturing industry, is presented to illustrate the aggregation of these tools. Environmental impacts due to both upstream and manufacturing activities are also evaluated for each industry sector. The proposed combinatorial analysis allows for a more robust evaluation for rating and prioritizing the environmental impacts of industrial waste.

  7. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    PubMed

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution because it provides a good fit to many empirical data sets. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point that is usually overlooked in most of the literature in the area). The analysis allows one to identify outliers in empirical datasets and to judiciously determine whether there is a need for data trimming and, if so, at which points it should be done.
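
    As a generic illustration of ex-Gaussian fitting (using SciPy's exponnorm distribution rather than the ExGUtils API itself; exponnorm's shape K, loc and scale map to the usual parameters as mu = loc, sigma = scale, tau = K * scale):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Simulated reaction times (ms): Gaussian component plus exponential tail
        rts = rng.normal(400, 40, 500) + rng.exponential(120, 500)

        K, loc, scale = stats.exponnorm.fit(rts)   # maximum-likelihood fit
        mu, sigma, tau = loc, scale, K * scale
        print(f"mu = {mu:.1f} ms, sigma = {sigma:.1f} ms, tau = {tau:.1f} ms")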

  8. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    PubMed Central

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution because it provides a good fit to many empirical data sets. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point that is usually overlooked in most of the literature in the area). The analysis allows one to identify outliers in empirical datasets and to judiciously determine whether there is a need for data trimming and, if so, at which points it should be done. PMID:29765345

  9. Nuclear magnetic resonance and high-performance liquid chromatography techniques for the characterization of bioactive compounds from Humulus lupulus L. (hop).

    PubMed

    Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica

    2018-06-01

    Humulus lupulus L. (hop) represents one of the most cultivated crops, as it is a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, making this plant gain more interest in the field of pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this context, the present study was aimed at the development of a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for the hop samples by means of the two aforementioned techniques showed that the amounts of bioactive compounds were slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.

  10. Frequency analysis for modulation-enhanced powder diffraction.

    PubMed

    Chernyshov, Dmitry; Dyadkin, Vadim; van Beek, Wouter; Urakawa, Atsushi

    2016-07-01

    Periodic modulation of external conditions on a crystalline sample with a consequent analysis of periodic diffraction response has been recently proposed as a tool to enhance experimental sensitivity for minor structural changes. Here the intensity distributions for both a linear and nonlinear structural response induced by a symmetric and periodic stimulus are analysed. The analysis is further extended for powder diffraction when an external perturbation changes not only the intensity of Bragg lines but also their positions. The derived results should serve as a basis for a quantitative modelling of modulation-enhanced diffraction data measured in real conditions.

  11. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.

  12. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  13. Chemical Fingerprint and Quantitative Analysis for the Quality Evaluation of Docynia dcne Leaves by High-Performance Liquid Chromatography Coupled with Chemometrics Analysis.

    PubMed

    Zhang, Xiaoyu; Mei, Xueran; Wang, Zhanguo; Wu, Jing; Liu, Gang; Hu, Huiling; Li, Qijuan

    2018-05-24

    Docynia dcne leaf, from the genus Docynia Dcne (comprising the three species Docynia delavayi, Docynia indica and Docynia longiunguis), is an important raw material for local ethnic-minority tea, ethnomedicines and food supplements in southwestern China. However, D. dcne leaves from these three species are often confused with one another, which could compromise their therapeutic effect. A rapid and effective method combining chemical fingerprinting and quantitative analysis was established to evaluate the quality of D. dcne leaves. Chemometric methods, including similarity analysis, hierarchical cluster analysis and partial least-squares discriminant analysis, were applied to distinguish 30 batches of D. dcne leaf samples from the three species. The results validated each other and successfully grouped the samples into three categories closely related to the species of origin. Moreover, isoquercitrin and phlorizin were screened as chemical markers to evaluate the quality of D. dcne leaves from different species, and their contents varied remarkably across the samples, with ranges of 6.41-38.84 and 95.73-217.76 mg/g, respectively. All the results indicated that chemical fingerprinting coupled with chemometric analysis and quantitative assessment is a powerful and beneficial tool for the quality control of D. dcne leaves, and could also be applied to the differentiation and quality control of other herbal preparations.
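
    For readers unfamiliar with the chemometric steps named above, the sketch below runs a generic hierarchical cluster analysis and PLS-DA on synthetic fingerprint data. It illustrates the shape of the workflow only and does not reproduce the authors' models or data:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    # Hypothetical fingerprints: 30 batches x 20 chromatographic peak areas,
    # with three species separated in two marker peaks
    X = rng.normal(size=(30, 20))
    X[10:20, :2] += 3.0
    X[20:, :2] -= 3.0
    species = np.repeat([0, 1, 2], 10)

    # Hierarchical cluster analysis (Ward linkage), cut into three groups
    clusters = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

    # PLS-DA: regress one-hot class membership on the fingerprints
    Y = np.eye(3)[species]
    pls = PLSRegression(n_components=2).fit(X, Y)
    predicted = pls.predict(X).argmax(axis=1)
    print(clusters, (predicted == species).mean())
    ```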

  14. Analysis of High-Throughput ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
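
    A central data-processing step in quantitative ELISA microarray analysis is fitting a calibration curve per antigen and back-calculating concentrations from spot intensities. The review does not prescribe a particular model; the sketch below uses a four-parameter logistic fit, a common choice, on invented standards:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, bottom, top, ec50, hill):
        """Four-parameter logistic calibration curve (increasing for hill > 0)."""
        return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

    def back_calc(y, bottom, top, ec50, hill):
        """Back-calculate concentration from a measured spot intensity."""
        return ec50 / ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill)

    # Hypothetical standards: concentration (ng/mL) vs background-corrected intensity
    conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])
    signal = np.array([12, 30, 95, 310, 820, 1350, 1600], dtype=float)

    popt, _ = curve_fit(four_pl, conc, signal, p0=[10, 1700, 5, 1], maxfev=10000)
    print(back_calc(500.0, *popt))  # concentration of an unknown reading 500
    ```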

  15. "What else are you worried about?" – Integrating textual responses into quantitative social science research

    PubMed Central

    Brümmer, Martin; Schmukle, Stefan C.; Goebel, Jan; Wagner, Gert G.

    2017-01-01

    Open-ended questions have routinely been included in large-scale survey and panel studies, yet there is some perplexity about how to actually incorporate the answers to such questions into quantitative social science research. Tools developed recently in the domain of natural language processing offer a wide range of options for the automated analysis of such textual data, but their implementation has lagged behind. In this study, we demonstrate straightforward procedures that can be applied to process and analyze textual data for the purposes of quantitative social science research. Using more than 35,000 textual answers to the question “What else are you worried about?” from participants of the German Socio-economic Panel Study (SOEP), we (1) analyzed characteristics of respondents that determined whether they answered the open-ended question, (2) used the textual data to detect relevant topics that were reported by the respondents, and (3) linked the features of the respondents to the worries they reported in their textual data. The potential uses as well as the limitations of the automated analysis of textual data are discussed. PMID:28759628
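
    The topic-detection step described in (2) can be approximated with standard natural language processing tooling. The sketch below runs latent Dirichlet allocation on a handful of invented stand-in answers; the SOEP data are access-restricted, and the authors' exact pipeline may differ:

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical stand-ins for open-ended survey answers
    answers = ["worried about my pension and retirement",
               "rising rents and housing costs",
               "my children's future and schools",
               "health of my parents",
               "cost of housing keeps rising",
               "pension will not be enough"]

    counts = CountVectorizer(stop_words="english").fit(answers)
    X = counts.transform(answers)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Print the top terms of each detected topic
    terms = counts.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        print(f"topic {k}:", [terms[i] for i in topic.argsort()[-4:]])
    ```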

  16. High-resolution dynamic imaging and quantitative analysis of lung cancer xenografts in nude mice using clinical PET/CT

    PubMed Central

    Wang, Ying Yi; Wang, Kai; Xu, Zuo Yu; Song, Yan; Wang, Chu Nan; Zhang, Chong Qing; Sun, Xi Lin; Shen, Bao Zhong

    2017-01-01

    Because the availability of dedicated small-animal positron emission tomography/computed tomography is limited, clinical PET/CT may be an acceptable alternative in many situations. This study assessed the feasibility of using clinical PET/CT with [F-18]-fluoro-2-deoxy-D-glucose for high-resolution dynamic imaging and quantitative analysis of cancer xenografts in nude mice. Dynamic clinical PET/CT scans were performed on xenografts for 60 min after injection of [F-18]-fluoro-2-deoxy-D-glucose. Scans were reconstructed with and without the SharpIR method in two phases, and the mice were then sacrificed to extract major organs and tumors, with ex vivo γ-counting used as a reference. Strikingly, both the image quality and the correlation between the quantitative clinical PET/CT data and the ex vivo counting were better with the SharpIR reconstructions than without. Our data demonstrate that a clinical PET/CT scanner with SharpIR reconstruction is a valuable tool for imaging small animals in preclinical cancer research, offering dynamic imaging parameters, good image quality and accurate data quantification. PMID:28881772

  17. High-resolution dynamic imaging and quantitative analysis of lung cancer xenografts in nude mice using clinical PET/CT.

    PubMed

    Wang, Ying Yi; Wang, Kai; Xu, Zuo Yu; Song, Yan; Wang, Chu Nan; Zhang, Chong Qing; Sun, Xi Lin; Shen, Bao Zhong

    2017-08-08

    Because the availability of dedicated small-animal positron emission tomography/computed tomography is limited, clinical PET/CT may be an acceptable alternative in many situations. This study assessed the feasibility of using clinical PET/CT with [F-18]-fluoro-2-deoxy-D-glucose for high-resolution dynamic imaging and quantitative analysis of cancer xenografts in nude mice. Dynamic clinical PET/CT scans were performed on xenografts for 60 min after injection of [F-18]-fluoro-2-deoxy-D-glucose. Scans were reconstructed with and without the SharpIR method in two phases, and the mice were then sacrificed to extract major organs and tumors, with ex vivo γ-counting used as a reference. Strikingly, both the image quality and the correlation between the quantitative clinical PET/CT data and the ex vivo counting were better with the SharpIR reconstructions than without. Our data demonstrate that a clinical PET/CT scanner with SharpIR reconstruction is a valuable tool for imaging small animals in preclinical cancer research, offering dynamic imaging parameters, good image quality and accurate data quantification.

  18. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin, mirabegron, and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared with the monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial-design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or 50 mg (5+25 and 5+50) scored the highest clinical utility and supported phase III development of solifenacin-mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
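
    At its core, an MCDA clinical utility score of the kind described here is a weighted sum of normalized attribute scores. The sketch below is a minimal illustration with invented weights and scores, not the authors' actual model:

    ```python
    # Attribute weights (must sum to 1); risks enter as 1 - normalized severity
    weights = {"efficacy": 0.5, "tolerability": 0.3, "safety": 0.2}

    def clinical_utility(scores, weights):
        """scores: attribute -> value normalized to [0, 1], higher is better."""
        return sum(weights[a] * scores[a] for a in weights)

    monotherapy = {"efficacy": 0.55, "tolerability": 0.80, "safety": 0.90}
    combination = {"efficacy": 0.75, "tolerability": 0.70, "safety": 0.85}
    print(clinical_utility(monotherapy, weights),
          clinical_utility(combination, weights))
    ```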

  19. "What else are you worried about?" - Integrating textual responses into quantitative social science research.

    PubMed

    Rohrer, Julia M; Brümmer, Martin; Schmukle, Stefan C; Goebel, Jan; Wagner, Gert G

    2017-01-01

    Open-ended questions have routinely been included in large-scale survey and panel studies, yet there is some perplexity about how to actually incorporate the answers to such questions into quantitative social science research. Tools developed recently in the domain of natural language processing offer a wide range of options for the automated analysis of such textual data, but their implementation has lagged behind. In this study, we demonstrate straightforward procedures that can be applied to process and analyze textual data for the purposes of quantitative social science research. Using more than 35,000 textual answers to the question "What else are you worried about?" from participants of the German Socio-economic Panel Study (SOEP), we (1) analyzed characteristics of respondents that determined whether they answered the open-ended question, (2) used the textual data to detect relevant topics that were reported by the respondents, and (3) linked the features of the respondents to the worries they reported in their textual data. The potential uses as well as the limitations of the automated analysis of textual data are discussed.

  20. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.

    3-D image analysis, when combined with a non-destructive examination technique such as X-ray computed tomography (CT), provides a highly quantitative tool for the investigation of a material's structure. In this investigation, 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility's low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in the optimization of the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume, with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although the non-uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like, with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact: α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95, with an upper bound of 1.

  1. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE PAGES

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.; ...

    2016-02-05

    3-D image analysis, when combined with a non-destructive examination technique such as X-ray computed tomography (CT), provides a highly quantitative tool for the investigation of a material's structure. In this investigation, 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility's low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in the optimization of the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume, with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although the non-uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like, with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact: α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95, with an upper bound of 1.
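
    The particle-size analysis described in these records can be sketched as connected-component labelling of a segmented CT volume followed by conversion of voxel counts to equivalent-sphere diameters. The example below assumes a binary, isotropically sampled volume and uses synthetic data; it is illustrative, not the authors' pipeline:

    ```python
    import numpy as np
    from scipy import ndimage

    def particle_diameters(volume, voxel_um):
        """Equivalent-sphere diameters (um) of connected particles in a
        binary 3-D array with isotropic voxels of edge length voxel_um."""
        labels, n = ndimage.label(volume)  # connected-component labelling
        voxels = ndimage.sum(volume, labels, index=np.arange(1, n + 1))
        vol_um3 = voxels * voxel_um ** 3
        return (6.0 * vol_um3 / np.pi) ** (1.0 / 3.0)

    rng = np.random.default_rng(1)
    demo = (rng.random((50, 50, 50)) > 0.995).astype(int)  # sparse synthetic "particles"
    d = particle_diameters(demo, voxel_um=5.0)
    print(d.mean(), (d > 44.0).mean())  # mean diameter, oversize fraction
    ```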

  2. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  3. The Road to Reason.

    ERIC Educational Resources Information Center

    Wright, Benjamin D.

    2000-01-01

    Summarizes the distinctions between qualitative and quantitative research and shows their complementary aspects. Shows there is no contradiction or conflict between the qualitative and the quantitative and discusses Rasch measurement as the construction tool of quantitative research. (SLD)

  4. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    PubMed Central

    2011-01-01

    Background Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies if segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps, showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. Biologically relevant chromosomal segments and gene clusters with differential expression during the differentiation toward megakaryocyte were identified. Conclusions TRAM is designed to create, and statistically analyze, quantitative transcriptome maps, based on gene expression data from multiple sources. The release includes FileMaker Pro database management runtime application and it is freely available at http://apollo11.isto.unibo.it/software/, along with preconfigured implementations for mapping of human, mouse and zebrafish transcriptomes. PMID:21333005
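
    As background for the normalization step mentioned above, plain quantile normalization forces every sample onto a common reference distribution; TRAM's scaled-quantile variant for platforms with different gene counts is a modification of this idea and is not reproduced here. A minimal sketch:

    ```python
    import numpy as np

    def quantile_normalize(X):
        """Quantile-normalize a genes x samples matrix: each column is mapped
        onto the mean distribution across columns (ties handled naively)."""
        ranks = X.argsort(axis=0).argsort(axis=0)    # per-sample ranks
        mean_dist = np.sort(X, axis=0).mean(axis=1)  # reference distribution
        return mean_dist[ranks]

    X = np.array([[5.0, 4.0, 3.0],
                  [2.0, 1.0, 4.0],
                  [3.0, 4.0, 6.0],
                  [4.0, 2.0, 8.0]])
    print(quantile_normalize(X))
    ```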

  5. Properties of O dwarf stars in 30 Doradus

    NASA Astrophysics Data System (ADS)

    Sabín-Sanjulián, Carolina; VFTS Collaboration

    2017-11-01

    We perform a quantitative spectroscopic analysis of 105 presumably single O dwarf stars in 30 Doradus, located within the Large Magellanic Cloud. We use mid-to-high resolution multi-epoch optical spectroscopic data obtained within the VLT-FLAMES Tarantula Survey. Stellar and wind parameters are derived by means of the automatic tool iacob-gbat, which is based on a large grid of fastwind models. We also benefit from the Bayesian tool bonnsai to estimate evolutionary masses. We provide a spectral calibration for the effective temperature of O dwarf stars in the LMC, deal with the mass discrepancy problem and investigate the wind properties of the sample.

  6. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a software tool for quantifying perfusion. For these purposes, the fluorescent dye indocyanine green is given intravenously, and after excitation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under defined conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool that avoids complex additional measurement technology.
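
    Perfusion parameters of this kind are typically derived from the time-intensity curve of a vessel region in the fluorescence video. The sketch below computes two common descriptors (time-to-peak and maximum rise slope) on a synthetic indocyanine green curve; the authors' software may use a different parameter set:

    ```python
    import numpy as np

    # Synthetic ICG time-intensity curve: sigmoidal inflow plus camera noise
    t = np.linspace(0, 20, 201)                       # seconds
    rng = np.random.default_rng(2)
    f = 100.0 / (1.0 + np.exp(-(t - 8.0))) + rng.normal(0, 1, t.size)

    baseline = f[:20].mean()                          # pre-inflow level
    peak_idx = f.argmax()
    time_to_peak = t[peak_idx]
    max_slope = np.gradient(f, t)[: peak_idx + 1].max()  # rise phase only
    print(baseline, time_to_peak, max_slope)
    ```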

  7. Visualizing vascular structures in virtual environments

    NASA Astrophysics Data System (ADS)

    Wischgoll, Thomas

    2013-01-01

    In order to learn more about the cause of coronary heart diseases and develop diagnostic tools, the extraction and visualization of vascular structures from volumetric scans for further analysis is an important step. By determining a geometric representation of the vasculature, the geometry can be inspected and additional quantitative data calculated and incorporated into the visualization of the vasculature. To provide a more user-friendly visualization tool, virtual environment paradigms can be utilized. This paper describes techniques for interactive rendering of large-scale vascular structures within virtual environments. This can be applied to almost any virtual environment configuration, such as CAVE-type displays. Specifically, the tools presented in this paper were tested on a Barco I-Space and a large 62x108 inch passive projection screen with a Kinect sensor for user tracking.

  8. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    PubMed Central

    Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028

  9. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis

    PubMed Central

    Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2009-01-01

    DNA methylation is an indispensible epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301

  10. The current role of high-resolution mass spectrometry in food analysis.

    PubMed

    Kaufmann, Anton

    2012-05-01

    High-resolution mass spectrometry (HRMS), which is used for residue analysis in food, has gained wider acceptance in the last few years. This development is due to the availability of more rugged, sensitive, and selective instrumentation. The benefits provided by HRMS over classical unit-mass-resolution tandem mass spectrometry are considerable. These benefits include the collection of full-scan spectra, which provides greater insight into the composition of a sample. Consequently, the analyst has the freedom to measure compounds without previous compound-specific tuning, the possibility of retrospective data analysis, and the capability of performing structural elucidations of unknown or suspected compounds. HRMS strongly competes with classical tandem mass spectrometry in the field of quantitative multiresidue methods (e.g., pesticides and veterinary drugs). It is one of the most promising tools when moving towards nontargeted approaches. Certain hardware and software issues still have to be addressed by the instrument manufacturers for it to dislodge tandem mass spectrometry from its position as the standard trace analysis tool.

  11. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  12. ampliMethProfiler: a pipeline for the analysis of CpG methylation profiles of targeted deep bisulfite sequenced amplicons.

    PubMed

    Scala, Giovanni; Affinito, Ornella; Palumbo, Domenico; Florio, Ermanno; Monticelli, Antonella; Miele, Gennaro; Chiariotti, Lorenzo; Cocozza, Sergio

    2016-11-25

    CpG sites in an individual molecule may exist in a binary state (methylated or unmethylated), and each individual DNA molecule, containing a certain number of CpGs, is a combination of these states defining an epihaplotype. Classic quantification-based approaches to study DNA methylation are intrinsically unable to fully represent the complexity of the underlying methylation substrate. Epihaplotype-based approaches, on the other hand, allow methylation profiles of cell populations to be studied at the single-molecule level. For such investigations, next-generation sequencing techniques can be used, both for quantitative and for epihaplotype analysis. Currently available tools for methylation analysis lack output formats that explicitly report CpG methylation profiles at the single-molecule level and statistical tools suited to their interpretation. Here we present ampliMethProfiler, a python-based pipeline for the extraction and statistical epihaplotype analysis of amplicons from targeted deep bisulfite sequencing of multiple DNA regions. The ampliMethProfiler tool provides an easy and user-friendly way to extract and analyze the epihaplotype composition of reads from targeted bisulfite sequencing experiments. ampliMethProfiler is written in the python language and requires a local installation of BLAST and (optionally) QIIME tools. It can be run on Linux and OS X platforms. The software is open source and freely available at http://amplimethprofiler.sourceforge.net .
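
    The epihaplotype concept is easy to make concrete: each bisulfite read over a region collapses to a binary methylation string, and the population profile is the frequency of each pattern. A minimal sketch with invented reads (this is not ampliMethProfiler's input or output format):

    ```python
    from collections import Counter

    # Each read over a 4-CpG region as a binary string (1 = methylated CpG)
    reads = ["1111", "1111", "1101", "0000", "0000", "0000", "1011"]
    profile = Counter(reads)

    total = sum(profile.values())
    for pattern, n in profile.most_common():
        print(pattern, n / total)
    # Average methylation alone (here 0.5) would hide this bimodal structure.
    ```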

  13. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  14. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  15. dCLIP: a computational approach for comparative CLIP-seq analyses

    PubMed Central

    2014-01-01

    Although comparison of RNA-protein interaction profiles across different conditions has become increasingly important to understanding the function of RNA-binding proteins (RBPs), few computational approaches have been developed for quantitative comparison of CLIP-seq datasets. Here, we present an easy-to-use command line tool, dCLIP, for quantitative CLIP-seq comparative analysis. The two-stage method implemented in dCLIP, including a modified MA normalization method and a hidden Markov model, is shown to be able to effectively identify differential binding regions of RBPs in four CLIP-seq datasets, generated by HITS-CLIP, iCLIP and PAR-CLIP protocols. dCLIP is freely available at http://qbrc.swmed.edu/software/. PMID:24398258
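
    As background, a generic MA transformation for two-condition count data is sketched below; dCLIP's modified MA normalization and hidden Markov model differ in details not reproduced here:

    ```python
    import numpy as np

    def ma_normalize(x, y, pseudocount=1.0):
        """For read counts x, y of the same bins in two conditions, return
        (M, A): M is the log-ratio recentred to median zero, A is the mean
        log-intensity. Recentring M is a simple between-condition normalization."""
        lx, ly = np.log2(x + pseudocount), np.log2(y + pseudocount)
        M, A = lx - ly, 0.5 * (lx + ly)
        return M - np.median(M), A

    x = np.array([10, 50, 3, 200, 0])   # hypothetical bin counts, condition 1
    y = np.array([12, 20, 5, 180, 1])   # hypothetical bin counts, condition 2
    M, A = ma_normalize(x, y)
    print(M, A)
    ```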

  16. Standardisation of DNA quantitation by image analysis: quality control of instrumentation.

    PubMed

    Puech, M; Giroud, F

    1999-05-01

    DNA image analysis is frequently performed in clinical practice as a prognostic tool and to improve diagnosis. The precision of prognosis and diagnosis depends on the accuracy of analysis and particularly on the quality of image analysis systems. It has been reported that image analysis systems used for DNA quantification differ widely in their characteristics (Thunissen et al.: Cytometry 27: 21-25, 1997). This induces inter-laboratory variation when the same sample is analysed in different laboratories. In microscopic image analysis, the principal instrumentation errors arise from the optical and electronic parts of systems, bringing about problems of instability, non-linearity, and shading and glare phenomena. The aim of this study is to establish tools and standardised quality control procedures for microscopic image analysis systems. Specific reference standard slides have been developed to control instability, non-linearity, shading and glare phenomena, and segmentation efficiency. Several systems have been checked with these tools and quality control procedures. Interpretation criteria and accuracy limits for these quality control procedures are proposed according to the conclusions of a European project called the PRESS project (Prototype Reference Standard Slide). Beyond these limits, the tested image analysis systems are not qualified to perform precise DNA analysis. The different procedures presented in this work determine whether an image analysis system is qualified to deliver sufficiently precise DNA measurements for cancer case analysis. If a controlled system falls beyond the defined limits, recommendations are given to help remedy the problem.

  17. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. White Matter Fiber-based Analysis of T1w/T2w Ratio Map.

    PubMed

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to nearly fully mature myelination within the first few years of life. High-resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects with specific tracts rather than large, broad regions. We developed an intuitive, open-source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber-based analysis of T1w/T2w ratio maps and tested it in a study of brain development.
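
    The starting point of such an analysis is the voxel-wise ratio of the two calibrated images. A minimal sketch under stated assumptions (co-registered, calibrated volumes); synthetic arrays stand in for images that would normally be loaded from NIfTI files, for example via nibabel:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    t1w = rng.uniform(200, 800, size=(64, 64, 40))   # stand-in T1w volume
    t2w = rng.uniform(100, 600, size=(64, 64, 40))   # stand-in T2w volume
    mask = rng.random((64, 64, 40)) > 0.3            # stand-in brain mask

    # Voxel-wise T1w/T2w ratio, computed only inside the mask and where
    # the denominator is positive, to avoid division outside the head
    ratio = np.zeros_like(t1w)
    np.divide(t1w, t2w, out=ratio, where=mask & (t2w > 0))
    print(ratio[mask].mean())
    ```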

  19. White matter fiber-based analysis of T1w/T2w ratio map

    NASA Astrophysics Data System (ADS)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D.; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    Purpose: To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. Background: The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. Methods: We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. Results: We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  20. Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.

    PubMed

    Neilson, Julia W; Jordan, Fiona L; Maier, Raina M

    2013-03-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. A PCR primer bank for quantitative gene expression analysis.

    PubMed

    Wang, Xiaowei; Seed, Brian

    2003-12-15

    Although gene expression profiling by microarray analysis is a useful tool for assessing global levels of transcriptional activity, variability associated with the data sets usually requires that observed differences be validated by some other method, such as real-time quantitative polymerase chain reaction (real-time PCR). However, non-specific amplification of non-target genes is frequently observed in the latter, confounding the analysis in approximately 40% of real-time PCR attempts when primer-specific labels are not used. Here we present an experimentally validated algorithm for the identification of transcript-specific PCR primers on a genomic scale that can be applied to real-time PCR with sequence-independent detection methods. An online database, PrimerBank, has been created for researchers to retrieve primer information for their genes of interest. PrimerBank currently contains 147 404 primers encompassing most known human and mouse genes. The primer design algorithm has been tested by conventional and real-time PCR for a subset of 112 primer pairs with a success rate of 98.2%.

  2. Approaching human language with complex networks.

    PubMed

    Cong, Jin; Liu, Haitao

    2014-12-01

    The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
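
    As a concrete illustration of line (1), a word co-occurrence network can be built from adjacent words and characterized with standard network measures. The construction below is a common one in linguistic network studies, not one taken from any specific surveyed paper:

    ```python
    import networkx as nx

    # Toy word co-adjacency network: consecutive words in a sentence are linked;
    # quantitative measures (degree, clustering, ...) then characterize the sample
    sentence = "the quick brown fox jumps over the lazy dog".split()
    G = nx.Graph()
    G.add_edges_from(zip(sentence, sentence[1:]))

    print(G.number_of_nodes(), nx.average_clustering(G), dict(G.degree()))
    ```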

  3. A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.

    PubMed

    El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret

    2018-04-16

    Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included the GAI studies of good to high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval, 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval, 0.41-1.00). No tool was used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
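
    Although the review does not reproduce the GAI formula, a GAI-3-style index is commonly computed as the proportion of indicated guideline drug classes actually prescribed, with contraindicated classes removed from the denominator. A minimal sketch under that assumption:

    ```python
    # Three drug classes typically underlying GAI-3 (assumed here)
    GUIDELINE_CLASSES = ["ACEI_or_ARB", "beta_blocker", "MRA"]

    def guideline_adherence_index(prescribed, contraindicated):
        """Fraction of eligible guideline classes actually prescribed."""
        eligible = [c for c in GUIDELINE_CLASSES if c not in contraindicated]
        if not eligible:
            return None  # no class indicated; GAI undefined
        return sum(c in prescribed for c in eligible) / len(eligible)

    print(guideline_adherence_index({"ACEI_or_ARB", "beta_blocker"}, set()))  # ~0.67
    print(guideline_adherence_index({"beta_blocker"}, {"MRA"}))               # 0.5
    ```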

  4. Analyzing Human-Landscape Interactions: Tools That Integrate

    NASA Astrophysics Data System (ADS)

    Zvoleff, Alex; An, Li

    2014-01-01

    Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.

  5. Detection of soybean in soy-based meat substitutes.

    PubMed

    Abd Allah, M A; Foda, Y H; el-Dashlouty, S; el-Sanafiry, N Y; Abu Salem, F M

    1986-01-01

    The statistical analysis of the available data indicated that the straight-line equations of protein, fat, fibre, calcium, methionine, and lysine could successfully be used for forecasting the added soy percent in a given recipe. On the other hand, the areas of the identified bands in the electropherograms of the investigated samples were considered a reasonable tool for the quantitative determination of whole soybean in soy-based meat substitutes.
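
    The forecasting idea amounts to a straight-line calibration inverted for the unknown: fit a composition variable against known soy additions, then solve for the soy percent of a new sample. A sketch with invented protein data:

    ```python
    import numpy as np

    # Hypothetical calibration: protein content (%) of recipes with known added soy
    soy_pct = np.array([0, 10, 20, 30, 40])
    protein = np.array([16.0, 18.1, 20.3, 22.2, 24.4])

    slope, intercept = np.polyfit(soy_pct, protein, 1)  # straight-line fit
    forecast = (21.0 - intercept) / slope               # unknown sample at 21.0 % protein
    print(round(forecast, 1))
    ```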

  6. In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions

    NASA Astrophysics Data System (ADS)

    Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh

    2005-09-01

    This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.

  7. Selection of reference standard during method development using the analytical hierarchy process.

    PubMed

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development, based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia Miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility of procurement, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority score, as the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed comprehensive consideration of the benefits and risks of the alternatives and proved to be an effective and practical tool for the optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
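
    The AHP priority calculation referred to above is standard: the weight vector is the normalized principal eigenvector of the pairwise comparison matrix, and Saaty's consistency ratio checks whether the judgments are acceptably coherent. The matrix below is illustrative, not the one used in the paper:

    ```python
    import numpy as np

    # Pairwise comparison matrix for three alternatives (Saaty's 1-9 scale)
    A = np.array([[1.0,  3.0, 5.0],
                  [1/3., 1.0, 3.0],
                  [1/5., 1/3., 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = eigvals.real.argmax()
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # priority weights

    n = A.shape[0]
    ci = (eigvals.real.max() - n) / (n - 1)   # consistency index
    cr = ci / 0.58                            # Saaty's random index for n = 3
    print(w, cr)                              # CR < 0.1 means acceptably consistent
    ```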

  8. Analyzing the Evolution of Membrane Fouling via a Novel Method Based on 3D Optical Coherence Tomography Imaging.

    PubMed

    Li, Weiyi; Liu, Xin; Wang, Yi-Ning; Chong, Tzyy Haur; Tang, Chuyang Y; Fane, Anthony G

    2016-07-05

The development of novel tools for studying fouling behavior during membrane processes is critical. This work explored optical coherence tomography (OCT) to quantitatively interpret the formation of a cake layer during a membrane process; the quantitative analysis was based on a novel image processing method able to precisely resolve the 3D structure of the cake layer on a micrometer scale. Fouling experiments were carried out with foulants having different physicochemical characteristics (silica nanoparticles and bentonite particles). The cake layers formed at a series of time points were digitized using the OCT-based characterization. The specific deposit (cake volume/membrane surface area) and surface coverage were evaluated as functions of time, which for the first time provided direct experimental evidence for the transition between fouling mechanisms. Axial stripes were observed in the grayscale plots showing the deposit distribution over the scanned area; this observation was in agreement with the instability analysis that correlated the polarized particle groups with small disturbances in the boundary layer. This work confirms that OCT-based characterization can provide deep insights into membrane fouling processes and offers a powerful tool for developing membrane processes with enhanced performance.
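
    As a rough illustration of the two reported metrics, the sketch below computes specific deposit (cake volume per membrane area, which reduces to an equivalent thickness) and surface coverage from a binarized 3D OCT stack; the array shape, voxel sizes and thresholding are assumptions, not the paper's image processing method.

    ```python
    # Sketch: specific deposit (cake volume / membrane area) and surface
    # coverage from a binarized 3D OCT stack. Array shape and voxel sizes
    # are hypothetical; `cake` is True where a voxel is classified as foulant.
    import numpy as np

    def cake_metrics(cake, dz_um=3.0, dx_um=10.0, dy_um=10.0):
        """cake: boolean array (nz, ny, nx), axis 0 normal to the membrane."""
        pixel_area = dx_um * dy_um                  # um^2 per A-scan footprint
        voxel_vol = dz_um * pixel_area              # um^3 per voxel
        membrane_area = cake.shape[1] * cake.shape[2] * pixel_area
        specific_deposit = cake.sum() * voxel_vol / membrane_area  # um^3/um^2 = um
        covered = cake.any(axis=0)                  # columns with any deposit
        surface_coverage = covered.mean()           # fraction of area covered
        return specific_deposit, surface_coverage

    stack = np.random.rand(200, 64, 64) > 0.95      # stand-in for a real scan
    print(cake_metrics(stack))
    ```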

  9. Validation of an instrument to assess toddler feeding practices of Latino mothers.

    PubMed

    Chaidez, Virginia; Kaiser, Lucia L

    2011-08-01

This paper describes qualitative and quantitative testing of a 34-item Toddler-Feeding Questionnaire (TFQ), designed for use in Latino families, and the associations between feeding practices and toddler dietary outcomes. Qualitative methods included review by an expert panel for content validity and cognitive testing of the tool to assess face validity. Quantitative analyses included exploratory factor analysis for construct validity; Pearson's correlations for test-retest reliability; Cronbach's alpha (α) for internal reliability; and multivariate regression for investigating relationships between feeding practices and toddler diet and anthropometry. Interviews were conducted with a convenience sample of 94 Latino mother-toddler dyads recruited largely through the Special Supplemental Nutrition Program for Women, Infants and Children (WIC). Data collection included household characteristics, self-reported early-infant feeding practices, the toddler's dietary intake, and anthropometric measurements. Factor analysis suggests the TFQ contains three subscales: indulgent, authoritative, and environmental influences. The TFQ demonstrated acceptable reliability for most measures. As hypothesized, indulgent feeding practices were associated with toddlers' increased energy consumption and higher intakes of total fat, saturated fat, and sweetened beverages. This tool may be useful in future research exploring the relationship of toddler feeding practices to nutritional outcomes in Latino families. Copyright © 2011 Elsevier Ltd. All rights reserved.
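
    Cronbach's alpha, used here for internal reliability, has a closed form: alpha = k/(k-1) * (1 - sum(item variances)/variance of totals). A minimal sketch with synthetic item responses (the data are invented, not the study's):

    ```python
    # Sketch: Cronbach's alpha for internal reliability of a questionnaire
    # subscale. `items` is respondents x items; the data here are synthetic.
    import numpy as np

    def cronbach_alpha(items):
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(94, 1))                 # 94 dyads, as in the study
    items = latent + 0.8 * rng.normal(size=(94, 5))   # 5 correlated items
    print(round(cronbach_alpha(items), 3))
    ```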

  10. The need for a usable assessment tool to analyse the efficacy of emergency care systems in developing countries: proposal to use the TEWS methodology.

    PubMed

    Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A

    2012-11-01

Ninety percent of emergency incidents occur in developing countries, and this share is only expected to grow as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. This paper analyses current quantitative methods for assessing emergency care in developing countries and proposes a more appropriate method. Currently accepted methods for quantitatively assessing the efficacy of emergency care systems cannot be applied in most developing countries because of weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is growing rapidly, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool suited to low-resource developing countries. By relying on the most basic universal parameters, the simplest calculations and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential to the establishment and growth of new emergency care systems worldwide.
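
    The abstract does not reproduce the TEWS scoring table, so the sketch below only illustrates the general idea of summing banded scores over basic universal parameters; every band and point value here is invented for demonstration and is NOT the published TEWS table.

    ```python
    # Illustration only: a banded vital-signs score in the spirit of the
    # TEWS approach (sum of simple bands over universal parameters). The
    # bands and points below are invented, not the published TEWS table.
    def band(value, bands):
        """bands: list of (low, high, points); first matching band wins."""
        for low, high, pts in bands:
            if low <= value <= high:
                return pts
        return 3  # outside all listed bands: worst demo score

    def demo_score(resp_rate, heart_rate, sys_bp, avpu):
        score = band(resp_rate, [(9, 14, 0), (15, 20, 1), (21, 29, 2)])
        score += band(heart_rate, [(51, 100, 0), (101, 110, 1), (111, 129, 2)])
        score += band(sys_bp, [(101, 199, 0), (81, 100, 2)])
        score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
        return score  # higher total = higher acuity

    print(demo_score(resp_rate=24, heart_rate=118, sys_bp=92, avpu="voice"))
    ```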

  11. The Effectiveness of Traditional and 21st Century Teaching Tools on Students' Science Learning

    NASA Astrophysics Data System (ADS)

    Bellflower, Julie V.

Any student seeking a high school diploma from the public school system in one U.S. state must pass the state's high school graduation test. In 2009, only 88% of students at one high school in the state met the basic proficiency requirements on the science portion of the test. Because improved science education has been identified as an explicit national goal, the purpose of this mixed-methods study was to determine whether traditional teaching tools (notes, lecture, and textbook) or 21st century teaching tools (online tutorials, video games, YouTube, and virtual labs) lead to greater gains in students' science learning. Bruner's constructivist and Bandura's social cognitive theories served as the foundations for the study. Quantitative research questions were used to investigate the relationship between the type of teaching tools used and student learning gains. Quantitative data from students' pre- and posttests were collected and analyzed using a dependent-samples t-test. Qualitative data were collected through a focus group interview and participant journals; analysis of the qualitative data included coding the data and writing a descriptive narrative to convey the findings. Results showed no statistically significant difference in students' science achievement: both types of teaching tools led to student learning gains. As a result, an action plan was developed to assist science educators in implementing traditional and 21st century teaching tools to improve students' science learning. Implications for positive social change include providing science educators with a specific plan of action to enhance students' science learning, thereby increasing science scores on the state test and other high-stakes tests.
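
    The quantitative comparison rests on a dependent-samples (paired) t-test of pre- and posttest scores. A minimal sketch with synthetic scores; scipy's ttest_rel implements the paired test:

    ```python
    # Sketch: dependent (paired) samples t-test on pre/post science scores,
    # as used in the study. The scores here are synthetic stand-ins.
    import numpy as np
    from scipy import stats

    pre = np.array([62, 70, 55, 68, 74, 59, 81, 66])
    post = np.array([71, 75, 60, 70, 80, 65, 85, 72])
    t, p = stats.ttest_rel(post, pre)
    print(f"t = {t:.2f}, p = {p:.4f}")  # p < 0.05 -> significant learning gain
    ```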

  12. Cloning of fox (Vulpes vulpes) Il2, Il6, Il10 and IFNgamma and analysis of their expression by quantitative RT-PCR in fox PBMC after in vitro stimulation by Concanavalin A.

    PubMed

    Rolland-Turner, Magali; Farré, Guillaume; Boué, Franck

    2006-04-15

The immune response in the fox (Vulpes vulpes), despite the success of the oral rabies vaccine, is not well characterised, and specific immunological tools are needed. A quantitative RT-PCR assay using SYBR Green to investigate fox cytokine expression after in vitro antigen re-stimulation of PBMC is presented here. First, we cloned fox IL2, IL6, IL10 and IFNgamma, together with a partial 18S sequence, by homology with dog cytokine sequences. Fox-specific primers were then designed and used to set up a species-specific quantitative RT-PCR assay using SYBR Green, with the 18S housekeeping gene as internal standard. The technique was validated using total RNA from fox PBMC stimulated with a polyclonal activator, Concanavalin A.
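
    With 18S as the internal standard, relative cytokine expression is conventionally computed by the 2^-ddCt method; the sketch below shows that standard calculation, though the paper may use a different quantification model, and the Ct values are invented.

    ```python
    # Sketch: relative cytokine expression by the standard 2^-ddCt method,
    # normalizing target Ct to the 18S housekeeping gene. The Ct values
    # below are invented for illustration.
    def relative_expression(ct_target, ct_18s, ct_target_ctrl, ct_18s_ctrl):
        d_ct_sample = ct_target - ct_18s              # normalize stimulated sample
        d_ct_control = ct_target_ctrl - ct_18s_ctrl   # normalize unstimulated control
        dd_ct = d_ct_sample - d_ct_control
        return 2.0 ** (-dd_ct)                        # fold change vs control

    # Hypothetical IFNgamma Cts after Concanavalin A stimulation
    print(relative_expression(ct_target=24.1, ct_18s=12.0,
                              ct_target_ctrl=27.9, ct_18s_ctrl=12.2))
    ```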

  13. Localization-based super-resolution imaging meets high-content screening.

    PubMed

    Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste

    2017-12-01

    Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.
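
    One of the quantitative outputs mentioned is protein clustering. DBSCAN is a common choice for clustering single-molecule localization coordinates, though not necessarily the algorithm used in this workflow; a minimal sketch with synthetic coordinates:

    ```python
    # Sketch: quantifying protein clustering from single-molecule
    # localization coordinates with DBSCAN. DBSCAN is common in SMLM
    # cluster analysis but is not necessarily the paper's algorithm.
    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(1)
    background = rng.uniform(0, 2000, size=(300, 2))             # nm, sparse noise
    cluster = rng.normal(loc=[1000, 1000], scale=30, size=(150, 2))
    locs = np.vstack([background, cluster])

    labels = DBSCAN(eps=50, min_samples=10).fit_predict(locs)    # eps in nm
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    print(f"{n_clusters} cluster(s); {np.sum(labels == -1)} noise localizations")
    ```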

  14. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    NASA Astrophysics Data System (ADS)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten people's health, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used in this paper for the quantitative determination of AO in combination with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples at different concentrations was obtained by THz time-domain spectroscopy (THz-TDS) in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for selecting the input interval, and improved the accuracy of the detection result. The experimental results indicated that the combination of THz spectroscopy and the 2DCOS-PLSR model is an effective quantitative analysis method.
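
    A minimal sketch of the 2DCOS-guided PLSR idea: the diagonal of the synchronous 2D correlation spectrum of mean-centered absorbance highlights concentration-sensitive bands, which are then used as the PLSR input interval. The spectra, band position and selection threshold below are synthetic assumptions, not the paper's data or exact procedure.

    ```python
    # Sketch: 2DCOS-guided PLSR on synthetic THz absorbance spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    freqs = np.linspace(0.2, 1.6, 200)                  # THz axis
    conc = np.linspace(0.0, 1.0, 12)                    # AO concentrations (a.u.)
    peak = np.exp(-((freqs - 0.9) ** 2) / 0.005)        # fake AO absorption band
    X = conc[:, None] * peak + 0.01 * np.random.randn(12, 200)

    Xc = X - X.mean(axis=0)                             # mean-centered dynamic spectra
    sync = Xc.T @ Xc / (X.shape[0] - 1)                 # synchronous 2DCOS map
    strong = np.diag(sync) > 0.1 * np.diag(sync).max()  # keep correlated bands

    pls = PLSRegression(n_components=2).fit(X[:, strong], conc)
    print(f"R^2 = {pls.score(X[:, strong], conc):.3f}")
    ```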

  15. Analysis of Nanodomain Composition in High-Impact Polypropylene by Atomic Force Microscopy-Infrared.

    PubMed

    Tang, Fuguang; Bao, Peite; Su, Zhaohui

    2016-05-03

    In this paper, compositions of nanodomains in a commercial high-impact polypropylene (HIPP) were investigated by an atomic force microscopy-infrared (AFM-IR) technique. An AFM-IR quantitative analysis method was established for the first time, which was then employed to analyze the polyethylene content in the nanoscopic domains of the rubber particles dispersed in the polypropylene matrix. It was found that the polyethylene content in the matrix was close to zero and was high in the rubbery intermediate layers, both as expected. However, the major component of the rigid cores of the rubber particles was found to be polypropylene rather than polyethylene, contrary to what was previously believed. The finding provides new insight into the complicated structure of HIPPs, and the AFM-IR quantitative method reported here offers a useful tool for assessing compositions of nanoscopic domains in complex polymeric systems.
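
    The abstract does not give the calibration details; the sketch below shows a generic band-ratio calibration of the kind used to map a local IR response to composition. The band choices (a PE band near 720 cm^-1 against a PP band near 1376 cm^-1) and all data points are assumptions, not the paper's calibration.

    ```python
    # Sketch of a generic band-ratio calibration for local composition,
    # in the spirit of quantitative AFM-IR. Bands and data are invented.
    import numpy as np

    # Reference blends of known PE weight fraction -> measured band ratio
    pe_fraction = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    band_ratio = np.array([0.02, 0.19, 0.41, 0.58, 0.82, 0.99])  # I(PE)/I(PP), synthetic

    slope, intercept = np.polyfit(band_ratio, pe_fraction, 1)

    def pe_content(ratio):
        """Estimate local PE fraction from an AFM-IR band ratio."""
        return slope * ratio + intercept

    print(round(pe_content(0.50), 2))  # e.g. a rubber-particle core spectrum
    ```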

  16. BiQ Analyzer HT: locus-specific analysis of DNA methylation by high-throughput bisulfite sequencing

    PubMed Central

    Lutsik, Pavlo; Feuerbach, Lars; Arand, Julia; Lengauer, Thomas; Walter, Jörn; Bock, Christoph

    2011-01-01

    Bisulfite sequencing is a widely used method for measuring DNA methylation in eukaryotic genomes. The assay provides single-base pair resolution and, given sufficient sequencing depth, its quantitative accuracy is excellent. High-throughput sequencing of bisulfite-converted DNA can be applied either genome wide or targeted to a defined set of genomic loci (e.g. using locus-specific PCR primers or DNA capture probes). Here, we describe BiQ Analyzer HT (http://biq-analyzer-ht.bioinf.mpi-inf.mpg.de/), a user-friendly software tool that supports locus-specific analysis and visualization of high-throughput bisulfite sequencing data. The software facilitates the shift from time-consuming clonal bisulfite sequencing to the more quantitative and cost-efficient use of high-throughput sequencing for studying locus-specific DNA methylation patterns. In addition, it is useful for locus-specific visualization of genome-wide bisulfite sequencing data. PMID:21565797
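
    The core locus-specific statistic is the per-CpG methylation rate across aligned reads. A minimal sketch, using a simplified read encoding ('1' methylated, '0' converted, '-' no coverage) rather than BiQ Analyzer HT's actual input format:

    ```python
    # Sketch: per-CpG methylation rate across aligned bisulfite reads at a
    # locus, from a simplified per-CpG call encoding.
    def methylation_rates(reads):
        n_pos = len(reads[0])
        rates = []
        for j in range(n_pos):
            calls = [r[j] for r in reads if r[j] != '-']
            rates.append(sum(c == '1' for c in calls) / len(calls) if calls else None)
        return rates

    reads = ["110-1",
             "100-1",
             "1101-",
             "0-011"]
    print(methylation_rates(reads))  # fraction methylated at each CpG position
    ```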

  17. From the street to the laboratory: analytical profiles of methoxetamine, 3-methoxyeticyclidine and 3-methoxyphencyclidine and their determination in three biological matrices.

    PubMed

    De Paoli, Giorgia; Brandt, Simon D; Wallach, Jason; Archer, Roland P; Pounder, Derrick J

    2013-06-01

Three psychoactive arylcyclohexylamines, advertised as "research chemicals," were obtained from an online retailer and characterized by gas chromatography ion-trap mass spectrometry (electron and chemical ionization), nuclear magnetic resonance spectroscopy and diode array detection. The three compounds were identified as 2-(ethylamino)-2-(3-methoxyphenyl)cyclohexanone (methoxetamine), N-ethyl-1-(3-methoxyphenyl)cyclohexanamine and 1-[1-(3-methoxyphenyl)cyclohexyl]piperidine. A qualitative/quantitative method was developed and validated using high-performance liquid chromatography (HPLC) with electrospray tandem mass spectrometry and ultraviolet (UV) detection for the determination of these compounds in blood, urine and vitreous humor. HPLC-UV proved to be a robust, accurate and precise method for the qualitative and quantitative analysis of these substances in biological fluids (0.16-5.0 mg/L), whereas the mass spectrometer was useful as a confirmatory tool.
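
    Method validation over a range such as 0.16-5.0 mg/L typically rests on a linear calibration, from which ICH-style LOD and LOQ estimates follow as 3.3*sigma/S and 10*sigma/S. A sketch with invented peak areas, not the paper's validation data:

    ```python
    # Sketch: linear HPLC-UV calibration over the validated range with
    # ICH-style LOD/LOQ estimates from the residual standard deviation.
    import numpy as np

    conc = np.array([0.16, 0.5, 1.0, 2.0, 3.5, 5.0])         # mg/L
    area = np.array([820, 2600, 5150, 10400, 18100, 25900])  # detector response

    slope, intercept = np.polyfit(conc, area, 1)
    resid = area - (slope * conc + intercept)
    s_y = resid.std(ddof=2)                # residual SD (n-2 degrees of freedom)

    lod = 3.3 * s_y / slope                # ICH Q2(R1)-style estimates
    loq = 10.0 * s_y / slope
    print(f"LOD ~ {lod:.3f} mg/L, LOQ ~ {loq:.3f} mg/L")
    ```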

  18. A subagging regression method for estimating the qualitative and quantitative state of groundwater

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Park, E.; Choi, J.; Han, W. S.; Yun, S. T.

    2016-12-01

A subagging regression (SBR) method for analysing groundwater data, estimating trends and the associated uncertainty, is proposed. The SBR method is validated on synthetic data against other conventional robust and non-robust methods. The results verify that the estimation accuracy of the SBR method is consistent and superior to that of the other methods, and that the uncertainties are reasonably estimated, whereas the other methods offer no uncertainty analysis. For further validation, real quantitative and qualitative data are analysed and compared with Gaussian process regression (GPR). In all cases, the trend and the associated uncertainties are reasonably estimated by SBR, whereas GPR has limitations in representing the variability of non-Gaussian, skewed data. From these implementations, the SBR method shows potential for further development as an effective tool for anomaly detection or outlier identification in groundwater state data.
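
    Subagging replaces bagging's bootstrap with subsamples drawn without replacement: each base regressor is fit on a random subsample, the averaged prediction gives the trend, and the spread across predictions gives an uncertainty estimate. A minimal sketch; the polynomial base learner and all parameters are assumptions, not the paper's configuration.

    ```python
    # Sketch of subagging (subsample aggregating) trend estimation with
    # an uncertainty band from the spread of base-learner predictions.
    import numpy as np

    def subagging_trend(t, y, t_new, n_estimators=200, frac=0.5, degree=2, seed=0):
        rng = np.random.default_rng(seed)
        m = max(degree + 1, int(frac * len(t)))
        preds = []
        for _ in range(n_estimators):
            idx = rng.choice(len(t), size=m, replace=False)  # subsample, no replacement
            coeffs = np.polyfit(t[idx], y[idx], degree)
            preds.append(np.polyval(coeffs, t_new))
        preds = np.array(preds)
        return preds.mean(axis=0), preds.std(axis=0)         # trend, uncertainty

    t = np.linspace(0, 10, 60)
    y = 0.3 * t + np.sin(t) + 0.5 * np.random.default_rng(1).standard_normal(60)
    trend, sigma = subagging_trend(t, y, t)
    print(trend[:3].round(2), sigma[:3].round(2))
    ```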

  19. As we fall asleep we forget about the future: A quantitative linguistic analysis of mentation reports from hypnagogia.

    PubMed

    Speth, Jana; Schloerscheidt, Astrid M; Speth, Clemens

    2016-10-01

We present a quantitative study of mental time travel to the past and future during sleep-onset hypnagogia. Three independent, blind judges analysed a total of 150 mentation reports from different intervals before and after sleep onset. The linguistic tool for mentation report analysis is grounded in established grammatical and cognitive-semantic theories, and proof of concept has been provided in previous studies. The current results indicate that memory for the future, but not for the past, decreases at sleep onset, supporting preliminary physiological evidence at the level of brain function. While recent memory research emphasizes similarities in the cognitive and physiological processes of mental time travel to the past and future, the current study explores a state of consciousness that may serve to dissociate between the two. Copyright © 2016 Elsevier Inc. All rights reserved.
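
    The paper's coding was performed by trained human judges using grammatical criteria, which no short script can reproduce; the toy count below merely illustrates the kind of past/future quantification involved, with an invented marker list.

    ```python
    # Toy illustration only: naive counting of past- vs future-oriented
    # markers in a mentation report. Not the paper's linguistic tool.
    import re

    FUTURE = r"\b(will|going to|gonna|tomorrow|later)\b"
    PAST = r"\b(was|were|had|did|yesterday|ago)\b"

    def time_orientation(report):
        text = report.lower()
        return {"future": len(re.findall(FUTURE, text)),
                "past": len(re.findall(PAST, text))}

    print(time_orientation("I was at the station and thought I will miss the train tomorrow."))
    ```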

  20. HPTLC in Herbal Drug Quantification

    NASA Astrophysics Data System (ADS)

    Shinde, Devanand B.; Chavan, Machindra J.; Wakte, Pravin S.

For the past few decades, compounds from natural sources have been gaining importance because of the vast chemical diversity they offer. This has led to a phenomenal increase in the demand for herbal medicines over the last two decades, and a need has been felt to ensure the quality, safety, and efficacy of herbal drugs. Phytochemical evaluation is one of the tools for quality assessment; it includes preliminary phytochemical screening, chemoprofiling, and marker compound analysis using modern analytical techniques. High-performance thin-layer chromatography (HPTLC) has emerged as an important tool for the qualitative, semiquantitative, and quantitative phytochemical analysis of herbal drugs and formulations, including the development of TLC fingerprinting profiles and the estimation of biomarkers. This review focuses on the theoretical considerations of HPTLC and some examples of herbal drugs and formulations analyzed by HPTLC.
