Sample records for "enables quantitative comparison"

  1. Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing

    USDA-ARS's Scientific Manuscript database

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...

  2. Real medical benefit assessed by indirect comparison.

    PubMed

    Falissard, Bruno; Zylberman, Myriam; Cucherat, Michel; Izard, Valérie; Meyer, François

    2009-01-01

    Frequently, in data packages submitted for Marketing Approval to the CHMP, there is a lack of relevant head-to-head comparisons of medicinal products that could enable national authorities responsible for the approval of reimbursement to assess the Added Therapeutic Value (ASMR) of new clinical entities or line extensions of existing therapies. Indirect or mixed treatment comparisons (MTC) are methods stemming from the field of meta-analysis that have been designed to tackle this problem. Adjusted indirect comparisons, meta-regressions, mixed models, and Bayesian network analyses pool results of randomised controlled trials (RCTs), enabling a quantitative synthesis. The REAL procedure, recently developed by the HAS (French National Authority for Health), is a mixture of an MTC and an effect model based on expert opinions. It is intended to translate the efficacy observed in the trials into effectiveness expected in day-to-day clinical practice in France.

  3. Enhancing Transfer Effectiveness: A Model for the 1990s.

    ERIC Educational Resources Information Center

    Berman, Paul; And Others

    In an effort to identify effective transfer practices appropriate to different community college circumstances, and to establish a quantitative database that would enable valid comparisons of transfer between their 28 member institutions, the National Effective Transfer Consortium (NETC) sponsored a survey of more than 30,000 students attending…

  4. Garlic (Allium sativum L.) fertility: transcriptome and proteome analyses provide insight into flower and pollen development

    PubMed Central

    Shemesh-Mayer, Einat; Ben-Michael, Tomer; Rotem, Neta; Rabinowitch, Haim D.; Doron-Faigenboim, Adi; Kosmala, Arkadiusz; Perlikowski, Dawid; Sherman, Amir; Kamenetsky, Rina

    2015-01-01

    Commercial cultivars of garlic, a popular condiment, are sterile, making genetic studies and breeding of this plant challenging. However, recent fertility restoration has enabled advanced physiological and genetic research and hybridization in this important crop. Morphophysiological studies, combined with transcriptome and proteome analyses and quantitative PCR validation, enabled the identification of genes and specific processes involved in gametogenesis in fertile and male-sterile garlic genotypes. Both genotypes exhibit normal meiosis at early stages of anther development, but in the male-sterile plants, tapetal hypertrophy after microspore release leads to pollen degeneration. Transcriptome analysis and global gene-expression profiling showed that >16,000 genes are differentially expressed in the fertile vs. male-sterile developing flowers. Proteome analysis and quantitative comparison of 2D-gel protein maps revealed 36 significantly different protein spots, 9 of which were present only in the male-sterile genotype. Bioinformatic and quantitative PCR validation of 10 candidate genes exhibited significant expression differences between male-sterile and fertile flowers. A comparison of morphophysiological and molecular traits of fertile and male-sterile garlic flowers suggests that respiratory restrictions and/or non-regulated programmed cell death of the tapetum can lead to energy deficiency and consequent pollen abortion. Potential molecular markers for male fertility and sterility in garlic are proposed. PMID:25972879

  5. Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team 1998

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploited feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method, and finally (3) summarize a few of the results.

  6. Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploited feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method, and finally (3) summarize a few of the results.

  7. Evaluating ICT Integration in Turkish K-12 Schools through Teachers' Views

    ERIC Educational Resources Information Center

    Aydin, Mehmet Kemal; Gürol, Mehmet; Vanderlinde, Ruben

    2016-01-01

    The current study aims to explore ICT integration in Turkish K-12 schools purposively selected as a representation of F@tih and non-F@tih public schools together with a private school. A convergent mixed methods design was employed with a multiple case strategy to enable casewise comparisons. The quantitative data was…

  8. Multistrip western blotting to increase quantitative data output.

    PubMed

    Kiyatkin, Anatoly; Aksamitiene, Edita

    2009-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.

  9. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  10. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  11. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  12. Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.

    PubMed

    Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly

    2015-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of up to nine different total and/or posttranslationally modified proteins from the same sample loading; and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data and therefore is advantageous to apply in biomedical diagnostics, systems biology, and cell signaling research.

  13. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
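
The Bland-Altman analysis this record describes reduces to a short computation: the mean of the paired differences estimates the constant (systematic) error between the two assays, and the 95% limits of agreement bound the disagreement expected for future samples. A minimal sketch, assuming invented paired values (the function name and numbers below are illustrative, not from the study):

```python
import numpy as np

def bland_altman(reference, test):
    """Bland-Altman agreement statistics for paired measurements from a
    validated reference method and the assay under validation."""
    reference = np.asarray(reference, float)
    test = np.asarray(test, float)
    diffs = test - reference          # per-sample difference
    bias = diffs.mean()               # constant (systematic) error
    sd = diffs.std(ddof=1)
    # 95% limits of agreement: the interval expected to contain most
    # future differences between the two methods
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    return bias, loa

# Invented paired values: the new assay reads ~2 units high throughout,
# which shows up as a constant bias rather than a proportional one.
bias, loa = bland_altman([10.0, 20.0, 30.0, 40.0], [12.1, 21.9, 32.0, 42.0])
```

Plotting `diffs` against the per-sample means, with `bias` and `loa` as horizontal lines, gives the familiar Bland-Altman plot; a difference that grows with the mean would instead indicate proportional error.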

  14. Least squares QR-based decomposition provides an efficient way of computing optimal regularization parameter in photoacoustic tomography.

    PubMed

    Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K

    2013-08-01

    A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov-minimization scheme is developed for photoacoustic imaging. This approach is based on the least squares-QR decomposition which is a well-known dimensionality reduction technique for a large system of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstructions of initial pressure distribution enabled via finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of numerical blood vessel phantom, where the initial pressure is exactly known for quantitative comparison.
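
The Tikhonov scheme underlying this record can be illustrated with the damped normal equations; the toy matrix, data, and noise level below are invented for illustration, and the paper's contribution is solving the same damped problem via LSQR so that the optimal parameter can be found cheaply in a reduced system rather than by repeated full solves:

```python
import numpy as np

# Toy stand-ins for the photoacoustic system matrix A and measured
# pressure data b; both are assumptions for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 30))
x_true = rng.standard_normal(30)
b = A @ x_true + 0.01 * rng.standard_normal(50)

def tikhonov(A, b, lam):
    """Minimize ||Ax - b||^2 + lam^2 ||x||^2 via the damped normal
    equations. An LSQR-based scheme solves the same damped problem
    iteratively, avoiding the explicit (A^T A + lam^2 I) factorization
    for large systems."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Larger lam shrinks the solution norm: the bias/variance trade-off
# that the optimal regularization parameter balances.
norms = [np.linalg.norm(tikhonov(A, b, lam)) for lam in (0.0, 0.1, 1.0)]
```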

  15. A multiplexed system for quantitative comparisons of chromatin landscapes

    PubMed Central

    van Galen, Peter; Viny, Aaron D.; Ram, Oren; Ryan, Russell J.H.; Cotton, Matthew J.; Donohue, Laura; Sievers, Cem; Drier, Yotam; Liau, Brian B.; Gillespie, Shawn M.; Carroll, Kaitlin M.; Cross, Michael B.; Levine, Ross L.; Bernstein, Bradley E.

    2015-01-01

    Genome-wide profiling of histone modifications can provide systematic insight into the regulatory elements and programs engaged in a given cell type. However, conventional chromatin immunoprecipitation and sequencing (ChIP-seq) does not capture quantitative information on histone modification levels, requires large amounts of starting material, and involves tedious processing of each individual sample. Here we address these limitations with a technology that leverages DNA barcoding to profile chromatin quantitatively and in multiplexed format. We concurrently map relative levels of multiple histone modifications across multiple samples, each comprising as few as a thousand cells. We demonstrate the technology by monitoring dynamic changes following inhibition of P300, EZH2 or KDM5, by linking altered epigenetic landscapes to chromatin regulator mutations, and by mapping active and repressive marks in purified human hematopoietic stem cells. Hence, this technology enables quantitative studies of chromatin state dynamics across rare cell types, genotypes, environmental conditions and drug treatments. PMID:26687680

  16. Morphology enabled dipole inversion (MEDI) from a single-angle acquisition: comparison with COSMOS in human brain imaging.

    PubMed

    Liu, Tian; Liu, Jing; de Rochefort, Ludovic; Spincemaille, Pascal; Khalidov, Ildar; Ledoux, James Robert; Wang, Yi

    2011-09-01

    Magnetic susceptibility varies among brain structures and provides insights into the chemical and molecular composition of brain tissues. However, the determination of an arbitrary susceptibility distribution from the measured MR signal phase is a challenging, ill-conditioned inverse problem. Although a previous method named calculation of susceptibility through multiple orientation sampling (COSMOS) has solved this inverse problem both theoretically and experimentally using multiple angle acquisitions, it is often impractical to carry out on human subjects. Recently, the feasibility of calculating the brain susceptibility distribution from a single-angle acquisition was demonstrated using morphology enabled dipole inversion (MEDI). In this study, we further improved the original MEDI method by sparsifying the edges in the quantitative susceptibility map that do not have a corresponding edge in the magnitude image. Quantitative susceptibility maps generated by the improved MEDI were compared qualitatively and quantitatively with those generated by calculation of susceptibility through multiple orientation sampling. The results show a high degree of agreement between MEDI and calculation of susceptibility through multiple orientation sampling, and the practicality of MEDI allows many potential clinical applications. Copyright © 2011 Wiley-Liss, Inc.

  17. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
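
Deming regression, which this record uses for assay comparison, differs from ordinary least squares in that it attributes measurement error to both methods rather than treating the reference values as exact. A minimal sketch with invented values; setting `delta=1.0` assumes the two assays are equally noisy:

```python
import numpy as np

def deming(x, y, delta=1.0):
    """Deming regression: fits y = a + b*x while allowing measurement
    error in BOTH assays (OLS assumes x is error-free). `delta` is the
    assumed ratio of the error variances of y and x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    # Closed-form errors-in-variables slope estimate
    slope = ((syy - delta * sxx
              + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2))
             / (2 * sxy))
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Invented values with a proportional error (slope 2) and a constant
# error (intercept 1) relative to the reference assay.
slope, intercept = deming([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
```

A slope away from 1 flags proportional error and a nonzero intercept flags constant error, which is exactly the decomposition that R2 alone cannot provide.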

  18. Development of an ultra-sensitive Simoa assay to enable GDF11 detection: a comparison across bioanalytical platforms.

    PubMed

    Myzithras, Maria; Li, Hua; Bigwarfe, Tammy; Waltz, Erica; Gupta, Priyanka; Low, Sarah; Hayes, David B; MacDonnell, Scott; Ahlberg, Jennifer; Franti, Michael; Roberts, Simon

    2016-03-01

    Four bioanalytical platforms were evaluated to optimize sensitivity and enable detection of recombinant human GDF11 in biological matrices: ELISA, Meso Scale Discovery, Gyrolab xP Workstation and Simoa HD-1. Results & methodology: After completion of custom assay development, the single-molecule ELISA (Simoa) achieved the greatest sensitivity with a lower limit of quantitation of 0.1 ng/ml, an improvement of 100-fold over the next most sensitive platform (MSD). This improvement was essential to enable detection of GDF11 in biological samples, and without the technology the sensitivity achieved on the other platforms would not have been sufficient. Other factors such as ease of use, cost, assay time and automation capability can also be considered when developing custom immunoassays, based on the requirements of the bioanalyst.

  19. Boolean logic analysis for flow regime recognition of gas-liquid horizontal flow

    NASA Astrophysics Data System (ADS)

    Ramskill, Nicholas P.; Wang, Mi

    2011-10-01

    In order to develop a flowmeter for the accurate measurement of multiphase flows, it is of the utmost importance to correctly identify the flow regime present to enable the selection of the optimal method for metering. In this study, the horizontal flow of air and water in a pipeline was studied under a multitude of conditions using electrical resistance tomography, but the flow regimes presented in this paper have been limited to plug and bubble air-water flows. This study proposes a novel method for recognition of the prevalent flow regime using only a fraction of the data, thus rendering the analysis more efficient. By considering the average conductivity of five zones along the central axis of the tomogram, key features can be identified, thus enabling the recognition of the prevalent flow regime. Boolean logic and frequency spectrum analysis have been applied for flow regime recognition. Visualization of the flow using the reconstructed images provides a qualitative comparison between different flow regimes. Application of the Boolean logic scheme enables a quantitative comparison of the flow patterns, thus reducing the subjectivity in the identification of the prevalent flow regime.
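
The zone-based Boolean idea can be sketched as a toy classifier. The five-zone layout follows the abstract, but the 0.5 threshold, the zone ordering, and the two rules below are invented assumptions, not the logic tables or values from the study:

```python
# Toy illustration of zone-based Boolean flow-regime recognition; the
# threshold and rules are hypothetical, not taken from the paper.
def classify_flow(zone_conductivity, threshold=0.5):
    """zone_conductivity: average normalised conductivity of five zones
    across the tomogram, listed top to bottom. Water conducts well and
    air does not, so a low value flags gas in that zone."""
    gas = [c < threshold for c in zone_conductivity]   # Boolean per zone
    if all(gas[:2]) and not any(gas[2:]):
        return "plug"     # continuous gas pocket along the top of the pipe
    if any(gas) and not all(gas[:2]):
        return "bubble"   # dispersed gas without a continuous top pocket
    return "unclassified"

print(classify_flow([0.2, 0.3, 0.9, 0.9, 0.9]))  # prints "plug"
```

The point of the Boolean reduction is that each tomogram frame collapses to five bits, so classification needs only a fraction of the reconstructed data.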

  20. Comparability analysis of protein therapeutics by bottom-up LC-MS with stable isotope-tagged reference standards

    PubMed Central

    Manuilov, Anton V; Radziejewski, Czeslaw H

    2011-01-01

    Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with 13C6-arginine and 13C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. 
The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution. PMID:21654206
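
The SITRS comparison logic reduces to a ratio-of-ratios: each peptide's unlabeled/labeled intensity ratio is formed per sample, then those ratios are compared across samples. A minimal sketch; the peptide names and intensities below are invented for illustration:

```python
# Sketch of the SITRS ratio-of-ratios comparison described above.
def sitrs_ratios(analyte, standard):
    """analyte/standard: dicts mapping peptide -> ion intensity for the
    unlabeled sample and the co-digested 13C-labeled SITRS."""
    return {pep: analyte[pep] / standard[pep] for pep in analyte}

# Hypothetical intensities for two batches measured against one SITRS.
batch_a = sitrs_ratios({"PepA": 900.0, "PepB": 480.0},
                       {"PepA": 1000.0, "PepB": 500.0})
batch_b = sitrs_ratios({"PepA": 890.0, "PepB": 240.0},
                       {"PepA": 1000.0, "PepB": 500.0})

# A ratio-of-ratios near 1.0 means the peptide is comparable across
# batches; a large deviation flags a localized difference, e.g. a
# modified or mutated site (PepB here drops to half).
comparison = {pep: batch_b[pep] / batch_a[pep] for pep in batch_a}
```

Because every peptide is referenced to the same internal standard within each run, run-to-run ionization variability largely cancels, which is what makes small (~2%) differences detectable.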

  1. Comparability analysis of protein therapeutics by bottom-up LC-MS with stable isotope-tagged reference standards.

    PubMed

    Manuilov, Anton V; Radziejewski, Czeslaw H; Lee, David H

    2011-01-01

    Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with (13)C6-arginine and (13)C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible, and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. 
The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution.

  2. Airborne radar and radiometer experiment for quantitative remote measurements of rain

    NASA Technical Reports Server (NTRS)

    Kozu, Toshiaki; Meneghini, Robert; Boncyk, Wayne; Wilheit, Thomas T.; Nakamura, Kenji

    1989-01-01

    An aircraft experiment has been conducted with a dual-frequency (10 GHz and 35 GHz) radar/radiometer system and an 18-GHz radiometer to test various rain-rate retrieval algorithms from space. In the experiment, which took place in the fall of 1988 at the NASA Wallops Flight Facility, VA, both stratiform and convective storms were observed. A ground-based radar and rain gauges were also used to obtain truth data. An external radar calibration is made with rain gauge data, thereby enabling quantitative reflectivity measurements. Comparisons between path attenuations derived from the surface return and from the radar reflectivity profile are made to test the feasibility of a technique to estimate the raindrop size distribution from simultaneous radar and path-attenuation measurements.

  3. Nontargeted quantitation of lipid classes using hydrophilic interaction liquid chromatography-electrospray ionization mass spectrometry with single internal standard and response factor approach.

    PubMed

    Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat

    2012-11-20

    The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS) using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and molecules inside these classes in contrast to the conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained by the peak integration in HILIC chromatograms multiplied by their RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility and robustness of the method have been checked by various means: (1) the comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) (31)P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma). 
Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
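    The single-internal-standard scheme described in this record reduces to a simple proportionality: a class concentration is its peak area relative to the internal-standard area, scaled by the internal-standard concentration and the class RF. A minimal sketch; all class names, peak areas, RFs, and concentrations below are hypothetical illustrations, not values from the study:

    ```python
    def quantify_classes(peak_areas, rfs, is_area, is_conc):
        """Class concentration = (class area / IS area) * IS concentration * class RF."""
        return {cls: (area / is_area) * is_conc * rfs[cls]
                for cls, area in peak_areas.items()}

    areas = {"PC": 1.8e6, "PE": 9.0e5, "SM": 4.5e5}   # integrated HILIC peak areas (hypothetical)
    rfs = {"PC": 1.10, "PE": 0.85, "SM": 1.30}        # class response factors vs. the IS (hypothetical)

    # Internal standard: area 3.0e5, spiked at 10 nmol/mL (hypothetical)
    conc = quantify_classes(areas, rfs, is_area=3.0e5, is_conc=10.0)

    print(round(conc["PC"], 1))  # (1.8e6 / 3.0e5) * 10 * 1.10 = 66.0
    ```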

  4. Fluorescence-labeled methylation-sensitive amplified fragment length polymorphism (FL-MS-AFLP) analysis for quantitative determination of DNA methylation and demethylation status.

    PubMed

    Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko

    2008-04-01

    The PCR-based DNA fingerprinting method called methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and a fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of differences in the methylation level of blood DNA from gastric cancer patients, and of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.

  5. Epidemiologic Evidence of a Relationship between Tea, Coffee, or Caffeine Consumption and Cognitive Decline

    PubMed Central

    Arab, Lenore; Khan, Faraz; Lam, Helen

    2013-01-01

    A systematic literature review of human studies relating caffeine or caffeine-rich beverages to cognitive decline reveals only 6 studies that have collected and analyzed cognition data in a prospective fashion that enables study of decline across the spectrum of cognition. These 6 studies, in general, evaluate cognitive function using the Mini Mental State Exam and base their beverage data on FFQs. Studies included in our review differed in their source populations, duration of study, and most dramatically in how their analyses were done, disallowing direct quantitative comparisons of their effect estimates. Only one of the studies reported on all 3 exposures, coffee, tea, and caffeine, making comparisons of findings across studies more difficult. However, in general, it can be stated that for all studies of tea and most studies of coffee and caffeine, the estimates of cognitive decline were lower among consumers, although there is a lack of a distinct dose response. Only a few measures showed quantitative significance and, interestingly, studies indicate a stronger effect among women than men. PMID:23319129

  6. Epidemiologic evidence of a relationship between tea, coffee, or caffeine consumption and cognitive decline.

    PubMed

    Arab, Lenore; Khan, Faraz; Lam, Helen

    2013-01-01

    A systematic literature review of human studies relating caffeine or caffeine-rich beverages to cognitive decline reveals only 6 studies that have collected and analyzed cognition data in a prospective fashion that enables study of decline across the spectrum of cognition. These 6 studies, in general, evaluate cognitive function using the Mini Mental State Exam and base their beverage data on FFQs. Studies included in our review differed in their source populations, duration of study, and most dramatically in how their analyses were done, disallowing direct quantitative comparisons of their effect estimates. Only one of the studies reported on all 3 exposures, coffee, tea, and caffeine, making comparisons of findings across studies more difficult. However, in general, it can be stated that for all studies of tea and most studies of coffee and caffeine, the estimates of cognitive decline were lower among consumers, although there is a lack of a distinct dose response. Only a few measures showed quantitative significance and, interestingly, studies indicate a stronger effect among women than men.

  7. A Comparison of the Pitfall Trap, Winkler Extractor and Berlese Funnel for Sampling Ground-Dwelling Arthropods in Tropical Montane Cloud Forests

    PubMed Central

    Sabu, Thomas K.; Shiju, Raj T.; Vinod, KV.; Nithya, S.

    2011-01-01

    Little is known about ground-dwelling arthropod diversity in tropical montane cloud forests (TMCF). The unique habitat conditions in TMCFs, with continuously wet substrates and a waterlogged forest floor, together with the innate biases of the pitfall trap, Berlese funnel, and Winkler extractor, make it difficult to choose the most appropriate method for sampling ground-dwelling arthropods in TMCFs. Among the three methods, the Winkler extractor was the most efficient for quantitative data and pitfall trapping for qualitative data for most groups. Inclusion of the flotation method as a complement to the Winkler extractor would enable a comprehensive quantitative survey of ground-dwelling arthropods. Pitfall trapping is essential for both quantitative and qualitative sampling of Diplopoda, Opiliones, Orthoptera, and Diptera. The Winkler extractor was the best quantitative method for Psocoptera, Araneae, Isopoda, and Formicidae; the Berlese funnel was best for Collembola and Chilopoda. For larval forms of different insect orders and the Acari, all three methods were equally effective. PMID:21529148

  8. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g., serine proteases, metallo proteases) and find that the entropy values reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure of overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes, which show high total cleavage entropy, from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of the relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologous proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
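    The score described in this record is, per subpocket, the Shannon entropy of the residue distribution across cleaved substrates, summed over subpockets (low total entropy = specific protease). A toy sketch; the four three-residue "substrate" strings are invented for illustration, not MEROPS data:

    ```python
    import math
    from collections import Counter

    def subpocket_entropy(residues):
        """Shannon entropy (bits) of the residue distribution at one subpocket."""
        counts = Counter(residues)
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # Toy cleavage-site alignment: one column per subpocket (e.g. P2, P1, P1').
    substrates = ["AKS", "GKS", "AKT", "VKA"]
    columns = list(zip(*substrates))

    entropies = [subpocket_entropy(col) for col in columns]
    total_entropy = sum(entropies)  # overall specificity score

    print(entropies[1])  # the P1 column is all 'K', so its entropy is 0.0 (fully specific)
    ```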

  9. A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies.

    PubMed

    Villena-Martínez, Víctor; Fuster-Guilló, Andrés; Azorín-López, Jorge; Saval-Calvo, Marcelo; Mora-Pascual, Jeronimo; Garcia-Rodriguez, Jose; Garcia-Garcia, Alberto

    2017-01-27

    RGB-D (Red Green Blue and Depth) sensors are devices that can provide color and depth information from a scene at the same time. Recently, they have been widely used in many solutions due to their commercial growth from the entertainment market to many diverse areas (e.g., robotics, CAD, etc.). In the research community, these devices have had good uptake due to their acceptable level of accuracy for many applications and their low cost, but in some cases, they work at the limit of their sensitivity, near to the minimum feature size that can be perceived. For this reason, calibration processes are critical in order to increase their accuracy and enable them to meet the requirements of such kinds of applications. To the best of our knowledge, there is no comparative study of calibration algorithms evaluating their results on multiple RGB-D sensors. Specifically, in this paper, a comparison of the three most used calibration methods has been applied to three different RGB-D sensors based on structured light and time-of-flight. The comparison has been carried out by a set of experiments to evaluate the accuracy of depth measurements. Additionally, an object reconstruction application has been used as an example of an application for which the sensor works at the limit of its sensitivity. The obtained reconstruction results have been evaluated through visual inspection and quantitative measurements.

  10. An analysis toolbox to explore mesenchymal migration heterogeneity reveals adaptive switching between distinct modes

    PubMed Central

    Shafqat-Abbasi, Hamdah; Kowalewski, Jacob M; Kiss, Alexa; Gong, Xiaowei; Hernandez-Varas, Pablo; Berge, Ulrich; Jafari-Mamaghani, Mehrdad; Lock, John G; Strömblad, Staffan

    2016-01-01

    Mesenchymal (lamellipodial) migration is heterogeneous, although whether this reflects progressive variability or discrete, 'switchable' migration modalities, remains unclear. We present an analytical toolbox, based on quantitative single-cell imaging data, to interrogate this heterogeneity. Integrating supervised behavioral classification with multivariate analyses of cell motion, membrane dynamics, cell-matrix adhesion status and F-actin organization, this toolbox here enables the detection and characterization of two quantitatively distinct mesenchymal migration modes, termed 'Continuous' and 'Discontinuous'. Quantitative mode comparisons reveal differences in cell motion, spatiotemporal coordination of membrane protrusion/retraction, and how cells within each mode reorganize with changed cell speed. These modes thus represent distinctive migratory strategies. Additional analyses illuminate the macromolecular- and cellular-scale effects of molecular targeting (fibronectin, talin, ROCK), including 'adaptive switching' between Continuous (favored at high adhesion/full contraction) and Discontinuous (low adhesion/inhibited contraction) modes. Overall, this analytical toolbox now facilitates the exploration of both spontaneous and adaptive heterogeneity in mesenchymal migration. DOI: http://dx.doi.org/10.7554/eLife.11384.001 PMID:26821527

  11. All you need is shape: Predicting shear banding in sand with LS-DEM

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2018-02-01

    This paper presents discrete element method (DEM) simulations with experimental comparisons at multiple length scales, underscoring the crucial role of particle shape. The simulations build on technological advances in the DEM furnished by level sets (LS-DEM), which enable the mathematical representation of the surface of arbitrarily shaped particles such as grains of sand. We show that this ability to model shape enables unprecedented capture of the mechanics of granular materials across scales ranging from macroscopic behavior to local behavior to particle behavior. Specifically, the model is able to predict the onset and evolution of shear banding in sands, replicating the most advanced high-fidelity experiments in triaxial compression equipped with sequential X-ray tomography imaging. We present comparisons of the model and experiment at an unprecedented level of quantitative agreement, building a one-to-one model where every particle in the more than 53,000-particle array has its own avatar or numerical twin. Furthermore, the boundary conditions of the experiment are faithfully captured by modeling the membrane effect as well as the platen displacement and tilting. The results show a computational tool that can give insight into the physics and mechanics of granular materials undergoing shear deformation and failure, with computational times comparable to those of the experiment. One quantitative measure extracted from the LS-DEM simulations that is currently not available experimentally is the evolution of three-dimensional force chains inside and outside of the shear band. We show that the rotations of the force chains are correlated with the rotations in stress principal directions.
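    The level-set idea behind LS-DEM can be illustrated in two dimensions: a particle surface is the zero contour of a signed distance function, and contact is detected by evaluating that function at a neighboring particle's boundary nodes. A minimal sketch under simplifying assumptions (a circular "particle" and hand-picked nodes, purely illustrative, not the paper's implementation):

    ```python
    import math

    def signed_distance(x, y, cx=0.0, cy=0.0, r=1.0):
        """Signed distance to a circular particle: negative inside, zero on the surface."""
        return math.hypot(x - cx, y - cy) - r

    def penetration_depth(node):
        """A node of a neighboring particle penetrates where the signed distance is negative."""
        d = signed_distance(*node)
        return -d if d < 0 else 0.0

    # Boundary nodes of a hypothetical neighboring particle:
    nodes = [(1.4, 0.0), (0.9, 0.0), (0.0, 1.05)]
    depths = [penetration_depth(n) for n in nodes]
    # Only the node at (0.9, 0.0) lies inside the unit circle, penetrating by ~0.1.
    ```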

  12. Method for measuring residual stresses in materials by plastically deforming the material and interference pattern comparison

    DOEpatents

    Pechersky, Martin J.

    1995-01-01

    A method for measuring residual stress in a material, comprising the steps of establishing a speckle pattern on the surface with a first laser and then heating a portion of that pattern with an infrared laser until the surface plastically deforms. Comparing the speckle patterns before and after deformation by subtracting one pattern from the other produces a fringe pattern that serves as a visual and quantitative indication of the degree to which the plasticized surface responded to the stress during heating and enables calculation of the stress.

  13. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermo Fisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose, and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing an HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.

  14. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells.

    PubMed

    Park, Han Sang; Rinehart, Matthew T; Walzer, Katelyn A; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that heavily relies on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at the trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors differ highly significantly between infected and uninfected cells, no single descriptor enables separation of the populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy, up to 99.7%, in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity, with early trophozoites most often mistaken for late trophozoite or schizont stage, and late trophozoite and schizont stages most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis.
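    The descriptor-combining classifiers named in this record can be illustrated with a from-scratch k-nearest-neighbor vote over synthetic two-descriptor data; the descriptor meanings, cluster parameters, and helper function are hypothetical stand-ins, not the study's 23 descriptors or data:

    ```python
    import random

    def knn_predict(train, labels, x, k=3):
        """Classify x by majority vote among its k nearest training points."""
        order = sorted(range(len(train)),
                       key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
        votes = [labels[i] for i in order[:k]]
        return max(set(votes), key=votes.count)

    # Synthetic 2-descriptor data standing in for phase-derived morphology
    # (e.g. mean optical phase, phase variance); labels: 0 = uninfected, 1 = infected.
    random.seed(0)
    uninfected = [(random.gauss(1.0, 0.1), random.gauss(0.2, 0.05)) for _ in range(50)]
    infected = [(random.gauss(1.6, 0.1), random.gauss(0.5, 0.05)) for _ in range(50)]
    X = uninfected + infected
    y = [0] * 50 + [1] * 50

    print(knn_predict(X, y, (1.55, 0.48)))  # well inside the infected cluster -> 1
    ```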

  15. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells

    PubMed Central

    Park, Han Sang; Rinehart, Matthew T.; Walzer, Katelyn A.; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that heavily relies on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at the trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors differ highly significantly between infected and uninfected cells, no single descriptor enables separation of the populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy, up to 99.7%, in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity, with early trophozoites most often mistaken for late trophozoite or schizont stage, and late trophozoite and schizont stages most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis. PMID:27636719

  16. A Computational Framework for Bioimaging Simulation.

    PubMed

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and at understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
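    One systematic effect any photon-counting comparison must model is shot noise: detected counts are Poisson-distributed around the expected intensity. A minimal sketch of sampling such counts (Knuth's algorithm; the expected intensity is an arbitrary illustrative value, not from the framework described above):

    ```python
    import math
    import random

    def poisson(lam, rng):
        """Sample a Poisson-distributed photon count (Knuth's algorithm)."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    rng = random.Random(1)
    expected_photons = 10.0  # hypothetical expected intensity at one pixel
    counts = [poisson(expected_photons, rng) for _ in range(5000)]

    mean = sum(counts) / len(counts)  # converges to the expected intensity (~10)
    ```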

  17. Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.

    PubMed

    Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y

    2015-06-01

    Quantitative susceptibility mapping allows overcoming several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping to discriminate between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping values and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in other brain regions did not differ significantly between patients and controls. For the discrimination of patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden Index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pair-wise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping showed higher diagnostic performance than R2* mapping for the discrimination between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
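    The sensitivity, specificity, and accuracy reported in this record follow directly from the classification counts, and the Youden index used for cutoff selection is sensitivity + specificity - 1. A quick arithmetic check against the record's QSM figures (19/21 patients and 18/21 controls correctly classified):

    ```python
    def diagnostic_metrics(tp, fn, tn, fp):
        """Sensitivity, specificity, accuracy, and Youden index J = sens + spec - 1."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        acc = (tp + tn) / (tp + fn + tn + fp)
        return sens, spec, acc, sens + spec - 1

    sens, spec, acc, youden = diagnostic_metrics(tp=19, fn=2, tn=18, fp=3)
    print(round(sens, 2), round(spec, 2), round(acc, 2))  # 0.9 0.86 0.88
    ```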

  18. An improved 96-well turbidity assay for T4 lysozyme activity.

    PubMed

    Toro, Tasha B; Nguyen, Thao P; Watt, Terry J

    2015-01-01

    T4 lysozyme (T4L) is an important model system for investigating the relationship between protein structure and function. Despite being extensively studied, a reliable, quantitative activity assay for T4L has not been developed. Here, we present an improved T4L turbidity assay as well as an affinity-based T4L expression and purification protocol. This assay is designed for 96-well format and utilizes conditions amenable for both T4L and other lysozymes. This protocol enables easy, efficient, and quantitative characterization of T4L variants and allows comparison between different lysozymes. Our method: • Is applicable for all lysozymes, with enhanced sensitivity for T4 lysozyme compared to other 96-well plate turbidity assays; • Utilizes standardized conditions for comparing T4 lysozyme variants and other lysozymes; and • Incorporates a simplified expression and purification protocol for T4 lysozyme.

  19. An improved 96-well turbidity assay for T4 lysozyme activity

    PubMed Central

    Toro, Tasha B.; Nguyen, Thao P.; Watt, Terry J.

    2015-01-01

    T4 lysozyme (T4L) is an important model system for investigating the relationship between protein structure and function. Despite being extensively studied, a reliable, quantitative activity assay for T4L has not been developed. Here, we present an improved T4L turbidity assay as well as an affinity-based T4L expression and purification protocol. This assay is designed for 96-well format and utilizes conditions amenable for both T4L and other lysozymes. This protocol enables easy, efficient, and quantitative characterization of T4L variants and allows comparison between different lysozymes. Our method: • Is applicable for all lysozymes, with enhanced sensitivity for T4 lysozyme compared to other 96-well plate turbidity assays; • Utilizes standardized conditions for comparing T4 lysozyme variants and other lysozymes; and • Incorporates a simplified expression and purification protocol for T4 lysozyme. PMID:26150996

  20. Multisite formative assessment for the Pathways study to prevent obesity in American Indian schoolchildren

    PubMed Central

    Gittelsohn, Joel; Evans, Marguerite; Story, Mary; Davis, Sally M; Metcalfe, Lauve; Helitzer, Deborah L; Clay, Theresa E

    2016-01-01

    We describe the formative assessment process, using an approach based on social learning theory, for the development of a school-based obesity-prevention intervention into which cultural perspectives are integrated. The feasibility phase of the Pathways study was conducted in multiple settings in 6 American Indian nations. The Pathways formative assessment collected both qualitative and quantitative data. The qualitative data identified key social and environmental issues and enabled local people to express their own needs and views. The quantitative, structured data permitted comparison across sites. Both types of data were integrated by using a conceptual and procedural model. The formative assessment results were used to identify and rank the behavioral risk factors that were to become the focus of the Pathways intervention and to provide guidance on developing common intervention strategies that would be culturally appropriate and acceptable to all sites. PMID:10195601

  1. Improved Selection of Internal Transcribed Spacer-Specific Primers Enables Quantitative, Ultra-High-Throughput Profiling of Fungal Communities

    PubMed Central

    Bokulich, Nicholas A.

    2013-01-01

    Ultra-high-throughput sequencing (HTS) of fungal communities has been restricted by short read lengths and primer amplification bias, slowing the adoption of newer sequencing technologies for fungal community profiling. To address these issues, we evaluated the performance of several common internal transcribed spacer (ITS) primers and designed a novel primer set and work flow for simultaneous quantification and species-level interrogation of fungal consortia. Primer performance was compared and validated in silico and by sequencing a “mock community” of mixed yeast species to explore the challenges of amplicon length and amplification bias for reconstructing defined yeast community structures. The amplicon size and distribution of this primer set are smaller than for all preexisting ITS primer sets, maximizing sequencing coverage of hypervariable ITS domains by very-short-amplicon, high-throughput sequencing platforms. This feature also enables the optional integration of quantitative PCR (qPCR) directly into the HTS preparatory work flow by substituting qPCR with these primers for standard PCR, yielding quantification of individual community members. The complete work flow described here, utilizing any of the qualified primer sets evaluated, can rapidly profile mixed fungal communities, and it capably reconstructed well-characterized beer and wine fermentation fungal communities. PMID:23377949

  2. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis

    PubMed Central

    Te, Shu Harn; Chen, Enid Yingru

    2015-01-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques—qPCR and droplet digital PCR (ddPCR). After optimization, both qPCR and ddPCR assays have high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time and that it was more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis for bloom samples. PMID:26025892
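    For context on the "high linearity and quantitative correlations for standards" mentioned in this record: qPCR quantification of standards rests on a linear fit of Cq against log10 copy number, with amplification efficiency derived from the slope. A minimal sketch with an idealized, hypothetical dilution series (not data from the study):

    ```python
    import math

    def fit_standard_curve(copies, cq):
        """Least-squares fit of Cq vs log10(copy number); returns (slope, intercept)."""
        x = [math.log10(c) for c in copies]
        n = len(x)
        mx, my = sum(x) / n, sum(cq) / n
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, cq))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    # Hypothetical 10-fold dilution series with ideal Cq spacing (~3.32 cycles/decade):
    copies = [1e6, 1e5, 1e4, 1e3]
    cq = [15.0, 18.32, 21.64, 24.96]

    slope, intercept = fit_standard_curve(copies, cq)
    efficiency = 10 ** (-1 / slope) - 1  # ~1.0 means 100% amplification efficiency
    print(round(slope, 2))  # -3.32
    ```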

  3. Development of a quantitative pachytene chromosome map and its unification with somatic chromosome and linkage maps of rice (Oryza sativa L.).

    PubMed

    Ohmido, Nobuko; Iwata, Aiko; Kato, Seiji; Wako, Toshiyuki; Fukui, Kiichi

    2018-01-01

    A quantitative pachytene chromosome map of rice (Oryza sativa L.) was developed using imaging methods. The map depicts not only the distribution patterns of chromomeres specific to pachytene chromosomes, but also higher-order information on chromosomal structures, such as heterochromatin (condensed regions), euchromatin (decondensed regions), the primary constrictions (centromeres), and the secondary constriction (nucleolar organizing region, NOR). These features were image-analyzed and quantitatively mapped by the Chromosome Image Analyzing System ver. 4.0 (CHIAS IV). The correlation between H3K9me2, an epigenetic marker, and the formation and/or maintenance of heterochromatin was thus clearly visualized. The pachytene chromosome map was then unified with the existing somatic chromosome and linkage maps by physically mapping DNA markers common among them, such as a rice A-genome-specific tandem repeat sequence (TrsA), the 5S and 45S ribosomal RNA genes, five bacterial artificial chromosome (BAC) clones, and four P1 bacteriophage artificial chromosome (PAC) clones, using multicolor fluorescence in situ hybridization (FISH). Detailed comparison between the locations of the DNA probes on the pachytene chromosomes, determined by multicolor FISH, and the linkage map enabled determination of the chromosome number and short/long arms of individual pachytene chromosomes using the chromosome number and arm assignment designated for the linkage map. As a result, the quantitative pachytene chromosome map was unified with the two other major rice chromosome maps, representing somatic prometaphase chromosomes and genetic linkages. In conclusion, the unification of the three rice maps serves as indispensable basic information, not only for in-depth comparison between genetic and chromosomal data, but also for practical breeding programs.

  4. Multidimensional analysis of data obtained in experiments with X-ray emulsion chambers and extensive air showers

    NASA Technical Reports Server (NTRS)

    Chilingaryan, A. A.; Galfayan, S. K.; Zazyan, M. Z.; Dunaevsky, A. M.

    1985-01-01

    Nonparametric statistical methods are used to carry out a quantitative comparison of the model and the experimental data. The same methods make it possible to select the events initiated by heavy nuclei and to calculate the proportion of such events. For this purpose, the simulated ("artificial") events must describe the experiment sufficiently well. At present, the model with small scaling violation in the fragmentation region is the closest to the experiments. The gamma families obtained in the Pamir experiment are therefore currently being processed with these models.
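The nonparametric model-to-data comparison described here can be illustrated with a two-sample Kolmogorov-Smirnov statistic, one standard nonparametric distance between a simulated and an observed distribution. This is a generic sketch with made-up values; the abstract does not specify the exact statistics used.

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the two empirical CDFs (0 = identical, 1 = disjoint)."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # fraction of observations <= x
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in set(a) | set(b))

# e.g. compare simulated vs. measured shower observables (hypothetical values)
d_same = ks_statistic([1, 2, 3, 4], [1, 2, 3, 4])     # identical -> 0.0
d_far = ks_statistic([1, 2, 3, 4], [10, 11, 12, 13])  # disjoint -> 1.0
```

A small statistic indicates the simulated events describe the data well; a statistic near 1 rules the model out.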

  5. Mapping the function of neuronal ion channels in model and experiment

    PubMed Central

    Podlaski, William F; Seeholzer, Alexander; Groschner, Lukas N; Miesenböck, Gero; Ranjan, Rajnish; Vogels, Tim P

    2017-01-01

    Ion channel models are the building blocks of computational neuron models. Their biological fidelity is therefore crucial for the interpretation of simulations. However, the number of published models, and the lack of standardization, make the comparison of ion channel models with one another and with experimental data difficult. Here, we present a framework for the automated large-scale classification of ion channel models. Using annotated metadata and responses to a set of voltage-clamp protocols, we assigned 2378 models of voltage- and calcium-gated ion channels coded in NEURON to 211 clusters. The IonChannelGenealogy (ICGenealogy) web interface provides an interactive resource for the categorization of new and existing models and experimental recordings. It enables quantitative comparisons of simulated and/or measured ion channel kinetics, and facilitates field-wide standardization of experimentally-constrained modeling. DOI: http://dx.doi.org/10.7554/eLife.22152.001 PMID:28267430
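The classification step, grouping models by their responses to a standard set of voltage-clamp protocols, can be illustrated with a toy clustering of response traces. This is a minimal k-means sketch on synthetic fast/slow kinetics, not the actual ICGenealogy pipeline:

```python
import numpy as np

def kmeans(traces, k, iters=50):
    """Minimal k-means: assign each response trace (row) to the nearest
    cluster-mean trace, then update the means. Deterministic initialization
    from evenly spaced rows."""
    centers = traces[np.linspace(0, len(traces) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((traces[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = traces[labels == j].mean(axis=0)
    return labels

# synthetic "activation curves": three fast-kinetics and three slow-kinetics models
t = np.linspace(0, 1, 20)
fast = np.stack([1 - np.exp(-t / 0.05) + 0.01 * i for i in range(3)])
slow = np.stack([1 - np.exp(-t / 0.5) + 0.01 * i for i in range(3)])
labels = kmeans(np.vstack([fast, slow]), k=2)  # fast models in one cluster, slow in the other
```

The real framework clusters 2378 models into 211 groups using richer protocol responses and annotated metadata, but the principle is the same: similar kinetics end up in the same cluster.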

  6. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.
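The quantitative readout of such binding assays is typically an apparent dissociation constant per RNA. As a rough illustration (not the HiTS-RAP analysis code), a one-site binding isotherm f = [P]/(Kd + [P]) can be fit to fraction-bound data by least squares over a grid of candidate Kd values; all numbers here are hypothetical:

```python
import numpy as np

def fit_kd(conc, frac_bound, kd_grid=None):
    """Grid-search least-squares fit of a one-site binding isotherm
    f = c / (Kd + c); returns the best-fitting Kd."""
    if kd_grid is None:
        kd_grid = np.logspace(-3, 3, 2000)  # candidate Kd values, same units as conc
    sse = [((frac_bound - conc / (kd + conc)) ** 2).sum() for kd in kd_grid]
    return float(kd_grid[int(np.argmin(sse))])

# synthetic titration generated with a true Kd of 10 (arbitrary units)
conc = np.array([0.1, 1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
frac = conc / (10.0 + conc)
kd_hat = fit_kd(conc, frac)  # recovers a value close to 10
```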

  7. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
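The simplest technique in this family, rescaling model output onto data reported in arbitrary units, has a closed-form least-squares solution. A minimal sketch with made-up numbers (the scale factor s minimizing Σ(dᵢ − s·mᵢ)²):

```python
def optimal_scale(model, data):
    """Closed-form least-squares scale factor: s = <data, model> / <model, model>,
    mapping model output onto data measured in arbitrary units."""
    num = sum(d * m for d, m in zip(data, model))
    den = sum(m * m for m in model)
    return num / den

# model predicts concentrations; data are fluorescence intensities roughly 2x larger
model = [0.0, 1.0, 2.0, 3.0]
data = [0.1, 2.1, 3.9, 6.1]
s = optimal_scale(model, data)                       # ~2.0
residuals = [d - s * m for d, m in zip(data, model)]  # usable in a fitness score
```

After scaling, the residuals can feed a standard sum-of-squares objective even though the raw data were not in model units.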

  8. Jet measurements in heavy ion physics

    NASA Astrophysics Data System (ADS)

    Connors, Megan; Nattrass, Christine; Reed, Rosi; Salur, Sevil

    2018-04-01

    A hot, dense medium called a quark gluon plasma (QGP) is created in ultrarelativistic heavy ion collisions. Early in the collision, hard parton scatterings generate high momentum partons that traverse the medium, which then fragment into sprays of particles called jets. Understanding how these partons interact with the QGP and fragment into final state particles provides critical insight into quantum chromodynamics. Experimental measurements from high momentum hadrons, two particle correlations, and full jet reconstruction at the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC) continue to improve our understanding of energy loss in the QGP. Run 2 at the LHC recently began and there is a jet detector at RHIC under development. Now is the perfect time to reflect on what the experimental measurements have taught us so far, the limitations of the techniques used for studying jets, how the techniques can be improved, and how to move forward with the wealth of experimental data such that a complete description of energy loss in the QGP can be achieved. Measurements of jets to date clearly indicate that hard partons lose energy. Detailed comparisons of the nuclear modification factor between data and model calculations led to quantitative constraints on the opacity of the medium to hard probes. However, while there is substantial evidence for softening and broadening jets through medium interactions, the difficulties comparing measurements to theoretical calculations limit further quantitative constraints on energy loss mechanisms. Since jets are algorithmic descriptions of the initial parton, the same jet definitions must be used, including the treatment of the underlying heavy ion background, when making data and theory comparisons. 
An agreement is called for between theorists and experimentalists on the appropriate treatment of the background, Monte Carlo generators that enable experimental algorithms to be applied to theoretical calculations, and a clear understanding of which observables are most sensitive to the properties of the medium, even in the presence of background. This will enable us to determine the best strategy for the field to improve quantitative constraints on properties of the medium in the face of these challenges.

  9. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Subject-specific longitudinal shape analysis by coupling spatiotemporal shape modeling with medial analysis

    NASA Astrophysics Data System (ADS)

    Hong, Sungmin; Fishbaugh, James; Rezanejad, Morteza; Siddiqi, Kaleem; Johnson, Hans; Paulsen, Jane; Kim, Eun Young; Gerig, Guido

    2017-02-01

    Modeling subject-specific shape change is one of the most important challenges in longitudinal shape analysis of disease progression. Whereas anatomical change over time can be a function of normal aging, anatomy can also be impacted by disease related degeneration. Anatomical shape change may also be affected by structural changes from neighboring shapes, which may cause non-linear variations in pose. In this paper, we propose a framework to analyze disease related shape changes by coupling extrinsic modeling of the ambient anatomical space via spatiotemporal deformations with intrinsic shape properties from medial surface analysis. We compare intrinsic shape properties of a subject-specific shape trajectory to a normative 4D shape atlas representing normal aging to isolate shape changes related to disease. The spatiotemporal shape modeling establishes inter/intra subject anatomical correspondence, which in turn enables comparisons between subjects and the 4D shape atlas, and also quantitative analysis of disease related shape change. The medial surface analysis captures intrinsic shape properties related to local patterns of deformation. The proposed framework jointly models extrinsic longitudinal shape changes in the ambient anatomical space, as well as intrinsic shape properties to give localized measurements of degeneration. Six high risk subjects and six controls are randomly sampled from a Huntington's disease image database for qualitative and quantitative comparison.

  11. Reconstitution of the flavor signature of Dornfelder red wine on the basis of the natural concentrations of its key aroma and taste compounds.

    PubMed

    Frank, Stephanie; Wollmann, Nadine; Schieberle, Peter; Hofmann, Thomas

    2011-08-24

    By application of aroma extract dilution analysis (AEDA) on the volatile fraction isolated from a Dornfelder red wine, 31 odor-active compounds were identified by means of HRGC-MS and comparison with reference compounds. A total of 27 odorants, judged with high FD factors by means of AEDA, were quantitated by means of stable isotope dilution assays, and acetaldehyde was determined enzymatically. In addition, 36 taste-active compounds were analyzed by means of HPLC-UV, HPLC-MS/MS, and ion chromatography. The quantitative data obtained for the identified aroma and taste compounds enabled for the first time the reconstruction of the overall flavor of the red wine. Sensory evaluation of both the aroma and taste profiles of the authentic red wine and the recombinate revealed that the Dornfelder red wine was closely mimicked. Moreover, it was demonstrated that the high molecular weight fraction of red wine is essential for its astringent taste impression. By comparison of the overall odor of the aroma recombinate in ethanol with that of the total flavor recombinate containing all tastants, it was shown for the first time that the nonvolatile tastants had a strong influence on the intensity of certain aroma qualities.

  12. QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY

    PubMed Central

    Nichols, Nicole L.; Mitchell, Gordon S.

    2016-01-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605

  13. Comparison of Diagnostic Performance of Semi-Quantitative Knee Ultrasound and Knee Radiography with MRI: Oulu Knee Osteoarthritis Study.

    PubMed

    Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W; Arokoski, Jari P; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T; Tervonen, Osmo; Koski, Juhani M; Saarakkala, Simo

    2016-03-01

    Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level.
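Site-specific diagnostic performance against an MRI reference standard reduces, per feature, to sensitivity and specificity. A self-contained sketch with hypothetical binary gradings (1 = finding present):

```python
def diagnostic_performance(test, reference):
    """Sensitivity and specificity of a binary test (e.g. ultrasound
    osteophyte grading) against a binary reference standard (e.g. MRI)."""
    tp = sum(1 for t, r in zip(test, reference) if t and r)
    tn = sum(1 for t, r in zip(test, reference) if not t and not r)
    fp = sum(1 for t, r in zip(test, reference) if t and not r)
    fn = sum(1 for t, r in zip(test, reference) if not t and r)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical gradings for 8 knees
ultrasound = [1, 1, 1, 0, 0, 0, 1, 0]
mri =        [1, 1, 0, 0, 0, 1, 1, 0]
sens, spec = diagnostic_performance(ultrasound, mri)  # 0.75, 0.75
```

Computing these per site (medial vs. lateral, femur vs. tibia) is what allows the study's "performed better or at least equally well" comparison between ultrasound and radiography.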

  14. Quantitative comparison of initial soil erosion processes and runoff generation in Spanish and German vineyards.

    PubMed

    Rodrigo Comino, J; Iserloh, T; Lassu, T; Cerdà, A; Keestra, S D; Prosdocimi, M; Brings, C; Marzen, M; Ramos, M C; Senciales, J M; Ruiz Sinoga, J D; Seeger, M; Ries, J B

    2016-09-15

    The aim of this study was to enable a quantitative comparison of initial soil erosion processes in European vineyards using the same methodology and equipment. The study was conducted in four viticultural areas with different characteristics (Valencia and Málaga in Spain, Ruwer-Mosel valley and Saar-Mosel valley in Germany). Old and young vineyards, with conventional and ecological planting and management systems were compared. The same portable rainfall simulator with identical rainfall intensity (40 mm h⁻¹) and sampling intervals (30 min of test duration, collecting the samples at 5-min intervals) was used over a circular test plot of 0.28 m². The results of 83 simulations have been analysed and correlation coefficients were calculated for each study area to identify the relationship between environmental plot characteristics, soil texture, soil erosion, runoff and infiltration. The results allow for identification of the main factors related to soil properties, topography and management, which control soil erosion processes in vineyards. The most important factors influencing soil erosion and runoff were the vegetation cover for the ecological German vineyards (with 97.6±8% infiltration coefficients) and stone cover, soil moisture and slope steepness for the conventional land uses. Copyright © 2016 Elsevier B.V. All rights reserved.
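The correlation analysis described, relating plot characteristics to runoff and erosion, rests on the Pearson coefficient; a self-contained sketch with hypothetical plot data:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two variables, e.g.
    slope steepness vs. runoff coefficient across simulation plots."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# hypothetical plots: steeper slopes, higher runoff coefficients
slope_pct = [5, 10, 15, 20, 30]
runoff = [0.1, 0.25, 0.3, 0.45, 0.6]
r = pearson_r(slope_pct, runoff)  # strongly positive
```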

  15. Quantitative nephelometry

    MedlinePlus

    ... this page: //medlineplus.gov/ency/article/003545.htm Quantitative nephelometry test To use the sharing features on this page, please enable JavaScript. Quantitative nephelometry is a lab test to quickly and ...

  16. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.
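One common form of uncertainty-aware validation metric (a generic sketch, not the specific metrics developed in the paper) normalizes the simulation-experiment discrepancy by the combined uncertainty, so that "agreement" is judged relative to the error bars:

```python
import math

def normalized_discrepancy(sim, sim_err, exp, exp_err):
    """Discrepancy between simulation and experiment in units of the
    combined uncertainty; values near or below 1 indicate agreement
    within error bars."""
    return abs(sim - exp) / math.sqrt(sim_err ** 2 + exp_err ** 2)

# a predicted vs. measured local flux (hypothetical numbers in normalized units)
d = normalized_discrepancy(sim=1.2, sim_err=0.2, exp=1.0, exp_err=0.15)  # 0.8
```

Including the simulation's own uncertainty in the denominator is one way to reflect the paper's emphasis on uncertainty quantification within the metric itself.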

  17. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  18. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C.

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  19. Rapid comparison of properties on protein surface

    PubMed Central

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-01-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provides a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM β/α barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure. PMID:18618695
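The reason compact, rotation-invariant descriptors enable fast database searches is that, once a surface property pattern is reduced to a fixed-length vector, similarity is a plain vector distance. The three-component vectors below are toy stand-ins for real 3D Zernike descriptors (which are much longer):

```python
import math

def descriptor_distance(desc_a, desc_b):
    """Euclidean distance between two rotation-invariant surface
    descriptors; smaller means more similar property patterns. Cheap
    enough for all-vs-all database comparison."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(desc_a, desc_b)))

# hypothetical descriptors of, say, electrostatic potential patterns
protein_x = [0.9, 0.1, 0.4]
protein_y = [0.8, 0.2, 0.4]  # similar surface to x
protein_z = [0.0, 0.9, 0.1]  # dissimilar surface
```

Ranking database entries by this distance is what clusters proteins into the functionally relevant groups described above.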

  20. Rapid comparison of properties on protein surface.

    PubMed

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-10-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provides a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM beta/alpha barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure.

  1. In vitro and in vivo comparison of wrist MR imaging at 3.0 and 7.0 tesla using a gradient echo sequence and identical eight-channel coil array designs.

    PubMed

    Nordmeyer-Massner, Jurek A; Wyss, Michael; Andreisek, Gustav; Pruessmann, Klaas P; Hodler, Juerg

    2011-03-01

    To evaluate in vivo MR imaging of the wrist at 3.0 Tesla (T) and 7.0T quantitatively and qualitatively. To enable unbiased signal-to-noise ratio (SNR) comparisons, geometrically identical eight-channel receiver arrays were used at both field strengths. First, in vitro images of a phantom bottle were acquired at 3.0T and 7.0T to obtain an estimate of the maximum SNR gain that can be expected. MR images of the dominant wrist of 10 healthy volunteers were acquired at both field strengths. All measurements were done using the same sequence parameters. Quantitative SNR maps were calculated on a pixel-by-pixel basis and analyzed in several regions-of-interest. Furthermore, the images were qualitatively evaluated by two independent radiologists. The quantitative analysis showed SNR increases of up to 100% at 7.0T compared with 3.0T, with considerable variation between different anatomical structures. The qualitative analysis revealed no significant difference in the visualization of anatomical structures comparing 3.0T and 7.0T MR images (P>0.05). The presented results establish the SNR benefits of the transition from 3.0T to 7.0T for wrist imaging without bias by different array designs and based on exact, algebraic SNR quantification. The observed SNR increase nearly reaches expected values but varies greatly between different tissues. It does not necessarily improve the visibility of anatomic structures but adds valuable latitude for sequence optimization. Copyright © 2011 Wiley-Liss, Inc.
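The pixel-by-pixel SNR comparison can be sketched in simplified form, assuming repeated acquisitions and a known noise level (a stand-in for the exact algebraic SNR quantification used in the study; all numbers are hypothetical):

```python
import numpy as np

def snr_map(acquisitions, noise_std):
    """Pixelwise SNR: mean signal over repeated acquisitions divided by
    the noise standard deviation."""
    return acquisitions.mean(axis=0) / noise_std

# hypothetical repeated acquisitions of a 2x2 region at two field strengths
imgs_3t = np.array([[[100.0, 50.0], [80.0, 60.0]]] * 4)
imgs_7t = np.array([[[190.0, 105.0], [150.0, 125.0]]] * 4)
gain = snr_map(imgs_7t, noise_std=5.0) / snr_map(imgs_3t, noise_std=5.0)
# per-pixel gain varies by "tissue", here between ~1.9x and ~2.1x
```

Averaging such a gain map within regions of interest is what yields the per-structure SNR increases (up to ~100%) reported above.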

  2. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis.

    PubMed

    Te, Shu Harn; Chen, Enid Yingru; Gin, Karina Yew-Hoong

    2015-08-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques, qPCR and droplet digital PCR (ddPCR). After optimization, both qPCR and ddPCR assays have high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time and that it was more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis for bloom samples. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
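Quantification with qPCR standards follows a log-linear calibration: quantification cycle (Cq) is fit against log10 copy number of the standards, and unknowns are read off the line. A sketch with an idealized 10-fold dilution series (slope ≈ −3.32 at 100% efficiency; all values hypothetical):

```python
def fit_standard_curve(log10_copies, cq_values):
    """Least-squares line Cq = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

# ideal 10-fold dilution series of a plasmid standard
logs = [3, 4, 5, 6, 7]
cqs = [33.4, 30.08, 26.76, 23.44, 20.12]
slope, intercept = fit_standard_curve(logs, cqs)
unknown = copies_from_cq(28.42, slope, intercept)  # ~10^4.5 copies
```

ddPCR, by contrast, counts positive droplet partitions and applies Poisson statistics, which is why it needs no standard curve.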

  3. Identification of cellular MMP substrates using quantitative proteomics: isotope-coded affinity tags (ICAT) and isobaric tags for relative and absolute quantification (iTRAQ).

    PubMed

    Butler, Georgina S; Dean, Richard A; Morrison, Charlotte J; Overall, Christopher M

    2010-01-01

    Identification of protease substrates is essential to understand the functional consequences of normal proteolytic processing and dysregulated proteolysis in disease. Quantitative proteomics and mass spectrometry can be used to identify protease substrates in the cellular context. Here we describe the use of two protein labeling techniques, Isotope-Coded Affinity Tags (ICAT) and Isobaric Tags for Relative and Absolute Quantification (iTRAQ), which we have used successfully to identify novel matrix metalloproteinase (MMP) substrates in cell culture systems (1-4). ICAT and iTRAQ can label proteins and protease cleavage products of secreted proteins, protein domains shed from the cell membrane or pericellular matrix of protease-transfected cells that have accumulated in conditioned medium, or cell surface proteins in membrane preparations; isotopically distinct labels are used for control cells. Tryptic digestion and tandem mass spectrometry of the generated fragments enable sequencing of differentially labeled but otherwise identical pooled peptides. The isotopic tag, which is unique for each label, identifies the peptides originating from each sample, for instance, protease-transfected or control cells, and comparison of the peak areas enables relative quantification of the peptide in each sample. Thus proteins present in altered amounts between protease-expressing and null cells are implicated as protease substrates and can be further validated as such.
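The peak-area comparison at the end of this workflow reduces to a labeled/control ratio per peptide, usually reported on a log2 scale. A minimal sketch with hypothetical integrated peak areas:

```python
import math

def log2_ratio(area_protease, area_control):
    """Relative quantification from the peak areas of an isotopically
    labeled peptide pair (protease-transfected vs. control sample).
    Log2 ratios far from 0 flag candidate substrates or cleavage products."""
    return math.log2(area_protease / area_control)

# hypothetical peak areas for three peptides
ratios = [log2_ratio(a, b) for a, b in [(4.0e6, 1.0e6),    # enriched: +2
                                        (1.0e6, 1.05e6),   # unchanged: ~0
                                        (0.5e6, 2.0e6)]]   # depleted: -2
```

Peptides with strongly positive or negative ratios are the ones carried forward to biochemical validation as MMP substrates.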

  4. Quantitative allochem compositional analysis of Lochkovian-Pragian boundary sections in the Prague Basin (Czech Republic)

    NASA Astrophysics Data System (ADS)

    Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich

    2017-06-01

    Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate ramp environment of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) of point-counted thin section data were used to reconstruct facies stacking patterns and sea-level history. Both the raw (closed) allochem percentages and their centred log-ratio (clr) coordinates were used. Both these approaches allow for distinguishing of lowstand, transgressive and highstand system tracts within the Praha Formation, which show gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of deep-water environment below the storm wave base. Quantitative compositional data also indicate progradative-retrogradative trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition in comparison to the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
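The centred log-ratio transform used here has a simple definition: each part is log-divided by the geometric mean of all parts, which frees compositional data from the constant-sum constraint of percentages. A minimal sketch with hypothetical allochem percentages (parts must be strictly positive):

```python
import math

def clr(composition):
    """Centred log-ratio transform: log of each part over the geometric
    mean of all parts. The resulting coordinates sum to zero."""
    logs = [math.log(x) for x in composition]
    g = sum(logs) / len(logs)  # log of the geometric mean
    return [l - g for l in logs]

# hypothetical point-count percentages: crinoids, dacryoconarids, other
coords = clr([70.0, 20.0, 10.0])
```

Because clr coordinates are unconstrained, standard multivariate methods (PCA, cluster analysis) can be applied to them without the spurious correlations induced by the closure of percentage data.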

  5. FT-IR imaging for quantitative determination of liver fat content in non-alcoholic fatty liver.

    PubMed

    Kochan, K; Maslak, E; Chlopicki, S; Baranska, M

    2015-08-07

    In this work we apply FT-IR imaging of large areas of liver tissue cross-section samples (∼5 cm × 5 cm) for quantitative assessment of steatosis in a murine model of Non-Alcoholic Fatty Liver Disease (NAFLD). We quantified the area of liver tissue occupied by lipid droplets (LDs) by FT-IR imaging, with Oil Red O (ORO) staining for comparison. Two alternative FT-IR based approaches are presented. The first, straightforward method was based on average spectra from tissues and provided values of the fat content using a PLS regression model calibrated against the reference method. The second – the chemometric-based method – enabled us to determine the fat content independently of the reference method, by means of k-means cluster (KMC) analysis. In summary, FT-IR images of large liver sections may prove useful for quantifying liver steatosis without the need for tissue staining.

  6. Assessment of Reproducibility of Laser Electrospray Mass Spectrometry using Electrospray Deposition of Analyte

    NASA Astrophysics Data System (ADS)

    Sistani, Habiballah; Karki, Santosh; Archer, Jieutonne J.; Shi, Fengjian; Levis, Robert J.

    2017-05-01

    A nonresonant, femtosecond (fs) laser is employed to desorb samples of Victoria blue deposited on stainless steel or indium tin oxide (ITO) slides using either electrospray deposition (ESD) or dried droplet deposition. The use of ESD resulted in uniform films of Victoria blue, whereas the dried droplet method resulted in the formation of a ring pattern of the dye. Laser electrospray mass spectrometry (LEMS) measurements of the ESD-prepared films on either substrate were similar and revealed lower average relative standard deviations for measurements within-film (20.9%) and between-films (8.7%) in comparison to dried droplet (75.5% and 40.2%, respectively). The mass spectral response for ESD samples on both substrates was linear (R² > 0.99), enabling quantitative measurements over the selected range of 7.0 × 10⁻¹¹ to 2.8 × 10⁻⁹ mol, as opposed to the dried droplet samples, where quantitation was not possible (R² = 0.56). The limit of detection was measured to be 210 fmol.
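    Linearity of a calibration curve of this kind is typically judged by the coefficient of determination (R²) of a least-squares line through the amount/signal pairs. A small illustrative sketch; the amounts and signals below are invented, not the paper's data:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a straight-line fit y ~ a*x + b."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    a, b = np.polyfit(x, y, 1)              # least-squares slope and intercept
    ss_res = np.sum((y - (a * x + b)) ** 2)  # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)     # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical calibration: amount deposited (mol) vs. integrated ion signal
amount = np.array([7.0e-11, 1.4e-10, 7.0e-10, 1.4e-9, 2.8e-9])
signal = np.array([1.1e3, 2.3e3, 1.15e4, 2.25e4, 4.6e4])
print(f"R^2 = {r_squared(amount, signal):.4f}")
```

An R² near 1 over the working range is what licenses quantitation; a value like 0.56, as for the dried droplet samples, indicates the response is not usable for calibration.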

  7. Investigating fold structures of 2D materials by quantitative transmission electron microscopy.

    PubMed

    Wang, Zhiwei; Zhang, Zengming; Liu, Wei; Wang, Zhong Lin

    2017-04-01

    We report an approach for deriving 3D structural information about 2D membrane folds, based on recently established quantitative transmission electron microscopy (TEM) in combination with density functional theory (DFT) calculations. Systematic multislice simulations reveal that membrane folding leads to sufficiently strong electron scattering to enable a precise determination of the bending radius. The image contrast also depends on the folding angle of the 2D material, owing to the variation in projection potential, which, however, has a much smaller effect than the bending radius. DFT calculations show that folded edges are typically characteristic of (fractional) nanotubes, with the same curvature retained after energy optimization. Owing to the exclusion of the Stobbs-factor issue, numerical simulations were compared directly with the experimental measurements on an absolute contrast scale, resulting in a successful determination of the bending radius of folded monolayer MoS2 films. The method should be applicable to characterizing all 2D membranes with 3D folding features.

  8. MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.

    PubMed

    Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J

    2015-10-15

    Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance, and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows keyword annotation queries and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. Supplementary data are available at Bioinformatics online.
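    The abstract does not spell out the normalization formula. A common read-mapping normalization in this spirit is RPKM (Reads Per Kilobase of transcript per Million mapped reads), sketched below as a generic illustration rather than MetaPathways' exact measure:

```python
def rpkm(reads_mapped, gene_length_bp, total_mapped_reads):
    """Reads Per Kilobase per Million mapped reads: corrects raw read
    counts for both gene length and sequencing depth, so annotations
    from different samples become comparable."""
    return reads_mapped * 1e9 / (gene_length_bp * total_mapped_reads)

# Example: 500 reads on a 2 kb ORF in a sample with 10 million mapped reads
print(rpkm(500, 2000, 10_000_000))  # -> 25.0
```

Dividing out gene length and library size is what makes "quantitative comparison of assembled annotations" across samples meaningful at all; without it, longer genes and deeper-sequenced samples dominate spuriously.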

  9. Monitoring Peptidase Activities in Complex Proteomes by MALDI-TOF Mass Spectrometry

    PubMed Central

    Villanueva, Josep; Nazarian, Arpi; Lawlor, Kevin; Tempst, Paul

    2009-01-01

    Measuring enzymatic activities in biological fluids is a form of activity-based proteomics and may be utilized as a means of developing disease biomarkers. Activity-based assays allow amplification of output signals, thus potentially visualizing low-abundance enzymes against a virtually transparent whole-proteome background. The protocol presented here describes a semi-quantitative in vitro assay of proteolytic activities in complex proteomes that monitors the breakdown of designer peptide substrates using robotic extraction and a MALDI-TOF mass spectrometric read-out. Relative quantitation of the peptide metabolites is done by comparison with spiked internal standards, followed by statistical analysis of the resulting mini-peptidome. Partial automation provides the reproducibility and throughput essential for comparing large sample sets. The approach may be employed for diagnostic or predictive purposes and enables profiling of 96 samples in 30 hours. It could be tailored to many diagnostic and pharmaco-dynamic purposes, as a read-out of catalytic and metabolic activities in body fluids or tissues. PMID:19617888
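    Relative quantitation against a spiked internal standard reduces, at its simplest, to a peak-area ratio scaled by the known spiked amount. A toy sketch under the simplifying assumption of equal response factors for analyte and standard; all values are invented:

```python
def relative_amount(analyte_peak_area, standard_peak_area, spiked_amount):
    """Estimate analyte amount from the ratio of its peak area to that of
    a spiked internal standard of known amount (equal response factors
    assumed; real assays would use a matched, e.g. isotope-labeled, standard)."""
    return spiked_amount * analyte_peak_area / standard_peak_area

# Peptide metabolite peak of 2.4e6 counts vs. 1.2e6 for a 50 fmol spike
print(relative_amount(2.4e6, 1.2e6, 50.0))  # -> 100.0 (fmol)
```

The internal standard also absorbs run-to-run variation in extraction and ionization, which is what makes the read-out comparable across the 96 samples of a batch.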

  10. Enabling Analysis of Big, Thick, Long, and Wide Data: Data Management for the Analysis of a Large Longitudinal and Cross-National Narrative Data Set.

    PubMed

    Winskell, Kate; Singleton, Robyn; Sabben, Gaelle

    2018-03-01

    Distinctive longitudinal narrative data, collected during a critical 18-year period in the history of the HIV epidemic, offer a unique opportunity to examine how young Africans are making sense of evolving developments in HIV prevention and treatment. More than 200,000 young people from across sub-Saharan Africa took part in HIV-themed scriptwriting contests held at eight discrete time points between 1997 and 2014, creating more than 75,000 narratives. This article describes the data reduction and management strategies developed for our cross-national and longitudinal study of these qualitative data. The study aims to inform HIV communication practice by identifying cultural meanings and contextual factors that inform sexual behaviors and social practices, and also to help increase understanding of processes of sociocultural change. We describe our sampling strategies and our triangulating methodologies, combining in-depth narrative analysis, thematic qualitative analysis, and quantitative analysis, which are designed to enable systematic comparison without sacrificing ethnographic richness.

  11. A cell-based computational model of early embryogenesis coupling mechanical behaviour and gene regulation

    NASA Astrophysics Data System (ADS)

    Delile, Julien; Herrmann, Matthieu; Peyriéras, Nadine; Doursat, René

    2017-01-01

    The study of multicellular development is grounded in two complementary domains: cell biomechanics, which examines how physical forces shape the embryo, and genetic regulation and molecular signalling, which concern how cells determine their states and behaviours. Integrating both sides into a unified framework is crucial to fully understand the self-organized dynamics of morphogenesis. Here we introduce MecaGen, an integrative modelling platform enabling the hypothesis-driven simulation of these dual processes via the coupling between mechanical and chemical variables. Our approach relies upon a minimal 'cell behaviour ontology' comprising mesenchymal and epithelial cells and their associated behaviours. MecaGen enables the specification and control of complex collective movements in 3D space through a biologically relevant gene regulatory network and parameter space exploration. Three case studies investigating pattern formation, epithelial differentiation and tissue tectonics in zebrafish early embryogenesis, the latter with quantitative comparison to live imaging data, demonstrate the validity and usefulness of our framework.

  12. Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling

    NASA Astrophysics Data System (ADS)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2014-02-01

    Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during bruise healing. In the present study, we introduce objective fitting of PPTR data obtained over the course of the bruise healing process. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying healing processes is possible. Objective fitting enables direct comparison between simulated and experimental PPTR signals; in this manner, we avoid reconstruction of laser-induced depth profiles and the inherent loss of information in that step. This approach enables us to determine the value of hemoglobin mass diffusivity, which is disputed in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.

  13. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results, and the quantitative assessment approach provides useful risk mitigation information.

  14. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  15. Wavelength tuning of multimode interference bandpass filters by mechanical bending: experiment and theory in comparison

    NASA Astrophysics Data System (ADS)

    Walbaum, T.; Fallnich, C.

    2012-07-01

    We present the tuning of multimode interference bandpass filters made of standard fibers by mechanical bending. Our setup allows continuous adjustment of the bending radius from infinity down to about 5 cm. The impact of bending on the transmission spectrum and on polarization is investigated experimentally, and a filter with a continuous tuning range of 13.6 nm and 86% peak transmission was realized. Using numerical simulations employing a semi-analytical mode expansion approach, we obtain a quantitative understanding of the underlying physics. Further breakdown of the governing equations enables us to identify the fiber parameters that are relevant for the design of customized filters.

  16. Law of corresponding states for open collaborations

    NASA Astrophysics Data System (ADS)

    Gherardi, Marco; Bassetti, Federico; Cosentino Lagomarsino, Marco

    2016-04-01

    We study the relation between the number of contributors and product size in Wikipedia and GitHub. In contrast to traditional production, this relation is strongly probabilistic, but it is characterized by two quantitative nonlinear laws: a power-law bound to product size for increasing number of contributors, and the universal collapse of rescaled distributions. A variant of the random-energy model shows that both laws are due to the heterogeneity of contributors, and displays an intriguing finite-size scaling property with no equivalent in standard systems. The analysis uncovers the right intensive densities, enabling the comparison of projects with different numbers of contributors on equal grounds. We use this property to expose the detrimental effects of conflicting interactions in Wikipedia.

  17. BATSE Gamma-Ray Burst Line Search. IV. Line Candidates from the Visual Search

    NASA Astrophysics Data System (ADS)

    Band, D. L.; Ryder, S.; Ford, L. A.; Matteson, J. L.; Palmer, D. M.; Teegarden, B. J.; Briggs, M. S.; Paciesas, W. S.; Pendleton, G. N.; Preece, R. D.

    1996-02-01

    We evaluate the significance of the line candidates identified by a visual search of burst spectra from BATSE's Spectroscopy Detectors. None of the candidates satisfy our detection criteria: an F-test probability less than 10⁻⁴ for a feature in one detector and consistency among the detectors that viewed the burst. Most of the candidates are not very significant and are likely to be fluctuations. Because of the expectation of finding absorption lines, the search was biased toward absorption features. We do not have a quantitative measure of the completeness of the search, which would enable a comparison with previous missions. Therefore, a more objective computerized search has begun.

  18. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  19. A multisite assessment of the quantitative capabilities of the Xpert MTB/RIF assay.

    PubMed

    Blakemore, Robert; Nabeta, Pamela; Davidow, Amy L; Vadwai, Viral; Tahirli, Rasim; Munsamy, Vanisha; Nicol, Mark; Jones, Martin; Persing, David H; Hillemann, Doris; Ruesch-Gerdes, Sabine; Leisegang, Felicity; Zamudio, Carlos; Rodrigues, Camilla; Boehme, Catharina C; Perkins, Mark D; Alland, David

    2011-11-01

    The Xpert MTB/RIF is an automated molecular test for Mycobacterium tuberculosis that estimates bacterial burden by measuring the threshold cycle (Ct) of its M. tuberculosis-specific real-time polymerase chain reaction. Bacterial burden is an important biomarker for disease severity, infection control risk, and response to therapy. Our objective was to evaluate bacterial load quantitation by Xpert MTB/RIF compared with conventional quantitative methods. Xpert MTB/RIF results were compared with smear microscopy, semiquantitative solid culture, and time-to-detection in liquid culture for 741 patients and 2,008 samples tested in a multisite clinical trial. An internal control real-time polymerase chain reaction was evaluated for its ability to identify inaccurate quantitative Xpert MTB/RIF results. Assays with an internal control Ct greater than 34 were likely to be inaccurately quantitated; these represented 15% of M. tuberculosis-positive tests. Excluding these, decreasing M. tuberculosis Ct was associated with increasing smear microscopy grade for smears of concentrated sputum pellets (r(s) = -0.77) and directly from sputum (r(s) = -0.71). A Ct cutoff of approximately 27.7 best predicted smear-positive status. The association between M. tuberculosis Ct and time-to-detection in liquid culture (r(s) = 0.68) and semiquantitative colony counts (r(s) = -0.56) was weaker than that with smear. Tests of paired same-patient sputum showed that high-viscosity sputum samples contained 32-fold more M. tuberculosis than nonviscous samples. Comparisons between the grade of the acid-fast bacilli smear and Xpert MTB/RIF quantitative data across study sites enabled us to identify a site outlier in microscopy. Xpert MTB/RIF quantitation offers a new, standardized approach to measuring bacterial burden in the sputum of patients with tuberculosis.
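    The r(s) values reported above are Spearman rank correlations. A minimal NumPy implementation, assuming no tied values and using invented toy data (lower Ct means more target DNA, so a monotone decrease against smear grade yields r(s) near -1):

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Illustrative only -- assumes no tied values."""
    rx = np.argsort(np.argsort(x)).astype(float)  # ranks of x (0-based)
    ry = np.argsort(np.argsort(y)).astype(float)  # ranks of y
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.sum(rx * ry) / np.sqrt(np.sum(rx**2) * np.sum(ry**2)))

# Hypothetical paired observations: Xpert Ct vs. smear microscopy grade
ct          = [18.5, 21.0, 24.3, 27.7, 31.2]
smear_grade = [4,    3,    2,    1,    0]
print(spearman(ct, smear_grade))  # -> -1.0 (perfectly monotone toy data)
```

Real data with ties would use mid-ranks (as library routines such as scipy.stats.spearmanr do), but the rank-then-correlate structure is the same.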

  20. A Database of Reaction Monitoring Mass Spectrometry Assays for Elucidating Therapeutic Response in Cancer

    PubMed Central

    Remily-Wood, Elizabeth R.; Liu, Richard Z.; Xiang, Yun; Chen, Yi; Thomas, C. Eric; Rajyaguru, Neal; Kaufman, Laura M.; Ochoa, Joana E.; Hazlehurst, Lori; Pinilla-Ibarz, Javier; Lancet, Jeffrey; Zhang, Guolin; Haura, Eric; Shibata, David; Yeatman, Timothy; Smalley, Keiran S.M.; Dalton, William S.; Huang, Emina; Scott, Ed; Bloom, Gregory C.; Eschrich, Steven A.; Koomen, John M.

    2012-01-01

    Purpose The Quantitative Assay Database (QuAD), http://proteome.moffitt.org/QUAD/, facilitates widespread implementation of quantitative mass spectrometry in cancer biology and clinical research through sharing of methods and reagents for monitoring protein expression and modification. Experimental Design Liquid chromatography coupled to multiple reaction monitoring mass spectrometry (LC-MRM) assays are developed using SDS-PAGE fractionated lysates from cancer cell lines. Pathway maps created using GeneGO Metacore provide the biological relationships between proteins and illustrate concepts for multiplexed analysis; each protein can be selected to examine assay development at the protein and peptide level. Results The coupling of SDS-PAGE and LC-MRM screening has been used to detect 876 peptides from 218 cancer-related proteins in model systems including colon, lung, melanoma, leukemias, and myeloma, which has led to the development of 95 quantitative assays including stable-isotope labeled peptide standards. Methods are published online and peptide standards are made available to the research community. Protein expression measurements for heat shock proteins, including a comparison with ELISA and monitoring response to the HSP90 inhibitor, 17-DMAG, are used to illustrate the components of the QuAD and its potential utility. Conclusions and Clinical Relevance This resource enables quantitative assessment of protein components of signaling pathways and biological processes and holds promise for systematic investigation of treatment responses in cancer. PMID:21656910

  1. Systematic review of statistically-derived models of immunological response in HIV-infected adults on antiretroviral therapy in Sub-Saharan Africa.

    PubMed

    Sempa, Joseph B; Ujeneza, Eva L; Nieuwoudt, Martin

    2017-01-01

    In Sub-Saharan African (SSA) resource-limited settings, Cluster of Differentiation 4 (CD4) counts continue to be used for clinical decision making in antiretroviral therapy (ART). Here, HIV-infected people often remain with CD4 counts <350 cells/μL even after 5 years of viral load suppression, so ongoing immunological monitoring is necessary. Because statistical modeling methods vary, comparing immune responses to ART across different cohorts is difficult. We systematically review such models and detail their similarities, differences and problems. The 'Preferred Reporting Items for Systematic Review and Meta-Analyses' guidelines were used. Only studies of immune response after ART initiation in adults from SSA were included. Data were extracted from each study and tabulated. Outcomes were categorized into 3 groups: 'slope', 'survival', and 'asymptote' models. Wordclouds were drawn wherein the frequency of variables occurring in the reviewed models is indicated by their size and color. Sixty-nine covariates were identified in the final models of 35 studies. Effect sizes of covariates were not directly quantitatively comparable, given the combination of differing variables and scale transformation methods across models. Wordclouds enabled the identification of qualitative and semi-quantitative covariate sets for each outcome category. Comparison across categories identified sex, baseline age, baseline log viral load, baseline CD4 count, ART initiation regimen and ART duration as a minimal consensus set. Most models differed with respect to the covariates included, variable transformations and scales, model assumptions, modelling strategies and reporting methods, even for the same outcomes. To enable comparison across cohorts, statistical models would benefit from more uniform modelling techniques. Historic efforts have produced results that are anecdotal to individual cohorts only. This study was able to define 'prior' knowledge in the Bayesian sense; such information has value for prospective modelling efforts.

  2. Creating normograms of dural sinuses in healthy persons using computer-assisted detection for analysis and comparison of cross-section dural sinuses in the brain.

    PubMed

    Anconina, Reut; Zur, Dinah; Kesler, Anat; Lublinsky, Svetlana; Toledano, Ronen; Novack, Victor; Benkobich, Elya; Novoa, Rosa; Novic, Evelyne Farkash; Shelef, Ilan

    2017-06-01

    Dural sinuses vary in size and shape in many pathological conditions with abnormal intracranial pressure. Size and shape normograms of dural brain sinuses are not available; the creation of such normograms may enable computer-assisted comparison with pathologic exams and facilitate diagnosis. The purpose of this study was to quantitatively evaluate normal magnetic resonance venography (MRV) studies in order to create normograms of the dural sinuses using a computerized algorithm for vessel cross-sectional analysis. This was a retrospective analysis of MRV studies of 30 healthy persons. Data were analyzed using a specially developed Matlab algorithm for vessel cross-sectional analysis. The cross-sectional area and shape measurements were evaluated to create normograms. Mean cross-sectional size was 53.27±13.31 for the right transverse sinus (TS), 46.87±12.57 for the left TS (p=0.089) and 36.65±12.38 for the superior sagittal sinus. Normograms were created. The distribution of cross-sectional areas along the vessels showed distinct patterns and a parallel course for the 25th, 50th (median) and 75th percentiles. In conclusion, using a novel computerized method for vessel cross-sectional analysis, we were able to quantitatively characterize the dural sinuses of healthy persons and create normograms.
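    A percentile-based normogram of this kind can be built by taking percentile curves across subjects at each position along the vessel. A sketch with simulated cross-sectional areas; the distribution parameters and array shapes are invented for illustration, not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical cross-sectional areas: 30 subjects x 100 sampling positions
# along a sinus; real values would come from the MRV segmentation.
areas = rng.normal(loc=50.0, scale=13.0, size=(30, 100))

# Normogram: percentile curves of area at each position along the vessel
p25, p50, p75 = np.percentile(areas, [25, 50, 75], axis=0)
assert p25.shape == (100,)  # one value per position along the vessel
print(p50[:5])              # median curve, first five positions
```

A patient exam can then be overlaid on these curves position-by-position, and excursions outside the percentile band flagged for review.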

  3. Quantitative assessment of integrated phrenic nerve activity.

    PubMed

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluation of repeatability/reliability across animals has been made for phrenic recordings performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease.

  4. A quantitative and qualitative comparison of Illumina MiSeq and 454 amplicon sequencing for genotyping the highly polymorphic major histocompatibility complex (MHC) in a non-model species.

    PubMed

    Razali, Haslina; O'Connor, Emily; Drews, Anna; Burke, Terry; Westerdahl, Helena

    2017-07-28

    High-throughput sequencing enables high-resolution genotyping of extremely duplicated genes. 454 amplicon sequencing (454) has become the standard technique for genotyping the major histocompatibility complex (MHC) genes in non-model organisms. However, Illumina MiSeq amplicon sequencing (MiSeq), which offers a much higher read depth, is now superseding 454. The aim of this study was to quantitatively and qualitatively evaluate the performance of MiSeq in relation to 454 for genotyping MHC class I alleles, using a house sparrow (Passer domesticus) dataset with pedigree information. House sparrows provide a good study system for this comparison, as their MHC class I genes have been studied previously and, consequently, we had prior expectations concerning the number of alleles per individual. We found that 454 and MiSeq performed equally well in genotyping amplicons with low diversity, i.e. amplicons from individuals that had fewer than 6 alleles. Although there was a higher rate of failure in the 454 dataset in resolving amplicons with higher diversity (6-9 alleles), the same genotypes were identified by both 454 and MiSeq in 98% of cases. We conclude that low-diversity amplicons are equally well genotyped using either 454 or MiSeq, but the higher coverage afforded by MiSeq can lead to this approach outperforming 454 for amplicons with higher diversity.

  5. On Quantitative Comparative Research in Communication and Language Evolution

    PubMed Central

    Oller, D. Kimbrough; Griebel, Ulrike

    2014-01-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057

  7. Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS3 quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyl-deoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation in MRM transition and MS3 (linear ion trap) modes. Three experiments were performed for each quantitation mode and matrix, in batches over 6 days, for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg⁻¹ (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three spiking concentrations of 40, 80, and 120 μg kg⁻¹. A quantitation limit of 2-6 μg kg⁻¹ was achieved with the MRM transition quantitation strategy; in MS3 mode, a quantitation limit of 4-10 μg kg⁻¹ was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS3 quantitation, respectively. The successful utilization of MS3 enabled accurate analyte fragmentation-pattern matching and quantitation, supporting the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable with MRM transitions alone.

  8. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100 per cent = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) for the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
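
The quoted equation reduces to a one-line ratio; a minimal sketch in Python (the pixel areas below are invented for illustration, not taken from the study):

```python
def perforation_percentage(perf_area_px, total_area_px):
    """Percentage perforation = P / T * 100, where P is the perforation
    area and T the total tympanic membrane area (both in pixels^2)."""
    if total_area_px <= 0:
        raise ValueError("total membrane area must be positive")
    return 100.0 * perf_area_px / total_area_px

# Hypothetical areas measured in Image J (pixels^2).
print(perforation_percentage(12_500, 50_000))  # -> 25.0
```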

  9. Statistical Characterization and Classification of Edge-Localized Plasma Instabilities

    NASA Astrophysics Data System (ADS)

    Webster, A. J.; Dendy, R. O.

    2013-04-01

    The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
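
The fitting step described above can be sketched with SciPy's Weibull implementation; the waiting-time data below are synthetic draws from a known Weibull law, not JET measurements:

```python
import numpy as np
from scipy.stats import weibull_min, kstest

# Synthetic inter-ELM waiting times (arbitrary units) from a known Weibull law.
rng = np.random.default_rng(0)
waits = weibull_min.rvs(c=1.8, scale=120.0, size=2000, random_state=rng)

# Maximum-likelihood fit with the location parameter fixed at zero.
shape, _loc, scale = weibull_min.fit(waits, floc=0)

# Goodness of fit: a large p-value means the Weibull law is not rejected.
stat, pvalue = kstest(waits, "weibull_min", args=(shape, 0, scale))
print(f"shape={shape:.2f} scale={scale:.0f} p={pvalue:.2f}")
```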

  10. Imaging cellular structures in super-resolution with SIM, STED and Localisation Microscopy: A practical comparison.

    PubMed

    Wegel, Eva; Göhler, Antonia; Lagerholm, B Christoffer; Wainman, Alan; Uphoff, Stephan; Kaufmann, Rainer; Dobbie, Ian M

    2016-06-06

    Many biological questions require fluorescence microscopy with a resolution beyond the diffraction limit of light. Super-resolution methods such as Structured Illumination Microscopy (SIM), STimulated Emission Depletion (STED) microscopy and Single Molecule Localisation Microscopy (SMLM) enable an increase in image resolution beyond the classical diffraction limit. Here, we compare the individual strengths and weaknesses of each technique by imaging a variety of different subcellular structures in fixed cells. We chose examples ranging from well separated vesicles to densely packed three-dimensional filaments. We used quantitative and correlative analyses to assess the performance of SIM, STED and SMLM with the aim of establishing a rough guideline regarding the suitability for typical applications and to highlight pitfalls associated with the different techniques.

  11. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C., E-mail: chholland@ucsd.edu

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. The utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)], as part of a multi-year transport model validation activity.
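
One common ingredient of such metrics is a discrepancy normalized by the combined simulation and measurement uncertainty. A generic sketch under that assumption (not the paper's specific metric; the flux values below are invented):

```python
import math

def normalized_discrepancy(sim, expt, sigma_sim, sigma_expt):
    """Distance between a predicted and a measured quantity in units of
    the combined uncertainty; values near or below 1 indicate agreement
    within the stated errors."""
    return abs(sim - expt) / math.sqrt(sigma_sim**2 + sigma_expt**2)

# Hypothetical heat-flux comparison: prediction vs. measurement.
print(round(normalized_discrepancy(2.4, 2.0, 0.3, 0.4), 3))  # -> 0.8
```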

  12. Three-dimensional structural modelling and calculation of electrostatic potentials of HLA Bw4 and Bw6 epitopes to explain the molecular basis for alloantibody binding: toward predicting HLA antigenicity and immunogenicity.

    PubMed

    Mallon, Dermot H; Bradley, J Andrew; Winn, Peter J; Taylor, Craig J; Kosmoliaptsis, Vasilis

    2015-02-01

    We have previously shown that qualitative assessment of surface electrostatic potential of HLA class I molecules helps explain serological patterns of alloantibody binding. We have now used a novel computational approach to quantitate differences in surface electrostatic potential of HLA B-cell epitopes and applied this to explain HLA Bw4 and Bw6 antigenicity. Protein structure models of HLA class I alleles expressing either the Bw4 or Bw6 epitope (defined by sequence motifs at positions 77 to 83) were generated using comparative structure prediction. The electrostatic potential in 3-dimensional space encompassing the Bw4/Bw6 epitope was computed by solving the Poisson-Boltzmann equation and quantitatively compared in a pairwise, all-versus-all fashion to produce distance matrices that cluster epitopes with similar electrostatics properties. Quantitative comparison of surface electrostatic potential at the carboxyl terminal of the α1-helix of HLA class I alleles, corresponding to amino acid sequence motif 77 to 83, produced clustering of HLA molecules in 3 principal groups according to Bw4 or Bw6 epitope expression. Remarkably, quantitative differences in electrostatic potential reflected known patterns of serological reactivity better than Bw4/Bw6 amino acid sequence motifs. Quantitative assessment of epitope electrostatic potential allowed the impact of known amino acid substitutions (HLA-B*07:02 R79G, R82L, G83R) that are critical for antibody binding to be predicted. We describe a novel approach for quantitating differences in HLA B-cell epitope electrostatic potential. Proof of principle is provided that this approach enables better assessment of HLA epitope antigenicity than amino acid sequence data alone, and it may allow prediction of HLA immunogenicity.
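
The all-versus-all distance-matrix comparison and clustering pipeline described above can be sketched with SciPy; the "potential" vectors below are random stand-ins, not computed Poisson-Boltzmann fields:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical per-allele electrostatic-potential samples on a grid around
# residues 77-83 (rows = alleles, columns = grid points); values invented.
rng = np.random.default_rng(1)
bw4 = rng.normal(+1.0, 0.1, size=(4, 50))   # stand-in "Bw4-like" potentials
bw6 = rng.normal(-1.0, 0.1, size=(4, 50))   # stand-in "Bw6-like" potentials
pots = np.vstack([bw4, bw6])

# Pairwise, all-versus-all distance matrix, then hierarchical clustering.
dmat = squareform(pdist(pots))
clusters = fcluster(linkage(pdist(pots), method="average"),
                    t=2, criterion="maxclust")
print(dmat.shape, sorted(set(clusters)))  # -> (8, 8) [1, 2]
```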

  13. Miniaturized Battery-Free Wireless Systems for Wearable Pulse Oximetry.

    PubMed

    Kim, Jeonghyun; Gutruf, Philipp; Chiarelli, Antonio M; Heo, Seung Yun; Cho, Kyoungyeon; Xie, Zhaoqian; Banks, Anthony; Han, Seungyoung; Jang, Kyung-In; Lee, Jung Woo; Lee, Kyu-Tae; Feng, Xue; Huang, Yonggang; Fabiani, Monica; Gratton, Gabriele; Paik, Ungyu; Rogers, John A

    2017-01-05

    Development of unconventional technologies for wireless collection, storage and analysis of quantitative, clinically relevant information on physiological status is of growing interest. Soft, biocompatible systems are widely regarded as important because they facilitate mounting on external (e.g. skin) and internal (e.g. heart, brain) surfaces of the body. Ultra-miniaturized, lightweight and battery-free devices have the potential to establish complementary options in bio-integration, where chronic interfaces (i.e. months) are possible on hard surfaces such as the fingernails and the teeth, with negligible risk for irritation or discomfort. Here we report materials and device concepts for flexible platforms that incorporate advanced optoelectronic functionality for applications in wireless capture and transmission of photoplethysmograms, including quantitative information on blood oxygenation, heart rate and heart rate variability. Specifically, reflectance pulse oximetry in conjunction with near-field communication (NFC) capabilities enables operation in thin, miniaturized flexible devices. Studies of the material aspects associated with the body interface, together with investigations of the radio frequency characteristics, the optoelectronic data acquisition approaches and the analysis methods capture all of the relevant engineering considerations. Demonstrations of operation on various locations of the body and quantitative comparisons to clinical gold standards establish the versatility and the measurement accuracy of these systems, respectively.

  14. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

    Because RNA-protein interactions play a central role in a wide-array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time) including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240

  15. Quantitative proteomics in cardiovascular research: global and targeted strategies

    PubMed Central

    Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun

    2014-01-01

    Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501

  16. 75 FR 373 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...

  17. Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA

    USGS Publications Warehouse

    Andrews, Brian D.; Brothers, Laura L.; Barnhardt, Walter A.

    2010-01-01

    Seafloor pockmarks occur worldwide and may represent millions of m³ of continental shelf erosion, but few numerical analyses of their morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km² of bathymetry collected in the Belfast Bay, Maine, USA, pockmark field. Our model extracted 1767 pockmarks and found a linear pockmark depth-to-diameter ratio field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as similar concave features, such as impact craters, dolines, or salt pools.
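
A standard way to test for the non-random clustering reported here is the Clark-Evans nearest-neighbour ratio; a sketch with invented pockmark coordinates (not the Belfast Bay data), ignoring edge corrections:

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans_ratio(points_xy, area):
    """Nearest-neighbour ratio R = mean observed NN distance divided by the
    expected NN distance (0.5 / sqrt(density)) under complete spatial
    randomness. R < 1 suggests clustering, R > 1 dispersion."""
    tree = cKDTree(points_xy)
    d, _ = tree.query(points_xy, k=2)        # k=2: nearest neighbour besides self
    density = len(points_xy) / area
    return d[:, 1].mean() / (0.5 / np.sqrt(density))

# Invented clustered pattern: points scattered around a few "chain" centres.
rng = np.random.default_rng(2)
centres = rng.uniform(0, 5000, size=(5, 2))
pts = np.vstack([c + rng.normal(0, 40, size=(50, 2)) for c in centres])
print(clark_evans_ratio(pts, area=5000 * 5000) < 1.0)  # clustered -> True
```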

  18. Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks

    NASA Technical Reports Server (NTRS)

    Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.

    2000-01-01

    Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte-Carlo simulations, a method that makes fitting the model's performance quantitatively to human data more computationally time consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
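
The accuracy side of such comparisons can be illustrated with the classic signal-detection expression for localization among M candidate locations (a textbook SDT result stated here as an assumption, not the authors' exact model equations):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def localization_accuracy(d_prime, n_locations):
    """Probability that the target location yields the maximum response
    among n_locations independent unit-variance Gaussian responses:
    Pc = integral of phi(x - d') * Phi(x)**(n-1) dx."""
    integrand = lambda x: norm.pdf(x - d_prime) * norm.cdf(x) ** (n_locations - 1)
    pc, _err = quad(integrand, -np.inf, np.inf)
    return pc

# With no signal (d' = 0), accuracy falls to chance, 1 / n_locations.
print(round(localization_accuracy(0.0, 4), 3))  # -> 0.25
```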

  19. Bimetallic Effect of Single Nanocatalysts Visualized by Super-Resolution Catalysis Imaging

    DOE PAGES

    Chen, Guanqun; Zou, Ningmu; Chen, Bo; ...

    2017-11-01

    Compared with their monometallic counterparts, bimetallic nanoparticles often show enhanced catalytic activity associated with the bimetallic interface. Direct quantitation of catalytic activity at the bimetallic interface is important for understanding the enhancement mechanism, but challenging experimentally. Here using single-molecule super-resolution catalysis imaging in correlation with electron microscopy, we report the first quantitative visualization of enhanced bimetallic activity within single bimetallic nanoparticles. We focus on heteronuclear bimetallic PdAu nanoparticles that present a well-defined Pd–Au bimetallic interface in catalyzing a photodriven fluorogenic disproportionation reaction. Our approach also enables a direct comparison between the bimetallic and monometallic regions within the same nanoparticle. Theoretical calculations further provide insights into the electronic nature of N–O bond activation of the reactant (resazurin) adsorbed on bimetallic sites. Subparticle activity correlation between bimetallic enhancement and monometallic activity suggests that the favorable locations to construct bimetallic sites are those monometallic sites with higher activity, leading to a strategy for making effective bimetallic nanocatalysts. Furthermore, the results highlight the power of super-resolution catalysis imaging in gaining insights that could help improve nanocatalysts.

  20. Direct molecular dynamics simulation of Ge deposition on amorphous SiO 2 at experimentally relevant conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chuang, Claire Y.; Zepeda-Ruiz, Luis A.; Han, Sang M.

    2015-06-01

    Molecular dynamics simulations were used to study Ge island nucleation and growth on amorphous SiO 2 substrates. This process is relevant in selective epitaxial growth of Ge on Si, for which SiO 2 is often used as a template mask. The islanding process was studied over a wide range of temperatures and fluxes, using a recently proposed empirical potential model for the Si–SiO 2–Ge system. The simulations provide an excellent quantitative picture of the Ge islanding and compare well with detailed experimental measurements. These quantitative comparisons were enabled by an analytical rate model as a bridge between simulations and experiments despite the fact that deposition fluxes accessible in simulations and experiments are necessarily different by many orders of magnitude. In particular, the simulations led to accurate predictions of the critical island size and the scaling of island density as a function of temperature. Lastly, the overall approach used here should be useful not just for future studies in this particular system, but also for molecular simulations of deposition in other materials.

  2. Comparative mapping of quantitative trait loci sculpting the curd of Brassica oleracea.

    PubMed

    Lan, T H; Paterson, A H

    2000-08-01

    The enlarged inflorescence (curd) of cauliflower and broccoli provides not only a popular vegetable for human consumption, but also a unique opportunity for scientists who seek to understand the genetic basis of plant growth and development. By comparing quantitative trait loci (QTL) maps constructed from three different F(2) populations, we identified a total of 86 QTL that control eight curd-related traits in Brassica oleracea. The 86 QTL may reflect allelic variation in as few as 67 different genetic loci and 54 ancestral genes. Although the locations of QTL affecting a trait occasionally corresponded between different populations or between different homeologous Brassica chromosomes, our data supported other molecular and morphological data in suggesting that the Brassica genus is rapidly evolving. Comparative data enabled us to identify a number of candidate genes from Arabidopsis that warrant further investigation to determine if some of them might account for Brassica QTL. The Arabidopsis/Brassica system is an important example of both the challenges and opportunities associated with extrapolation of genomic information from facile models to large-genome taxa including major crops.

  3. 2D IR Spectroscopy using Four-Wave Mixing, Pulse Shaping, and IR Upconversion: A Quantitative Comparison

    PubMed Central

    Rock, William; Li, Yun-Liang; Pagano, Philip; Cheatum, Christopher M.

    2013-01-01

    Recent technological advances have led to major changes in the apparatuses used to collect 2D IR spectra. Pulse shaping offers several advantages including rapid data collection, inherent phase stability, and phase cycling capabilities. Visible array detection via upconversion allows the use of visible detectors that are cheaper, faster, more sensitive, and less noisy than IR detectors. However, despite these advantages, many researchers are reluctant to implement these technologies. Here we present a quantitative study of the S/N of 2D IR spectra collected with a traditional four-wave mixing (FWM) apparatus, with a pulse shaping apparatus, and with visible detection via upconversion to address the question of whether or not weak chromophores at low concentrations are still accessible with such an apparatus. We find that the enhanced averaging capability of the pulse shaping apparatus enables the detection of small signals that would be challenging to measure even with the traditional FWM apparatus, and we demonstrate this ability on a sample of cyanylated dihydrofolate reductase (DHFR). PMID:23687988

  4. Recent trends in high spin sensitivity magnetic resonance

    NASA Astrophysics Data System (ADS)

    Blank, Aharon; Twig, Ygal; Ishay, Yakir

    2017-07-01

    Magnetic resonance is a very powerful methodology that has been employed successfully in many applications for about 70 years now, resulting in a wealth of scientific, technological, and diagnostic data. Despite its many advantages, one major drawback of magnetic resonance is its relatively poor sensitivity and, as a consequence, its poor spatial resolution when examining heterogeneous samples. Contemporary science and technology often make use of very small amounts of material and examine heterogeneity on a very small length scale, both of which are well beyond the current capabilities of conventional magnetic resonance. It is therefore very important to significantly improve both the sensitivity and the spatial resolution of magnetic resonance techniques. The quest for higher sensitivity led in recent years to the development of many alternative detection techniques that seem to rival and challenge the conventional "old-fashioned" induction-detection approach. The aim of this manuscript is to briefly review recent advances in the field, and to provide a quantitative as well as qualitative comparison between various detection methods with an eye to future potential advances and developments. We first offer a common definition of sensitivity in magnetic resonance to enable proper quantitative comparisons between various detection methods. Following that, up-to-date information about the sensitivity capabilities of the leading recently-developed detection approaches in magnetic resonance is provided, accompanied by a critical comparison between them and induction detection. Our conclusion from this comparison is that induction detection is still indispensable, and as such, it is very important to look for ways to significantly improve it. To do so, we provide expressions for the sensitivity of induction detection, derived from both classical and quantum mechanics, that identify its main limiting factors.
Examples from current literature, as well as a description of new ideas, show how these limiting factors can be mitigated to significantly improve the sensitivity of induction detection. Finally, we outline some directions for the possible applications of high-sensitivity induction detection in the field of electron spin resonance.

  5. Evaluation of Heterogeneous Metabolic Profile in an Orthotopic Human Glioblastoma Xenograft Model Using Compressed Sensing Hyperpolarized 3D 13C Magnetic Resonance Spectroscopic Imaging

    PubMed Central

    Park, Ilwoo; Hu, Simon; Bok, Robert; Ozawa, Tomoko; Ito, Motokazu; Mukherjee, Joydeep; Phillips, Joanna J.; James, C. David; Pieper, Russell O.; Ronen, Sabrina M.; Vigneron, Daniel B.; Nelson, Sarah J.

    2013-01-01

    High resolution compressed sensing hyperpolarized 13C magnetic resonance spectroscopic imaging was applied in orthotopic human glioblastoma xenografts for quantitative assessment of spatial variations in 13C metabolic profiles and comparison with histopathology. A new compressed sensing sampling design with a factor of 3.72 acceleration was implemented to enable a factor of 4 increase in spatial resolution. Compressed sensing 3D 13C magnetic resonance spectroscopic imaging data were acquired from a phantom and 10 tumor-bearing rats following injection of hyperpolarized [1-13C]-pyruvate using a 3T scanner. The 13C metabolic profiles were compared with hematoxylin and eosin staining and carbonic anhydrase 9 staining. The high-resolution compressed sensing 13C magnetic resonance spectroscopic imaging data enabled the differentiation of distinct 13C metabolite patterns within abnormal tissues with high specificity in similar scan times compared to the fully sampled method. The results from pathology confirmed the different characteristics of 13C metabolic profiles between viable, non-necrotic, nonhypoxic tumor, and necrotic, hypoxic tissue. PMID:22851374

  6. Evaluation of heterogeneous metabolic profile in an orthotopic human glioblastoma xenograft model using compressed sensing hyperpolarized 3D 13C magnetic resonance spectroscopic imaging.

    PubMed

    Park, Ilwoo; Hu, Simon; Bok, Robert; Ozawa, Tomoko; Ito, Motokazu; Mukherjee, Joydeep; Phillips, Joanna J; James, C David; Pieper, Russell O; Ronen, Sabrina M; Vigneron, Daniel B; Nelson, Sarah J

    2013-07-01

    High resolution compressed sensing hyperpolarized (13)C magnetic resonance spectroscopic imaging was applied in orthotopic human glioblastoma xenografts for quantitative assessment of spatial variations in (13)C metabolic profiles and comparison with histopathology. A new compressed sensing sampling design with a factor of 3.72 acceleration was implemented to enable a factor of 4 increase in spatial resolution. Compressed sensing 3D (13)C magnetic resonance spectroscopic imaging data were acquired from a phantom and 10 tumor-bearing rats following injection of hyperpolarized [1-(13)C]-pyruvate using a 3T scanner. The (13)C metabolic profiles were compared with hematoxylin and eosin staining and carbonic anhydrase 9 staining. The high-resolution compressed sensing (13)C magnetic resonance spectroscopic imaging data enabled the differentiation of distinct (13)C metabolite patterns within abnormal tissues with high specificity in similar scan times compared to the fully sampled method. The results from pathology confirmed the different characteristics of (13)C metabolic profiles between viable, non-necrotic, nonhypoxic tumor, and necrotic, hypoxic tissue. Copyright © 2012 Wiley Periodicals, Inc.

  7. Computer modeling of pulsed CO2 lasers for lidar applications

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.; Smithers, Martin E.; Murty, Rom

    1991-01-01

    The experimental results will enable a comparison of the numerical code output with experimental data. This will ensure verification of the validity of the code. The measurements were made on a modified commercial CO2 laser. The results are as follows. (1) The pulse shape and energy dependence on gas pressure were measured. (2) The intrapulse frequency chirp due to plasma and laser-induced medium perturbation effects was determined. A simple numerical model showed quantitative agreement with these measurements. The pulse-to-pulse frequency stability was also determined. (3) The dependence of the laser transverse mode stability on cavity length was measured. A simple analysis of this dependence in terms of changes to the equivalent Fresnel number and the cavity magnification was performed. (4) An analysis was made of the discharge pulse shape, which enabled the low efficiency of the laser to be explained in terms of poor coupling of the electrical energy into the vibrational levels. And (5) the existing laser resonator code was changed to allow it to run on the Cray XMP under the new operating system.

  8. Integrated Computational System for Aerodynamic Steering and Visualization

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    In February 1994, the Fluid Dynamics and Information Sciences Divisions at NASA Ames Research Center, together with McDonnell Douglas Aerospace Company and Stanford University, initiated an effort to develop, demonstrate, validate and disseminate automated software for numerical aerodynamic simulation. The goal of the initiative was to develop a tri-discipline approach encompassing CFD, Intelligent Systems, and Automated Flow Feature Recognition to improve the utility of CFD in the design cycle. This approach would then be represented through an intelligent computational system which could accept an engineer's definition of a problem and construct an optimal and reliable CFD solution. Stanford University's role focused on developing technologies that advance visualization capabilities for analysis of CFD data, extract specific flow features useful for the design process, and compare CFD data with experimental data. During the years 1995-1997, Stanford University focused on developing techniques in the area of tensor visualization and flow feature extraction. Software libraries were created enabling feature extraction and exploration of tensor fields. As a proof of concept, a prototype system called the Integrated Computational System (ICS) was developed to demonstrate the CFD design cycle. The current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched.
This report will (1) briefly review the technologies developed during 1995-1997, (2) describe current technologies in the area of comparison techniques, (3) describe the theory of our new method researched during the grant year, (4) summarize a few of the results, and finally (5) discuss work within the last 6 months that extends directly from the grant.

  9. Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.

    PubMed

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-02-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
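The stiffness metric described above, the slope of the moment-displacement curve from 4-point bending, reduces to a least-squares fit; the sketch below uses hypothetical data, not values from the study:

```python
import numpy as np

def bending_stiffness(displacement, moment):
    """Stiffness as the slope of the moment-displacement curve,
    estimated by a least-squares linear fit."""
    slope, _intercept = np.polyfit(displacement, moment, 1)
    return slope

# Hypothetical 4-point-bend data (displacement in mm, moment in N*mm)
disp = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
moment = np.array([0.0, 5.1, 10.2, 14.8, 20.1])
k = bending_stiffness(disp, moment)  # ~49.9 N*mm per mm of deflection
```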

  10. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    PubMed

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with wide dynamic range and excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl(-1) DNA. The quantitative and long-term storage capability of this system makes it ideal for study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
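The calibration-curve step described above, linear regression of Ct against log dilution followed by back-calculation of viral load, can be sketched as follows; the Ct values are hypothetical:

```python
import numpy as np

# Hypothetical qPCR standard curve: Ct measured for 10-fold
# serial dilutions of a cloned construct (input in log10 copies)
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0, 1.0])
ct = np.array([18.1, 21.4, 24.8, 28.2, 31.5, 34.9])

# Linear calibration curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(log10_copies, ct, 1)

# Amplification efficiency (a slope of -3.32 corresponds to 100%)
efficiency = 10.0 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct_value):
    """Back-calculate an unknown sample's copy number from its Ct."""
    return 10.0 ** ((ct_value - intercept) / slope)

unknown = copies_from_ct(25.0)  # interpolated viral load
```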

  11. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing any variation caused by separate experiments. However, only a few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins having only one peptide hit. We introduce new quantitative approaches to dealing with SILAC protein-level summaries using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins otherwise doomed to removal. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
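As a minimal illustration of the classification-based idea, here is only the k-means variant on simulated log-ratios; the paper's GMM/EM, Bayesian, and PSO methods are considerably more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated SILAC log2 heavy/light ratios: unchanged proteins
# near 0, up-regulated proteins near +2
ratios = np.concatenate([rng.normal(0.0, 0.3, 80),
                         rng.normal(2.0, 0.3, 20)])

def kmeans_1d(x, centers, iters=50):
    """Minimal 1-D k-means: assign each ratio to the nearest
    center, then move each center to its cluster mean."""
    c = np.asarray(centers, dtype=float)
    labels = np.zeros(x.size, dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - c[None, :]), axis=1)
        for k in range(c.size):
            if np.any(labels == k):
                c[k] = x[labels == k].mean()
    return c, labels

centers, labels = kmeans_1d(ratios, [0.0, 1.0])
```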

  12. PhosphOrtholog: a web-based tool for cross-species mapping of orthologous protein post-translational modifications.

    PubMed

    Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa

    2015-08-19

    Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog ( www.phosphortholog.com ) that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. Here we describe PhosphOrtholog, a web-based application for mapping known and novel orthologous PTM sites from experimental data obtained from different species. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases. This is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress revealing high mapping and coverage efficiency. 
Although coverage statistics are dataset dependent, PhosphOrtholog increased the number of cross-species mapped sites in all our example data sets by more than double when compared to those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledgebase of functional PTM sites. Moreover, PhosphOrtholog is generic being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
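The core mapping step, carrying a site through a pairwise alignment of orthologous residues, can be sketched roughly as below; the aligned fragments and positions are hypothetical, and the real tool additionally handles orthology assignment and residue-type checks:

```python
def map_site(aln_a, aln_b, site_a):
    """Map a 1-based modification site on sequence A to the aligned
    position on sequence B, given gapped ('-') aligned sequences.
    Returns None if the site falls opposite a gap in B."""
    pos_a = pos_b = 0
    for ca, cb in zip(aln_a, aln_b):
        if ca != '-':
            pos_a += 1
        if cb != '-':
            pos_b += 1
        if ca != '-' and pos_a == site_a:
            return pos_b if cb != '-' else None
    return None

# Hypothetical aligned fragments around a phosphosite
seq_human = "MDS-TPLKES"
seq_mouse = "MDSATP-KES"
print(map_site(seq_human, seq_mouse, 4))  # -> 5: human T4 maps to mouse position 5
print(map_site(seq_human, seq_mouse, 6))  # -> None: L6 aligns to a gap
```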

  13. Integrating Milk Metabolite Profile Information for the Prediction of Traditional Milk Traits Based on SNP Information for Holstein Cows

    PubMed Central

    Melzer, Nina; Wittenburg, Dörte; Repsilber, Dirk

    2013-01-01

    In this study the benefit of metabolome-level analysis for the prediction of genetic values of three traditional milk traits was investigated. Our proposed approach consists of three steps: First, milk metabolite profiles are used to predict three traditional milk traits of 1,305 Holstein cows. Two regression methods, both enabling variable selection, are applied to identify important milk metabolites in this step. Second, the prediction of these important milk metabolites from single nucleotide polymorphisms (SNPs) enables the detection of SNPs with significant genetic effects. Finally, these SNPs are used to predict milk traits. The observed precision of predicted genetic values was compared to the results observed for the classical genotype-phenotype prediction using all SNPs or a reduced SNP subset (reduced classical approach). To enable a comparison between SNP subsets, a special invariable evaluation design was implemented. SNPs close to or within known quantitative trait loci (QTL) were determined. This enabled us to determine whether the detected important SNP subsets were enriched in these regions. The results show that our approach can lead to genetic value prediction while requiring less than 1% of the total of 40,317 SNPs. Moreover, significantly more important SNPs in known QTL regions were detected using our approach compared to the reduced classical approach. Concluding, our approach allows a deeper insight into the associations between the different levels of the genotype-phenotype map (genotype-metabolome, metabolome-phenotype, genotype-phenotype). PMID:23990900

  14. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.

    PubMed

    Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P

    2016-07-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Comparison of bioluminescent kinase assays using substrate depletion and product formation.

    PubMed

    Tanega, Cordelle; Shen, Min; Mott, Bryan T; Thomas, Craig J; MacArthur, Ryan; Inglese, James; Auld, Douglas S

    2009-12-01

    Assays for ATPases have been enabled for high-throughput screening (HTS) by employing firefly luciferase to detect the remaining ATP in the assay. However, for any enzyme assay, measurement of product formation is a more sensitive assay design. Recently, technologies that allow detection of the ADP product from ATPase reactions have been described using fluorescent methods of detection. We describe here the characterization of a bioluminescent assay that employs firefly luciferase in a coupled-enzyme assay format to enable detection of ADP levels from ATPase assays (ADP-Glo, Promega Corp.). We determined the performance of the ADP-Glo assay in 1,536-well microtiter plates using the protein kinase Clk4 and a 1,352 member kinase focused combinatorial library. The ADP-Glo assay was compared to the Clk4 assay performed using a bioluminescence ATP-depletion format (Kinase-Glo, Promega Corp). We performed this analysis using quantitative HTS (qHTS) where we determined potency values for all library members and identified approximately 300 compounds with potencies ranging from as low as 50 nM to >10 microM, yielding a robust dataset for the comparison. Both assay formats showed high performance (Z'-factors approximately 0.9) and showed a similar potency distribution for the actives. We conclude that the bioluminescence ADP detection assay system is a viable generic alternative to the widely used ATP-depletion assay for ATPases and discuss the advantages and disadvantages of both approaches.

  17. Atlas-based analysis of cardiac shape and function: correction of regional shape bias due to imaging protocol for population studies.

    PubMed

    Medrano-Gracia, Pau; Cowan, Brett R; Bluemke, David A; Finn, J Paul; Kadish, Alan H; Lee, Daniel C; Lima, Joao A C; Suinesiaputra, Avan; Young, Alistair A

    2013-09-13

    Cardiovascular imaging studies generate a wealth of data which is typically used only for individual study endpoints. By pooling data from multiple sources, quantitative comparisons can be made of regional wall motion abnormalities between different cohorts, enabling reuse of valuable data. Atlas-based analysis provides precise quantification of shape and motion differences between disease groups and normal subjects. However, subtle shape differences may arise due to differences in imaging protocol between studies. A mathematical model describing regional wall motion and shape was used to establish a coordinate system registered to the cardiac anatomy. The atlas was applied to data contributed to the Cardiac Atlas Project from two independent studies which used different imaging protocols: steady state free precession (SSFP) and gradient recalled echo (GRE) cardiovascular magnetic resonance (CMR). Shape bias due to imaging protocol was corrected using an atlas-based transformation which was generated from a set of 46 volunteers who were imaged with both protocols. Shape bias between GRE and SSFP was regionally variable, and was effectively removed using the atlas-based transformation. Global mass and volume bias was also corrected by this method. Regional shape differences between cohorts were more statistically significant after removing regional artifacts due to imaging protocol bias. Bias arising from imaging protocol can be both global and regional in nature, and is effectively corrected using an atlas-based transformation, enabling direct comparison of regional wall motion abnormalities between cohorts acquired in separate studies.
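The correction step can be illustrated with synthetic data: with both protocols available for the same volunteers, the mean per-point (regional) difference estimates the bias, which is then subtracted. All values below are simulated, not from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated atlas shape coordinates (46 volunteers x 10 points);
# GRE carries a regionally varying bias relative to SSFP
n_subj, n_pts = 46, 10
true_shape = rng.normal(0.0, 1.0, (n_subj, n_pts))
true_bias = np.linspace(0.0, 0.5, n_pts)
ssfp = true_shape + rng.normal(0.0, 0.05, (n_subj, n_pts))
gre = true_shape + true_bias + rng.normal(0.0, 0.05, (n_subj, n_pts))

# Bias estimate from the paired volunteers: mean regional difference
correction = (gre - ssfp).mean(axis=0)

# Applying it removes the protocol bias from GRE-derived shapes
gre_corrected = gre - correction
```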

  18. Physical validation of a patient-specific contact finite element model of the ankle.

    PubMed

    Anderson, Donald D; Goldsworthy, Jane K; Li, Wendy; James Rudert, M; Tochigi, Yuki; Brown, Thomas D

    2007-01-01

    A validation study was conducted to determine the extent to which computational ankle contact finite element (FE) results agreed with experimentally measured tibio-talar contact stress. Two cadaver ankles were loaded in separate test sessions, during which ankle contact stresses were measured with a high-resolution (Tekscan) pressure sensor. Corresponding contact FE analyses were subsequently performed for comparison. The agreement was good between FE-computed and experimentally measured mean (3.2% discrepancy for one ankle, 19.3% for the other) and maximum (1.5% and 6.2%) contact stress, as well as for contact area (1.7% and 14.9%). There was also excellent agreement between histograms of fractional areas of cartilage experiencing specific ranges of contact stress. Finally, point-by-point comparisons between the computed and measured contact stress distributions over the articular surface showed substantial agreement, with correlation coefficients of 90% for one ankle and 86% for the other. In the past, general qualitative, but little direct quantitative agreement has been demonstrated with articular joint contact FE models. The methods used for this validation enable formal comparison of computational and experimental results, and open the way for objective statistical measures of regional correlation between FE-computed contact stress distributions from comparison articular joint surfaces (e.g., those from an intact versus those with residual intra-articular fracture incongruity).
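The point-by-point agreement measure, a correlation coefficient over the articular surface, and the fractional-area histogram can both be sketched with simulated stress values (none of these numbers come from the study):

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated point-by-point contact stresses (MPa) over the surface:
# experimentally measured vs. FE-computed with model error
measured = rng.uniform(0.0, 6.0, 200)
computed = measured + rng.normal(0.0, 0.8, measured.size)

# Point-by-point agreement as a Pearson correlation coefficient
r = np.corrcoef(measured, computed)[0, 1]

# Fractional areas of cartilage in each contact-stress range
edges = np.arange(0.0, 7.0, 1.0)
counts, _ = np.histogram(measured, bins=edges)
frac_area = counts / counts.sum()
```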

  19. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.

  20. Maintenance = reuse-oriented software development

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1989-01-01

    Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.

  1. Enabling comparative gene expression studies of thyroid hormone action through the development of a flexible real-time quantitative PCR assay for use across multiple anuran indicator and sentinel species.

    PubMed

    Veldhoen, Nik; Propper, Catherine R; Helbing, Caren C

    2014-03-01

    Studies performed across diverse frog species have made substantial contributions to our understanding of basic vertebrate development and the natural or anthropogenic environmental factors impacting sensitive life stages. Because anurans are developmental models, provide ecosystem services, and act as sentinels for the identification of environmental chemical contaminants that interfere with thyroid hormone (TH) action during postembryonic development, there is demand for flexible assessment techniques that can be applied to multiple species. As part of the "thyroid assays across indicator and sentinel species" (TAXISS) initiative, we have designed and validated a series of cross-species real-time quantitative PCR (qPCR) primer sets that provide information on transcriptome components in evolutionarily distant anurans. Validation for fifteen gene transcripts involved a rigorous three-tiered quality control within tissue/development-specific contexts. Assay performance was confirmed on multiple tissues (tail fin, liver, brain, and intestine) of Rana catesbeiana and Xenopus laevis tadpoles, enabling comparisons between tissues and generation of response profiles to exogenous TH. This revealed notable differences in TH-responsive gene transcripts including thra, thrb, thibz, klf9, col1a2, fn1, plp1, mmp2, timm50, otc, and dio2, suggesting differential regulation and susceptibility to contaminant effects. Evidence for the applicability of the TAXISS anuran qPCR assay across seven other species is also provided, with five frog families represented, and its utility in defining genome structure was demonstrated. This novel validated approach will enable meaningful comparative studies between frog species and aid in extending knowledge of developmental regulatory pathways and the impact of environmental factors on TH signaling in frog species for which little or no genetic information is currently available. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    ERIC Educational Resources Information Center

    Walters, Charles David

    2017-01-01

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008)…

  3. Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.

    PubMed

    Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A

    2018-01-08

    For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.

  4. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
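One of the condensation steps mentioned, radially averaging a 2D noise-power spectrum to a 1D profile, might look like the sketch below; a flat white-noise ROI stands in for real CT noise, and details such as detrending and ROI ensemble averaging from ICRU Report 87 are omitted:

```python
import numpy as np

def radial_average(nps_2d, n_bins=16):
    """Condense a centered 2D NPS to a radially averaged 1D profile
    by binning spatial-frequency magnitudes."""
    ny, nx = nps_2d.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))
    fx = np.fft.fftshift(np.fft.fftfreq(nx))
    gx, gy = np.meshgrid(fx, fy)
    r = np.hypot(gx, gy)
    edges = np.linspace(0.0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r.ravel(), edges) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=nps_2d.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    return sums / np.maximum(counts, 1)

# NPS of a simulated uncorrelated-noise ROI (flat spectrum expected)
rng = np.random.default_rng(2)
roi = rng.normal(0.0, 1.0, (64, 64))
nps = np.abs(np.fft.fftshift(np.fft.fft2(roi - roi.mean()))) ** 2 / roi.size
profile = radial_average(nps)
```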

  5. Web-Enabled Distributed Health-Care Framework for Automated Malaria Parasite Classification: an E-Health Approach.

    PubMed

    Maity, Maitreya; Dhane, Dhiraj; Mungle, Tushar; Maiti, A K; Chakraborty, Chandan

    2017-10-26

    Web-enabled e-healthcare systems, or computer-assisted disease diagnosis, have the potential to improve the quality and service of conventional healthcare delivery. The article describes the design and development of a web-based distributed healthcare management system for medical information and quantitative evaluation of microscopic images using a machine learning approach for malaria. In the proposed study, all the health-care centres are connected in a distributed computer network. Each peripheral centre manages its own health-care service independently and communicates with the central server for remote assistance. The proposed methodology for automated evaluation of parasites includes pre-processing of blood smear microscopic images followed by erythrocyte segmentation. To differentiate between different parasites, a total of 138 quantitative features characterising colour, morphology, and texture are extracted from segmented erythrocytes. An integrated pattern classification framework is designed where four feature selection methods, viz. Correlation-based Feature Selection (CFS), Chi-square, Information Gain, and RELIEF, are employed with three different classifiers, i.e. Naive Bayes, C4.5, and Instance-Based Learning (IB1), individually. The optimal feature subset with the best classifier is selected to achieve maximum diagnostic precision. The proposed method achieved 99.2% sensitivity and 99.6% specificity by combining CFS and C4.5, outperforming the other methods. Moreover, the web-based tool is entirely designed using open standards such as Java for the web application, ImageJ for image processing, and WEKA for data mining, considering its feasibility in rural places with minimal health-care facilities.

  6. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure for manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., a CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze particularly the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.
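The line-fitting quality measure described, fit parameters plus the residual scatter of edge points as a localization-error estimate, can be sketched with simulated edge points:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated edge points along a straight outline segment,
# perturbed by localization error
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, x.size)

# Least-squares line fit; the residual standard error quantifies
# the segmentation (localization) error of the extracted contour
(a, b), res, *_ = np.polyfit(x, y, 1, full=True)
sigma = float(np.sqrt(res[0] / (x.size - 2)))
```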

  7. Registration of 3D spectral OCT volumes combining ICP with a graph-based approach

    NASA Astrophysics Data System (ADS)

    Niemeijer, Meindert; Lee, Kyungmoo; Garvin, Mona K.; Abràmoff, Michael D.; Sonka, Milan

    2012-02-01

    The introduction of spectral Optical Coherence Tomography (OCT) scanners has enabled acquisition of high-resolution, 3D cross-sectional volumetric images of the retina. 3D-OCT is used to detect and manage eye diseases such as glaucoma and age-related macular degeneration. To follow up patients over time, image registration is a vital tool to enable more precise, quantitative comparison of disease states. In this work we present a 3D registration method based on a two-step approach. In the first step we register both scans in the XY domain using an Iterative Closest Point (ICP) based algorithm. This algorithm is applied to vessel segmentations obtained from the projection image of each scan. The distance minimized in the ICP algorithm includes measurements of the vessel orientation and vessel width to allow for a more robust match. In the second step, a graph-based method is applied to find the optimal translation along the depth axis of the individual A-scans in the volume to match both scans. The cost image used to construct the graph is based on the mean squared error (MSE) between matching A-scans in both images at different translations. We have applied this method to the registration of Optic Nerve Head (ONH) centered 3D-OCT scans of the same patient. First, 10 3D-OCT scans of 5 eyes with glaucoma imaged in vivo were registered for a qualitative evaluation of the algorithm performance. Then, 17 OCT data set pairs of 17 eyes with known deformation were used for quantitative assessment of the method's robustness.
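
    The first registration step can be illustrated with a bare-bones, translation-only 2-D ICP sketch. Unlike the authors' algorithm, this illustration omits the vessel-orientation and vessel-width terms in the distance and matches plain point coordinates; the point sets are invented.

```python
def icp_translation(src, dst, iters=20):
    """Minimal 2-D ICP estimating a pure translation aligning src to dst.
    src, dst: lists of (x, y) points; returns (tx, ty)."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        moved = [(x + tx, y + ty) for x, y in src]
        # Nearest-neighbour correspondences (brute force).
        pairs = []
        for p in moved:
            q = min(dst, key=lambda d: (d[0] - p[0]) ** 2 + (d[1] - p[1]) ** 2)
            pairs.append((p, q))
        # Closed-form translation update: mean residual vector.
        dx = sum(q[0] - p[0] for p, q in pairs) / len(pairs)
        dy = sum(q[1] - p[1] for p, q in pairs) / len(pairs)
        tx, ty = tx + dx, ty + dy
        if abs(dx) < 1e-9 and abs(dy) < 1e-9:
            break
    return tx, ty

# A vessel-like point set and a copy shifted by (2, -1).
src = [(0, 0), (1, 2), (2, 3), (4, 3), (5, 5)]
dst = [(x + 2, y - 1) for x, y in src]
print(icp_translation(src, dst))
```

The paper's variant additionally rotates/scales and weights matches by vessel attributes, which makes the correspondence step far more robust on real retinal data.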

  8. Algorithm-enabled partial-angular-scan configurations for dual-energy CT.

    PubMed

    Chen, Buxin; Zhang, Zheng; Xia, Dan; Sidky, Emil Y; Pan, Xiaochuan

    2018-05-01

    We seek to investigate an optimization-based one-step method for image reconstruction that explicitly compensates for the nonlinear spectral response (i.e., the beam-hardening effect) in dual-energy CT; to investigate the feasibility of the one-step method for enabling two dual-energy partial-angular-scan configurations, referred to as the short- and half-scan configurations, on standard CT scanners without additional hardware; and to investigate the potential of the short- and half-scan configurations for reducing imaging dose and scan time relative to the single-kVp-switch full-scan configuration, in which two full rotations are made to collect dual-energy data. We use the one-step method to reconstruct images directly from dual-energy data by solving a nonconvex optimization program that specifies the images to be reconstructed in dual-energy CT. Dual-energy full-scan data are generated from numerical phantoms and collected from physical phantoms with the standard single-kVp-switch full-scan configuration, whereas dual-energy short- and half-scan data are extracted from the corresponding full-scan data. Besides visual inspection and profile-plot comparison, the reconstructed images are also analyzed in quantitative studies based upon tasks of linear-attenuation-coefficient and material-concentration estimation and of material differentiation. After a computer-simulation study verifying that the one-step method can reconstruct numerically accurate basis and monochromatic images of numerical phantoms, we reconstruct basis and monochromatic images with the one-step method from real data of physical phantoms collected with the full-, short-, and half-scan configurations. Subjective inspection based upon visualization and profile-plot comparison reveals that monochromatic images, which are often used in practical applications, reconstructed from the full-, short-, and half-scan data are largely visually comparable except for some differences in texture details. Moreover, quantitative studies based upon tasks of linear-attenuation-coefficient and material-concentration estimation and of material differentiation indicate that the short- and half-scan configurations yield results in close agreement with the ground-truth information and with those of the full-scan configuration. The one-step method considered can effectively compensate for the nonlinear spectral response in full- and partial-angular-scan dual-energy CT. It can be exploited to enable partial-angular-scan configurations on standard CT scanners without additional hardware. Visual inspection and quantitative studies reveal that, with the one-step method, the partial-angular-scan configurations considered can perform at a level comparable to that of the full-scan configuration, suggesting the potential of the two partial-angular-scan configurations for reducing imaging dose and scan time in standard single-kVp-switch full-scan CT in which two full rotations are performed. The work also yields insights into the investigation and design of other nonstandard scan configurations of potential practical significance in dual-energy CT. © 2018 American Association of Physicists in Medicine.
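
    The nonlinear spectral response (beam hardening) that the one-step method compensates for can be illustrated with a toy polychromatic Beer-Lambert model: the negative log of the detected intensity grows sub-linearly with material thickness, unlike the monochromatic case. The two-bin spectrum and attenuation values below are invented for illustration only.

```python
import math

# Toy two-bin spectrum and energy-dependent attenuation (illustrative values).
spectrum = {60: 0.6, 100: 0.4}       # keV -> normalized fluence weight
mu_water = {60: 0.025, 100: 0.017}   # attenuation in 1/mm at each energy

def log_signal(thickness_mm):
    """Negative log of the polychromatic detected intensity:
    Beer-Lambert attenuation summed over the spectrum."""
    intensity = sum(w * math.exp(-mu_water[E] * thickness_mm)
                    for E, w in spectrum.items())
    return -math.log(intensity)

# A monochromatic beam gives -log(I) exactly linear in thickness; here the
# effective slope drops as the beam hardens (low energies are filtered out).
s50 = log_signal(50.0)
s100 = log_signal(100.0)
print(s100 < 2 * s50)  # sub-linear growth: the beam-hardening signature
```

A one-step reconstruction method in the spirit of the abstract would invert such a nonlinear forward model directly, rather than linearizing the data first.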

  9. Studying the fundamental limit of optical fiber links to the 10^-21 level.

    PubMed

    Xu, Dan; Lee, Won-Kyu; Stefani, Fabio; Lopez, Olivier; Amy-Klein, Anne; Pottie, Paul-Eric

    2018-04-16

    We present a hybrid fiber link combining effective optical frequency transfer with evaluation of performance via a self-synchronized two-way comparison. It enables us to detect the round-trip fiber noise and each of the forward and backward one-way fiber noises simultaneously. The various signals acquired with this setup allow us to study quantitatively several properties of optical fiber links. We check the reciprocity of the noise accumulated forward and backward over a bi-directional fiber to the level of 3.1 (±3.9) × 10^-20, based on 160,000 s of continuous data. We also analyze the noise correlation between two adjacent fibers and show the first experimental evidence of interferometric noise at very low Fourier frequency. We estimate redundantly and consistently the stability and accuracy of the transferred optical frequency over 43 km at the 4 × 10^-21 level after 16 days of integration, and demonstrate that a frequency comparison with instability as low as 8 × 10^-18 would be achievable with uni-directional fibers in an urban area.

  10. The Earthquake‐Source Inversion Validation (SIV) Project

    USGS Publications Warehouse

    Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.

  11. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms that regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR), software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.

  12. Quantitative computer-aided diagnostic algorithm for automated detection of peak lesion attenuation in differentiating clear cell from papillary and chromophobe renal cell carcinoma, oncocytoma, and fat-poor angiomyolipoma on multiphasic multidetector computed tomography.

    PubMed

    Coy, Heidi; Young, Jonathan R; Douek, Michael L; Brown, Matthew S; Sayre, James; Raman, Steven S

    2017-07-01

    To evaluate the performance of a novel, quantitative computer-aided diagnostic (CAD) algorithm on four-phase multidetector computed tomography (MDCT) for detecting peak lesion attenuation, enabling differentiation of clear cell renal cell carcinoma (ccRCC) from chromophobe RCC (chRCC), papillary RCC (pRCC), oncocytoma, and fat-poor angiomyolipoma (fp-AML). We queried our clinical databases to obtain a cohort of histologically proven renal masses with preoperative MDCT in four phases [unenhanced (U), corticomedullary (CM), nephrographic (NP), and excretory (E)]. A whole-lesion 3D contour was obtained in all four phases. The CAD algorithm determined a region of interest (ROI) of peak lesion attenuation within the 3D lesion contour. For comparison, a manual ROI was separately placed in the most enhancing portion of the lesion by visual inspection as a reference standard, and in uninvolved renal cortex. Relative lesion attenuation for both the CAD and manual methods was obtained by normalizing the peak lesion attenuation ROI (and the reference-standard manually placed ROI) to uninvolved renal cortex with the formula [(peak lesion attenuation ROI - cortex ROI)/cortex ROI] × 100%. ROC analysis and the area under the curve (AUC) were used to assess diagnostic performance. Bland-Altman analysis was used to compare peak ROI between the CAD and manual methods. The study cohort comprised 200 patients with 200 unique renal masses: 106 (53%) ccRCCs, 32 (16%) oncocytomas, 18 (9%) chRCCs, 34 (17%) pRCCs, and 10 (5%) fp-AMLs. In the CM phase, the CAD-derived ROI enabled characterization of ccRCC from chRCC, pRCC, oncocytoma, and fp-AML with AUCs of 0.850 (95% CI 0.732-0.968), 0.959 (95% CI 0.930-0.989), 0.792 (95% CI 0.716-0.869), and 0.825 (95% CI 0.703-0.948), respectively. On Bland-Altman analysis, there was excellent agreement between the CAD and manual methods, with mean differences between 14 and 26 HU in each phase.
A novel, quantitative CAD algorithm enabled robust peak HU lesion detection and discrimination of ccRCC from other renal lesions with similar performance compared to the manual method.
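
    The normalization formula quoted in the abstract is straightforward to express in code; the Hounsfield-unit values in the usage lines are hypothetical.

```python
def relative_attenuation(peak_roi_hu, cortex_roi_hu):
    """Lesion attenuation normalized to uninvolved renal cortex, in percent:
    [(peak lesion attenuation ROI - cortex ROI) / cortex ROI] * 100."""
    return (peak_roi_hu - cortex_roi_hu) / cortex_roi_hu * 100.0

# Hypothetical corticomedullary-phase measurements.
print(relative_attenuation(180.0, 200.0))  # -10.0: lesion enhances less than cortex
print(relative_attenuation(300.0, 200.0))  # 50.0: lesion enhances more than cortex
```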

  13. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451

  14. Comparison of three‐dimensional analysis and stereological techniques for quantifying lithium‐ion battery electrode microstructures

    PubMed Central

    Taiwo, Oluwadamilola O.; Finegan, Donal P.; Eastwood, David S.; Fife, Julie L.; Brown, Leon D.; Darr, Jawwad A.; Lee, Peter D.; Brett, Daniel J.L.

    2016-01-01

    Summary Lithium‐ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium‐ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3‐D imaging techniques, quantitative assessment of 3‐D microstructures from 2‐D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two‐dimensional (2‐D) data sets. In this study, stereological prediction and three‐dimensional (3‐D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium‐ion battery electrodes were imaged using synchrotron‐based X‐ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2‐D image sections generated from tomographic imaging, whereas direct 3‐D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2‐D image sections is bound to be associated with ambiguity and that volume‐based 3‐D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially‐dependent parameters, such as tortuosity and pore‐phase connectivity. PMID:26999804

  15. Comparison of three-dimensional analysis and stereological techniques for quantifying lithium-ion battery electrode microstructures.

    PubMed

    Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R

    2016-09-01

    Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
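
    The 2-D-versus-3-D point can be illustrated with the Delesse principle on a tiny synthetic binary volume: individual 2-D sections give scattered area-fraction estimates of the solid phase, while averaging many sections (or counting the full 3-D volume) recovers the volume fraction. The voxel pattern below is invented purely for illustration.

```python
# Tiny synthetic 4x4x4 binary volume: 1 = solid phase, 0 = pore.
volume = [[[1 if (x + y + z) % 3 == 0 else 0 for x in range(4)]
           for y in range(4)] for z in range(4)]

def volume_fraction(vol):
    """Direct 3-D measurement: solid-voxel count over total voxels."""
    flat = [v for plane in vol for row in plane for v in row]
    return sum(flat) / len(flat)

def area_fraction(vol, z):
    """Stereological (Delesse) estimate from a single 2-D section."""
    flat = [v for row in vol[z] for v in row]
    return sum(flat) / len(flat)

vf = volume_fraction(volume)
sections = [area_fraction(volume, z) for z in range(4)]
avg = sum(sections) / len(sections)
print(vf, sections)  # single sections scatter; their mean matches vf
```

Parameters like tortuosity and pore connectivity have no such unbiased 2-D estimator, which is the abstract's argument for volume-based 3-D characterization.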

  16. Advanced forensic validation for human spermatozoa identification using SPERM HY-LITER™ Express with quantitative image analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko

    2017-07-01

    Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful for detecting human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, for lack of evaluation methods, previous studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached, and the range of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. This method enabled objective and specific identification of fluorescent sperm spots and quantitative comparison of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with very low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. These validations provide practical information on the applications of the SPERM HY-LITER™ Express kit that was previously unobtainable. Moreover, the versatility of our image analysis method in various complex cases was demonstrated.

  17. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist, but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. 
The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
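
    As one concrete, deliberately simple example of a benefit and harm comparison metric of the kind the framework discusses, the sketch below computes number needed to treat (NNT) and number needed to harm (NNH) from event rates; the rates are hypothetical, and this illustration is not any one of the 16 reviewed approaches specifically.

```python
def nnt(event_rate_control, event_rate_treated):
    """Number needed to treat: reciprocal of the absolute risk reduction.
    A negative result means the intervention increases the event rate
    (its magnitude is then a number needed to harm)."""
    absolute_risk_reduction = event_rate_control - event_rate_treated
    return 1.0 / absolute_risk_reduction

# Hypothetical rates for one benefit outcome and one harm outcome.
benefit_nnt = nnt(0.20, 0.10)  # ~10 patients treated per event prevented
harm_nnh = nnt(0.02, 0.07)     # negative: harm outcome is more frequent on treatment
print(benefit_nnt, abs(harm_nnh))
```

Comparing |NNH| against NNT is one way to put a benefit and a harm on a single scale, though, as the abstract notes, most such metrics come without uncertainty measures unless these are derived explicitly.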

  18. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist, but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. 
Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976

  19. Characterization of articular cartilage by combining microscopic analysis with a fibril-reinforced finite-element model.

    PubMed

    Julkunen, Petro; Kiviranta, Panu; Wilson, Wouter; Jurvelin, Jukka S; Korhonen, Rami K

    2007-01-01

    Load-bearing characteristics of articular cartilage are impaired during tissue degeneration. Quantitative microscopy enables in vitro investigation of cartilage structure but determination of tissue functional properties necessitates experimental mechanical testing. The fibril-reinforced poroviscoelastic (FRPVE) model has been used successfully for estimation of cartilage mechanical properties. The model includes realistic collagen network architecture, as shown by microscopic imaging techniques. The aim of the present study was to investigate the relationships between the cartilage proteoglycan (PG) and collagen content as assessed by quantitative microscopic findings, and model-based mechanical parameters of the tissue. Site-specific variation of the collagen network moduli, PG matrix modulus and permeability was analyzed. Cylindrical cartilage samples (n=22) were harvested from various sites of the bovine knee and shoulder joints. Collagen orientation, as quantitated by polarized light microscopy, was incorporated into the finite-element model. Stepwise stress-relaxation experiments in unconfined compression were conducted for the samples, and sample-specific models were fitted to the experimental data in order to determine values of the model parameters. For comparison, Fourier transform infrared imaging and digital densitometry were used for the determination of collagen and PG content in the same samples, respectively. The initial and strain-dependent fibril network moduli as well as the initial permeability correlated significantly with the tissue collagen content. The equilibrium Young's modulus of the nonfibrillar matrix and the strain dependency of permeability were significantly associated with the tissue PG content. The present study demonstrates that modern quantitative microscopic methods in combination with the FRPVE model are feasible methods to characterize the structure-function relationships of articular cartilage.

  20. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    PubMed

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Functional ankle instability as a risk factor for osteoarthritis: using T2-mapping to analyze early cartilage degeneration in the ankle joint of young athletes.

    PubMed

    Golditz, T; Steib, S; Pfeifer, K; Uder, M; Gelse, K; Janka, R; Hennig, F F; Welsch, G H

    2014-10-01

    The aim of this study was to investigate, using T2-mapping, the impact of functional instability in the ankle joint on the development of early cartilage damage. Ethical approval for this study was provided. Thirty-six volunteers from the university sports program were divided into three groups according to their ankle status: functional ankle instability (FAI, initial ankle sprain with residual instability); ankle sprain Copers (initial sprain, without residual instability); and controls (without a history of ankle injuries). Quantitative T2-mapping magnetic resonance imaging (MRI) was performed at the beginning ('early-unloading') and at the end ('late-unloading') of the MR examination, with a mean time span of 27 min. Zonal region-of-interest T2-mapping was performed on the talar and tibial cartilage in the deep and superficial layers. Inter-group comparisons of T2-values were analyzed using paired and unpaired t-tests, and analysis of variance was performed. T2-values showed significant to highly significant differences in 11 of 12 regions across the groups. In early-unloading, the FAI group showed a significant increase in quantitative T2-values in the medial talar regions (P = 0.008, P = 0.027), whereas the Coper group showed this enhancement in the central-lateral regions (P = 0.05). In particular, the comparison of early-unloading to late-unloading values revealed significantly decreasing T2-values laterally and significantly increasing T2-values medially over time in the FAI group, neither of which was present in the Coper or control group. Functional instability causes unbalanced loading in the ankle joint, resulting in cartilage alterations as assessed by quantitative T2-mapping. This approach can visualize and localize early cartilage abnormalities, possibly enabling specific treatment options to prevent osteoarthritis in young athletes. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  2. Physics-Based Image Segmentation Using First Order Statistical Properties and Genetic Algorithm for Inductive Thermography Imaging.

    PubMed

    Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun

    2018-05-01

    Thermographic inspection has been widely applied to non-destructive testing and evaluation, offering rapid, contactless detection over large surface areas. Image segmentation is essential for identifying and sizing defects. To attain high-level performance, physics-based models that describe defect generation and enable precise extraction of the target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns via an unsupervised feature extraction algorithm, avoiding the issues associated with laborious manual selection of specific thermal video frames for processing. An internal genetic functionality built into the algorithm automatically controls the segmentation threshold to render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography is implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index, the F-score, has been adopted to objectively evaluate the performance of different segmentation algorithms.

  3. Diffusion blotting: a rapid and simple method for production of multiple blots from a single gel.

    PubMed

    Olsen, Ingrid; Wiker, Harald G

    2015-01-01

    A very simple and fast method for diffusion blotting of proteins from precast SDS-PAGE gels on a solid plastic support was developed. Diffusion blotting for 3 min gives a quantitative transfer of 10 % relative to 1-h electroblotting. For each subsequent blot from the same gel, a doubling of transfer time is necessary to obtain the same amount of protein on each blot. High- and low-molecular-weight components are transferred equally efficiently compared with electroblotting, although both methods give a higher total transfer of low-molecular-weight proteins than of large proteins. The greatest advantage of diffusion blotting is that several blots can be made from each lane, enabling multiple antisera to be tested on virtually identical blots. The gel remains on the plastic support, which prevents it from stretching or shrinking; this ensures identical blots and facilitates more reliable molecular weight determination. Furthermore, the proteins remaining in the gel can be stained with Coomassie Brilliant Blue or other methods for exact and easy comparison with the developed blots. These advantages make diffusion blotting the method of choice when quantitative protein transfer is not required.
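
    The doubling rule stated above works out to a simple geometric schedule of transfer times:

```python
# Transfer-time schedule implied by the doubling rule: to deposit the same
# amount of protein on each successive blot, the diffusion time doubles.
first_blot_min = 3  # minutes for the first blot, as in the abstract
times = [first_blot_min * 2**k for k in range(4)]
print(times)  # → [3, 6, 12, 24] minutes for blots 1-4
```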

  4. The Focinator v2-0 - Graphical Interface, Four Channels, Colocalization Analysis and Cell Phase Identification.

    PubMed

    Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena

    2017-07-01

    The quantitative analysis of foci plays an important role in various cell biological methods. In radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by analyzing protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly thanks to a graphical interface and further features. We included an R-script-based mode for automated image opening, file naming, progress monitoring and error reporting, so that the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and to evaluate protein-protein colocalization by comparing up to three foci channels. This enables, for example, the quantification of foci in cells in a specific cell cycle phase.

  5. Automated Detection of Electroencephalography Artifacts in Human, Rodent and Canine Subjects using Machine Learning.

    PubMed

    Levitt, Joshua; Nitenson, Adam; Koyama, Suguru; Heijmans, Lonne; Curry, James; Ross, Jason T; Kamerling, Steven; Saab, Carl Y

    2018-06-23

    Electroencephalography (EEG) invariably contains extra-cranial artifacts that are commonly dealt with based on qualitative and subjective criteria. Failure to account for EEG artifacts compromises data interpretation. We have developed a quantitative and automated support vector machine (SVM)-based algorithm to accurately classify artifactual EEG epochs in awake rodent, canine and human subjects. An embodiment of this method also enables the determination of 'eyes open/closed' states in human subjects. SVM accuracy for artifact classification in humans, Sprague Dawley rats and beagle dogs was 94.17%, 83.68%, and 85.37%, respectively, whereas 'eyes open/closed' states in humans were labeled with 88.60% accuracy. Each of these results was significantly higher than chance. Comparison with Existing Methods: Other existing methods, such as those dependent on Independent Component Analysis, have not been tested in non-human subjects and require full EEG montages, whereas this method requires only single channels. We conclude that our EEG artifact detection algorithm provides a valid and practical solution to a common problem in the quantitative analysis and assessment of EEG in pre-clinical research settings across evolutionary spectra. Copyright © 2018. Published by Elsevier B.V.
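
    The paper's actual features, kernel and training data are not given here; the sketch below only shows the general shape of such an SVM epoch classifier, using synthetic "epochs" and generic amplitude features with scikit-learn (all signals and parameters are illustrative assumptions):

```python
# Hedged sketch of SVM-based EEG artifact classification on synthetic data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs, n = 250, 400  # samples per 1-s epoch, epochs per class

# Clean epochs: low-amplitude noise; artifact epochs carry a large transient.
clean = rng.normal(0, 1.0, size=(n, fs))
artifact = rng.normal(0, 1.0, size=(n, fs))
artifact[:, 100:130] += rng.normal(0, 8.0, size=(n, 30))  # movement-like burst

def features(epochs):
    # Simple per-epoch amplitude features: variance and line length.
    var = epochs.var(axis=1)
    line_len = np.abs(np.diff(epochs, axis=1)).sum(axis=1)
    return np.column_stack([var, line_len])

X = np.vstack([features(clean), features(artifact)])
y = np.array([0] * n + [1] * n)  # 0 = clean, 1 = artifact
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"artifact classification accuracy: {acc:.2f}")
```

    Note that a single channel suffices here, mirroring the single-channel advantage claimed in the abstract.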

  6. Simultaneous Quantification of Multiple Alternatively Spliced mRNA Transcripts Using Droplet Digital PCR.

    PubMed

    Sun, Bing; Zheng, Yun-Ling

    2018-01-01

    Currently there is no sensitive, precise, and reproducible method to quantitate alternative splicing of mRNA transcripts. Droplet digital™ PCR (ddPCR™) analysis allows for accurate digital counting for quantification of gene expression. Human telomerase reverse transcriptase (hTERT) is one of the essential components required for telomerase activity and for the maintenance of telomeres. Several alternatively spliced forms of hTERT mRNA in human primary and tumor cells have been reported in the literature. Using one pair of primers and two probes for hTERT, four alternatively spliced forms of hTERT (α-/β+, α+/β- single deletions, α-/β- double deletion, and nondeletion α+/β+) were accurately quantified through a novel analysis method via data collected from a single ddPCR reaction. In this chapter, we describe this ddPCR method that enables direct quantitative comparison of four alternatively spliced forms of the hTERT messenger RNA without the need for internal standards or multiple pairs of primers specific for each variant, eliminating the technical variation due to differential PCR amplification efficiency for different amplicons and the challenges of quantification using standard curves. This simple and straightforward method should have general utility for quantifying alternatively spliced gene transcripts.
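
    The four-variant hTERT analysis itself is not reproduced here; the sketch below shows only the standard Poisson correction that underlies all droplet digital PCR quantification. The droplet counts and the 0.85 nL droplet volume are illustrative assumptions:

```python
# Standard ddPCR Poisson quantification: the fraction of negative droplets
# estimates e^(-lambda), where lambda is the mean copies per droplet.
import math

def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    # Each droplet is an independent partition of the reaction.
    lam = -math.log((n_total - n_positive) / n_total)
    return lam / (droplet_volume_nl * 1e-3)  # copies per microlitre

conc = ddpcr_copies_per_ul(n_positive=4200, n_total=15000)
print(f"{conc:.0f} copies/uL")
```

    Because the count is digital, no standard curve is needed, which is the property the chapter exploits for direct variant-to-variant comparison.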

  7. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    PubMed

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
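
    A toy version of the core idea (not the authors' implementation) can be sketched with orthogonal matching pursuit: a subject patch is expressed as a sparse combination of atlas patches, and tissue memberships are accumulated from the selected atoms. Dictionary contents, sizes and labels below are all invented for illustration:

```python
# Sparse coding of one subject patch against an "atlas" patch dictionary.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
patch_dim, n_atoms = 27, 40                  # e.g. flattened 3x3x3 patches
dictionary = rng.normal(size=(patch_dim, n_atoms))
dictionary /= np.linalg.norm(dictionary, axis=0)  # unit-norm atlas patches
atom_tissue = rng.integers(0, 3, size=n_atoms)    # tissue label per atom

# Subject patch built from two known atoms plus a little noise.
subject_patch = 0.8 * dictionary[:, 5] + 0.5 * dictionary[:, 17]
subject_patch += rng.normal(scale=0.01, size=patch_dim)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3, fit_intercept=False)
omp.fit(dictionary, subject_patch)
coefs = np.abs(omp.coef_)

# Soft tissue membership: accumulate |coefficients| per tissue class.
membership = np.zeros(3)
for k in range(n_atoms):
    membership[atom_tissue[k]] += coefs[k]
membership /= membership.sum()
print(membership)
```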

  8. Fast perfusion measurements in rat skeletal muscle at rest and during exercise with single-voxel FAIR (flow-sensitive alternating inversion recovery).

    PubMed

    Pohmann, Rolf; Künnecke, Basil; Fingerle, Jürgen; von Kienlin, Markus

    2006-01-01

    Non-invasive measurement of perfusion in skeletal muscle by in vivo magnetic resonance remains a challenge due to its low level and the correspondingly low signal-to-noise ratio. Accurate, quantitative, and time-resolved perfusion measurements in the leg muscle therefore require a technique with high sensitivity. By combining a flow-sensitive alternating inversion recovery (FAIR) sequence with a single-voxel readout, we have developed a new technique to measure perfusion in the rat gastrocnemius muscle at rest, yielding an average value of 19.4 +/- 4.8 mL/100 g/min (n = 22). In additional experiments, perfusion changes were elicited by acute ischemia and reperfusion, or by exercise induced by electrical, noninvasive muscle stimulation of varying duration and intensity. The perfusion time courses during these manipulations were measured with a temporal resolution of 2.2 min, showing increases in perfusion by a factor of up to 2.5. In a direct comparison, the results agreed closely with microsphere measurements in the same animals. This quantitative and noninvasive method can significantly facilitate the investigation of atherosclerotic diseases and the examination of drug efficacy.

  9. Retrieval of complex χ(2) parts for quantitative analysis of sum-frequency generation intensity spectra

    PubMed Central

    Hofmann, Matthias J.; Koelsch, Patrick

    2015-01-01

    Vibrational sum-frequency generation (SFG) spectroscopy has become an established technique for in situ surface analysis. While spectral recording procedures and hardware have been optimized, unique data analysis routines have yet to be established. The SFG intensity is related to probing geometries and properties of the system under investigation such as the absolute square of the second-order susceptibility, |χ(2)|². A conventional SFG intensity measurement does not grant access to the complex parts of χ(2) unless further assumptions have been made. It is therefore difficult, sometimes impossible, to establish a unique fitting solution for SFG intensity spectra. Recently, interferometric phase-sensitive SFG or heterodyne detection methods have been introduced to measure real and imaginary parts of χ(2) experimentally. Here, we demonstrate that iterative phase-matching between complex spectra retrieved from maximum entropy method analysis and fitting of intensity SFG spectra (iMEMfit) leads to a unique solution for the complex parts of χ(2) and enables quantitative analysis of SFG intensity spectra. A comparison between complex parts retrieved by iMEMfit applied to intensity spectra and phase sensitive experimental data shows excellent agreement between the two methods. PMID:26450297

  10. Target-based drug discovery for β-globin disorders: drug target prediction using quantitative modeling with hybrid functional Petri nets.

    PubMed

    Mehraei, Mani; Bashirov, Rza; Tüzmen, Şükrü

    2016-10-01

    Recent molecular studies provide important clues into the treatment of β-thalassemia, sickle-cell anaemia and other β-globin disorders, revealing that increased production of fetal hemoglobin, which is normally suppressed in adulthood, can ameliorate the severity of these diseases. In this paper, we present a novel approach for drug prediction for β-globin disorders. Our approach is centered upon quantitative modeling of interactions in the human fetal-to-adult hemoglobin switch network using hybrid functional Petri nets. In accordance with the reverse pharmacology approach, we pose a hypothesis regarding modulation of specific protein targets that induce γ-globin and consequently fetal hemoglobin. Comparison of simulation results for the proposed strategy with those obtained for existing drugs shows that our strategy is optimal, as it leads to the highest level of γ-globin induction and thereby has potential beneficial therapeutic effects on β-globin disorders. Simulation results enable verification of model coherence, demonstrating that it is consistent with qPCR data available for known strategies and/or drugs.

  11. Quantitative evaluation of the matrix effect in bioanalytical methods based on LC-MS: A comparison of two approaches.

    PubMed

    Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna

    2018-06-05

    Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for quantitating the matrix effect. The CVs(%) of internal-standard-normalized matrix factors recommended by the European Medicines Agency were evaluated against the internal-standard-normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We have tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences between the results of the two calculation methods. After normalization with the internal standard, the CV(%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency thus seems slightly more conservative for the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
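
    The EMA-style calculation described above reduces to a short computation: matrix factors from post-extraction spiked samples versus neat solutions, normalized by the internal standard, then their CV(%). The peak areas below are invented for illustration:

```python
# IS-normalised matrix factors and their CV(%) across matrix lots.
import numpy as np

# Peak areas for 6 matrix lots (post-extraction spiked) and neat solution.
analyte_matrix = np.array([9800, 10150, 9600, 10400, 9950, 10100])
is_matrix      = np.array([19500, 20100, 19200, 20600, 19800, 20000])
analyte_neat, is_neat = 10000.0, 20000.0

mf_analyte = analyte_matrix / analyte_neat   # analyte matrix factors
mf_is = is_matrix / is_neat                  # internal-standard matrix factors
is_normalised_mf = mf_analyte / mf_is

cv_percent = 100 * is_normalised_mf.std(ddof=1) / is_normalised_mf.mean()
print(f"CV of IS-normalised matrix factor: {cv_percent:.1f}%")
```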

  12. PeptideDepot: flexible relational database for visual analysis of quantitative proteomic data and integration of existing protein information.

    PubMed

    Yu, Kebing; Salomon, Arthur R

    2009-12-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.

  13. TH-AB-209-09: Quantitative Imaging of Electrical Conductivity by VHF-Induced Thermoacoustics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patch, S; Hull, D; See, W

    Purpose: To demonstrate that very high frequency (VHF) induced thermoacoustics has the potential to provide quantitative images of electrical conductivity in Siemens/meter, much as shear wave elastography provides tissue stiffness in kPa. Quantitatively imaging a large organ requires exciting thermoacoustic pulses throughout the volume and broadband detection of those pulses, because tomographic image reconstruction preserves frequency content. Applying the half-wavelength limit to a 200-micron inclusion inside a 7.5 cm diameter organ requires measurement sensitivity to frequencies ranging from 4 MHz down to 10 kHz, respectively. VHF irradiation provides superior depth penetration over the near infrared used in photoacoustics. Additionally, VHF signal production is proportional to electrical conductivity, and prostate cancer is known to suppress the electrical conductivity of prostatic fluid. Methods: A dual-transducer system utilizing a P4-1 array connected to a Verasonics V1 system, augmented by a lower-frequency focused single-element transducer, was developed. Simultaneous acquisition of VHF-induced thermoacoustic pulses by both transducers enabled comparison of transducer performance. Data from the clinical array generated a stack of 96 images with a separation of 0.3 mm, whereas the single-element transducer imaged only a single plane. In-plane resolution and quantitative accuracy were measured at isocenter. Results: The array provided volumetric imaging capability with superior resolution, whereas the single-element transducer provided superior quantitative accuracy. Combining axial images from both transducers preserved the resolution of the P4-1 array and improved image contrast. Neither transducer was sensitive to frequencies below 50 kHz, resulting in a DC offset and low-frequency shading over fields of view exceeding 15 mm. Fresh human prostates were imaged ex vivo, and volumetric reconstructions reveal structures rarely seen in diagnostic images.
Conclusion: Quantitative whole-organ thermoacoustic tomography will be feasible by sparsely interspersing transducer elements sensitive to the low end of the ultrasonic range.
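
    The frequency range quoted in the abstract follows from the half-wavelength limit f = c / (2d). Assuming a nominal soft-tissue sound speed of about 1540 m/s (our assumption, not stated in the record), the arithmetic reproduces the 4 MHz and 10 kHz figures:

```python
# Half-wavelength limit: a feature of size d requires frequencies up to c/(2d).
c = 1540.0  # assumed soft-tissue sound speed, m/s

def half_wavelength_freq(d_m):
    return c / (2 * d_m)

f_inclusion = half_wavelength_freq(200e-6)  # 200-micron inclusion
f_organ = half_wavelength_freq(7.5e-2)      # 7.5 cm organ diameter
print(f"{f_inclusion/1e6:.2f} MHz, {f_organ/1e3:.2f} kHz")
```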

  14. Photo ion spectrometer

    DOEpatents

    Gruen, D.M.; Young, C.E.; Pellin, M.J.

    1989-12-26

    A charged particle spectrometer is described for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode. 12 figs.

  15. Who Is Doing the Housework in Multicultural Britain?

    PubMed Central

    Kan, Man-Yee; Laurie, Heather

    2016-01-01

    There is an extensive literature on the domestic division of labour within married and cohabiting couples and its relationship to gender equality within the household and the labour market. Most UK research focuses on the white majority population or is ethnicity ‘blind’, effectively ignoring potentially significant intersections between gender, ethnicity, socio-economic position and domestic labour. Quantitative empirical research on the domestic division of labour across ethnic groups has not been possible due to a lack of data that enables disaggregation by ethnic group. We address this gap using data from a nationally representative panel survey, Understanding Society, the UK Household Longitudinal Study containing sufficient sample sizes of ethnic minority groups for meaningful comparisons. We find significant variations in patterns of domestic labour by ethnic group, gender, education and employment status after controlling for individual and household characteristics. PMID:29416186

  16. Noise and contrast comparison of visual and infrared images of hazards as seen inside an automobile

    NASA Astrophysics Data System (ADS)

    Meitzler, Thomas J.; Bryk, Darryl; Sohn, Eui J.; Lane, Kimberly; Bednarz, David; Jusela, Daniel; Ebenstein, Samuel; Smith, Gregory H.; Rodin, Yelena; Rankin, James S., II; Samman, Amer M.

    2000-06-01

    The purpose of this experiment was to quantitatively measure driver performance in detecting potential road hazards in visual and infrared (IR) imagery of road scenes containing varying combinations of contrast and noise. This pilot test is a first step toward comparing various IR and visual sensors and displays for an enhanced vision system inside the driver compartment. The visible and IR road imagery obtained was displayed on a large screen and on a PC monitor, and subject response times were recorded. Based on the response times, detection probabilities were computed and compared to the known time of occurrence of each driving hazard. The goal was to see which combinations of sensor, contrast and noise enable subjects to achieve a higher detection probability for potential driving hazards.

  17. Imaging Atomic-Scale Clustering in III–V Semiconductor Alloys

    DOE PAGES

    Hirst, Louise C.; Kotulak, Nicole A.; Tomasulo, Stephanie; ...

    2017-03-13

    Quaternary alloys are essential for the development of high-performance optoelectronic devices. However, immiscibility of the constituent elements can make these materials vulnerable to phase segregation, which degrades the optical and electrical properties of the solid. High-efficiency III–V photovoltaic cells are particularly sensitive to this degradation. InAlAsSb lattice matched to InP is a promising candidate material for high-bandgap subcells of a multijunction photovoltaic device. However, previous studies of this material have identified characteristic signatures of compositional variation, including anomalous low-energy photoluminescence. In this paper, atomic-scale clustering is observed in InAlAsSb via quantitative scanning transmission electron microscopy. Finally, image quantification of atomic column intensity ratios enables the comparison with simulated images, confirming the presence of nonrandom compositional variation in this multispecies alloy.

  18. A novel approach for the quantitation of carbohydrates in mash, wort, and beer with RP-HPLC using 1-naphthylamine for precolumn derivatization.

    PubMed

    Rakete, Stefan; Glomb, Marcus A

    2013-04-24

    A novel universal method for the determination of reducing mono-, di-, and oligosaccharides in complex matrices on RP-HPLC using 1-naphthylamine for precolumn derivatization with sodium cyanoborhydride was established to study changes in the carbohydrate profile during beer brewing. Fluorescence and mass spectrometric detection enabled very sensitive analyses of beer-relevant carbohydrates. Mass spectrometry additionally allowed the identification of the molecular weight and thereby the degree of polymerization of unknown carbohydrates. Thus, carbohydrates with up to 16 glucose units were detected. Comparison demonstrated that the novel method was superior to fluorophore-assisted carbohydrate electrophoresis (FACE). The results proved the HPLC method clearly to be more powerful in regard to sensitivity and resolution. Analogous to FACE, this method was designated fluorophore-assisted carbohydrate HPLC (FAC-HPLC).

  19. Simulating observations with HARMONI: the integral field spectrograph for the European Extremely Large Telescope

    NASA Astrophysics Data System (ADS)

    Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark

    2014-07-01

    With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.

  20. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility to both automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.
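
    The use of the confidence factor as a scaling factor in false-color imagery, described above, amounts to a per-pixel blend. The sketch below is our own minimal construction, not the operational DEBRA code; the image, confidence field and highlight color are invented:

```python
# Blend a dust-highlight colour into a background RGB image, weighted by a
# [0, 1] detection confidence field.
import numpy as np

rng = np.random.default_rng(1)
h, w = 4, 4
background = rng.uniform(0.2, 0.8, size=(h, w, 3))  # false-colour background
confidence = np.zeros((h, w))
confidence[1:3, 1:3] = 0.9                          # "detected dust" pixels
dust_colour = np.array([1.0, 0.8, 0.2])             # highlight for dust

# Per-pixel linear blend: full confidence shows pure highlight colour,
# zero confidence leaves the background untouched.
alpha = confidence[..., None]
enhanced = (1 - alpha) * background + alpha * dust_colour
print(enhanced.shape)
```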

  1. Chemometric analysis of correlations between electronic absorption characteristics and structural and/or physicochemical parameters for ampholytic substances of biological and pharmaceutical relevance.

    PubMed

    Judycka-Proma, U; Bober, L; Gajewicz, A; Puzyn, T; Błażejowski, J

    2015-03-05

    Forty ampholytic compounds of biological and pharmaceutical relevance were subjected to chemometric analysis based on unsupervised and supervised learning algorithms. This enabled relations to be found between empirical spectral characteristics derived from electronic absorption data and structural and physicochemical parameters predicted by quantum chemistry methods or phenomenological relationships based on additivity rules. It was found that the energies of long wavelength absorption bands are correlated through multiparametric linear relationships with parameters reflecting the bulkiness features of the absorbing molecules as well as their nucleophilicity and electrophilicity. These dependences enable the quantitative analysis of spectral features of the compounds, as well as a comparison of their similarities and certain pharmaceutical and biological features. Three QSPR models to predict the energies of long-wavelength absorption in buffers with pH=2.5 and pH=7.0, as well as in methanol, were developed and validated in this study. These models can be further used to predict the long-wavelength absorption energies of untested substances (if they are structurally similar to the training compounds). Copyright © 2014 Elsevier B.V. All rights reserved.

  2. [MRI of focal liver lesions using a 1.5-T turbo-spin-echo technique compared with the spin-echo technique].

    PubMed

    Steiner, S; Vogl, T J; Fischer, P; Steger, W; Neuhaus, P; Keck, H

    1995-08-01

    The aim of our study was to evaluate a T2-weighted turbo spin-echo (TSE) sequence in comparison with a T2-weighted spin-echo (SE) sequence for imaging focal liver lesions. In our study, 35 patients with suspected focal liver lesions were examined. The standardised imaging protocol included a conventional T2-weighted SE sequence (TR/TE = 2000/90/45, acquisition time 10:20 min) as well as a T2-weighted TSE sequence (TR/TE = 4700/90, acquisition time 6:33 min). Calculation of the S/N and C/N ratios as the basis of quantitative evaluation was done using standard methods, and a diagnostic score was implemented to enable qualitative assessment. In 7% of patients (n = 2), the TSE sequence enabled the detection of additional liver lesions less than 1 cm in diameter. In the comparison of anatomical details, the TSE sequence was superior. The S/N and C/N ratios of anatomic and pathologic structures were higher for the TSE sequence than for the SE sequence. Our results indicate that the T2-weighted turbo spin-echo sequence is well suited for imaging focal liver lesions and reduces imaging time.
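
    The S/N and C/N ratios mentioned above have standard definitions: mean ROI signal divided by the noise standard deviation, and signal difference between two tissues divided by the noise standard deviation. The sketch below uses invented ROI values, not the study's measurements:

```python
# Standard S/N and C/N ratio calculation from ROI statistics.
import numpy as np

rng = np.random.default_rng(2)
lesion = rng.normal(300, 20, size=100)  # pixel samples in a focal lesion ROI
liver = rng.normal(180, 20, size=100)   # pixel samples in normal liver ROI
noise_sd = 15.0                         # noise SD from a background (air) ROI

snr_lesion = lesion.mean() / noise_sd
cnr = (lesion.mean() - liver.mean()) / noise_sd
print(f"S/N = {snr_lesion:.1f}, C/N = {cnr:.1f}")
```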

  3. Drift mobility of photo-electrons in organic molecular crystals: Quantitative comparison between theory and experiment

    NASA Astrophysics Data System (ADS)

    Reineker, P.; Kenkre, V. M.; Kühne, R.

    1981-08-01

    A quantitative comparison is given of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, with the experiments in naphthalene by Schein et al. and Karl et al.

  4. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    ERIC Educational Resources Information Center

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided-inquiry (GI) experimental approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  5. Label-free quantitative cell division monitoring of endothelial cells by digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Bauwens, Andreas; Vollmer, Angelika; Ketelhut, Steffi; Langehanenberg, Patrik; Müthing, Johannes; Karch, Helge; von Bally, Gert

    2010-05-01

    Digital holographic microscopy (DHM) enables quantitative multifocus phase contrast imaging for nondestructive technical inspection and live cell analysis. Time-lapse investigations on human brain microvascular endothelial cells demonstrate the use of DHM for label-free dynamic quantitative monitoring of cell division of mother cells into daughter cells. Cytokinetic DHM analysis provides future applications in toxicology and cancer research.

  6. Two-Photon Flow Cytometry

    NASA Technical Reports Server (NTRS)

    Zhog, Cheng Frank; Ye, Jing Yong; Norris, Theodore B.; Myc, Andrzej; Cao, Zhengyl; Bielinska, Anna; Thomas, Thommey; Baker, James R., Jr.

    2004-01-01

    Flow cytometry is a powerful technique for obtaining quantitative information from fluorescence in cells. Quantitation is achieved by assuring a high degree of uniformity in the optical excitation and detection, generally by using a highly controlled flow such as is obtained via hydrodynamic focusing. In this work, we demonstrate a two-beam, two-channel detection and two-photon excitation flow cytometry (T³FC) system that enables multi-dye analysis to be performed very simply, with greatly relaxed requirements on the fluid flow. Two-photon excitation using a femtosecond near-infrared (NIR) laser has the advantages that it enables simultaneous excitation of multiple dyes and achieves a very high signal-to-noise ratio through simplified filtering and fluorescence background reduction. By matching the excitation volume to the size of a cell, single-cell detection is ensured. Labeling of cells by targeted nanoparticles with multiple fluorophores enables normalization of the fluorescence signal and thus ratiometric measurements under nonuniform excitation. Quantitative size measurements can also be made, even under conditions of nonuniform flow, via the two-beam layout. This innovative detection scheme not only considerably simplifies the fluid flow system and the excitation and collection optics, it opens the way to quantitative cytometry in simple and compact microfluidic systems, or in vivo. Real-time detection of fluorescent microbeads in the vasculature of the mouse ear demonstrates the ability to do flow cytometry in vivo. The conditions required to perform quantitative in vivo cytometry on labeled cells are presented.
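The ratiometric normalization mentioned above works because the target and reference fluorophores on the same particle see the same excitation intensity, which cancels in the ratio; a toy sketch with invented photon counts:

```python
def ratiometric(target_counts, reference_counts):
    # the (unknown) local excitation intensity multiplies both channels
    # equally, so it cancels in the target/reference ratio
    return target_counts / reference_counts

# same labeled cell measured at two positions in a nonuniform beam:
# excitation differs by 2x, but the ratio is unchanged
print(ratiometric(1000.0, 500.0))   # 2.0
print(ratiometric(2000.0, 1000.0))  # 2.0
```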

  7. Nanoelectronics enabled chronic multimodal neural platform in a mouse ischemic model

    PubMed Central

    Luan, Lan; Sullender, Colin T.; Li, Xue; Zhao, Zhengtuo; Zhu, Hanlin; Wei, Xiaoling; Xie, Chong; Dunn, Andrew K.

    2018-01-01

    Background: Despite significant advances in optical imaging techniques for mapping hemodynamics in small animal models, it remains challenging to combine imaging with spatially resolved electrical recording of individual neurons, especially in longitudinal studies. This is largely due to the invasiveness of penetrating electrodes in the living brain and their limited compatibility with longitudinal imaging. New method: We implant arrays of ultraflexible nanoelectronic threads (NETs) in mice for neural recording both at the brain surface and intracortically, which maintain great tissue compatibility chronically. By mounting a cranial window atop the NET arrays to allow chronic optical access, we establish a multimodal platform that combines spatially resolved electrical recording of neural activity with laser speckle contrast imaging (LSCI) of cerebral blood flow (CBF) for longitudinal studies. Results: We induce peri-infarct depolarizations (PIDs) by targeted photothrombosis and show the ability to detect their occurrence and propagation through spatiotemporal variations in both extracellular potentials and CBF. We also demonstrate chronic tracking of single-unit neural activity and CBF over days after photothrombosis, from which we observe reperfusion and increased firing rates. Comparison with existing method(s): This multimodal platform enables simultaneous mapping of neural activity and hemodynamic parameters at the microscale for quantitative, longitudinal comparisons with minimal perturbation to the baseline neurophysiology. Conclusion: The ability to spatiotemporally resolve and chronically track CBF and neural electrical activity in the same living brain region has broad applications for studying the interplay between neural and hemodynamic responses in health and in cerebrovascular and neurological pathologies. PMID:29203409

  8. A review of job-exposure matrix methodology for application to workers exposed to radiation from internally deposited plutonium or other radioactive materials.

    PubMed

    Liu, Hanhua; Wakeford, Richard; Riddell, Anthony; O'Hagan, Jacqueline; MacGregor, David; Agius, Raymond; Wilson, Christine; Peace, Mark; de Vocht, Frank

    2016-03-01

    Any potential health effects of radiation emitted from radionuclides deposited in the bodies of workers exposed to radioactive materials can be directly investigated through epidemiological studies. However, estimates of radionuclide exposure and consequent tissue-specific doses, particularly for early workers for whom monitoring was relatively crude but exposures tended to be highest, can be uncertain, limiting the accuracy of risk estimates. We review the use of job-exposure matrices (JEMs) in peer-reviewed epidemiological and exposure assessment studies of nuclear industry workers exposed to radioactive materials as a method for addressing gaps in exposure data, and discuss methodology and comparability between studies. We identified nine studies of nuclear worker cohorts in France, Russia, the USA and the UK that had incorporated JEMs in their exposure assessments. All these JEMs were study or cohort-specific, and although broadly comparable methodologies were used in their construction, this is insufficient to enable the transfer of any one JEM to another study. Moreover there was often inadequate detail on whether, or how, JEMs were validated. JEMs have become more detailed and more quantitative, and this trend may eventually enable better comparison across, and the pooling of, studies. We conclude that JEMs have been shown to be a valuable exposure assessment methodology for imputation of missing exposure data for nuclear worker cohorts with data not missing at random. The next step forward for direct comparison or pooled analysis of complete cohorts would be the use of transparent and transferable methods.

  9. Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.

    PubMed

    Well, Caroline; Frank, Oliver; Hofmann, Thomas

    2013-11-27

    Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana Bertoni, have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed enabling the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A, using the corresponding deuterated 16,17-dihydrosteviol glycosides as internal standards. This method not only enables the analysis of the individual steviol glycosides in foods and beverages but can also support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter-tasting Stevia extracts.

  10. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision for proteins by strategically retrieving the less confident peptides that were previously filtered out by the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match (PSM) FDR was re-calculated and controlled at a confident level of FDR ≤ 1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. A >60% increase in total quantified spectra/peptides was achieved for a spike-in sample set and for a public dataset from CPTAC. Incorporating the peptide-retrieval strategy significantly improved quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide-recovery strategy without compromising the confidence of protein identification, and the strategy can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesized that more quantifiable spectra and peptides per protein, even including less confident peptides, could help reduce variation and improve protein quantification. The peptide-retrieval strategy was therefore developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches.
    The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matched to these proteins were retrieved, while the total PSM FDR after retrieval was still controlled at FDR ≤ 1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible in comparison with the improvement in quantitative performance: more quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. The strategy is in principle applicable to any quantitative proteomics approach and thereby provides more quantitative information, especially on low-abundance proteins.
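The retrieval idea, re-controlling PSM FDR on the subset of spectra matched to already-confident proteins, can be sketched as follows (the data layout and function names are hypothetical, not the paper's implementation):

```python
def psm_fdr_accept(psms, max_fdr=0.01):
    """psms: iterable of (score, is_decoy). Walk down the score-sorted
    list and return the number of target PSMs in the largest prefix
    whose target-decoy FDR estimate (decoys/targets) stays <= max_fdr."""
    accepted = targets = decoys = 0
    for score, is_decoy in sorted(psms, key=lambda p: -p[0]):
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        if targets and decoys / targets <= max_fdr:
            accepted = targets
    return accepted

def retrieve_for_confident_proteins(psms, confident_proteins):
    """Retrieval step: restrict the PSM list to spectra matched to
    confident proteins, then re-control the PSM FDR on that subset.
    Here psms is a hypothetical list of (score, is_decoy, protein)."""
    subset = [(s, d) for s, d, prot in psms if prot in confident_proteins]
    return psm_fdr_accept(subset)
```

Because the restricted list is depleted of random decoy matches, more lower-scoring spectra survive the same 1% FDR threshold, which is the source of the extra quantifiable PSMs.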

  11. A Description of the Clinical Proteomic Tumor Analysis Consortium (CPTAC) Common Data Analysis Pipeline

    PubMed Central

    Rudnick, Paul A.; Markey, Sanford P.; Roth, Jeri; Mirokhin, Yuri; Yan, Xinjian; Tchekhovskoi, Dmitrii V.; Edwards, Nathan J.; Thangudu, Ratna R.; Ketchum, Karen A.; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E.

    2016-01-01

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) has produced large proteomics datasets from the mass spectrometric interrogation of tumor samples previously analyzed by The Cancer Genome Atlas (TCGA) program. The availability of the genomic and proteomic data is enabling proteogenomic study for both reference (i.e., contained in major sequence databases) and non-reference markers of cancer. The CPTAC labs have focused on colon, breast, and ovarian tissues in the first round of analyses; spectra from these datasets were produced from 2D LC-MS/MS analyses and represent deep coverage. To reduce the variability introduced by disparate data analysis platforms (e.g., software packages, versions, parameters, sequence databases, etc.), the CPTAC Common Data Analysis Platform (CDAP) was created. The CDAP produces both peptide-spectrum-match (PSM) reports and gene-level reports. The pipeline processes raw mass spectrometry data according to the following: (1) Peak-picking and quantitative data extraction, (2) database searching, (3) gene-based protein parsimony, and (4) false discovery rate (FDR)-based filtering. The pipeline also produces localization scores for the phosphopeptide enrichment studies using the PhosphoRS program. Quantitative information for each of the datasets is specific to the sample processing, with PSM and protein reports containing the spectrum-level or gene-level (“rolled-up”) precursor peak areas and spectral counts for label-free or reporter ion log-ratios for 4plex iTRAQ™. The reports are available in simple tab-delimited formats and, for the PSM-reports, in mzIdentML. The goal of the CDAP is to provide standard, uniform reports for all of the CPTAC data, enabling comparisons between different samples and cancer types as well as across the major ‘omics fields. PMID:26860878

  12. Registration of 3D spectral OCT volumes using 3D SIFT feature point matching

    NASA Astrophysics Data System (ADS)

    Niemeijer, Meindert; Garvin, Mona K.; Lee, Kyungmoo; van Ginneken, Bram; Abràmoff, Michael D.; Sonka, Milan

    2009-02-01

    The recent introduction of next-generation spectral OCT scanners has enabled routine acquisition of high-resolution, 3D cross-sectional volumetric images of the retina. 3D OCT is used in the detection and management of serious eye diseases such as glaucoma and age-related macular degeneration. For follow-up studies, image registration is a vital tool that enables more precise, quantitative comparison of disease states. This work presents a registration method based on a recently introduced extension of the 2D Scale-Invariant Feature Transform (SIFT) framework to 3D. The SIFT feature extractor locates minima and maxima in the difference-of-Gaussian scale space to find salient feature points, then uses histograms of the local gradient directions around each found extremum in 3D to characterize it in a 4096-element feature vector. Matching points are found by comparing the distances between feature vectors. We apply this method to the rigid registration of optic nerve head (ONH)- and macula-centered 3D OCT scans of the same patient that have only limited overlap. Three OCT data-set pairs with known deformation were used for quantitative assessment of the method's robustness and accuracy under rotation and scaling. Three-dimensional registration accuracy of 2.0 ± 3.3 voxels was observed, assessed as the average voxel distance error over N = 1572 matched locations. The registration method was applied to 12 3D OCT scans (200 x 200 x 1024 voxels) of 6 normal eyes imaged in vivo to demonstrate the clinical utility and robustness of the method in a real-world environment.
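Descriptor matching by feature-vector distance can be sketched as follows; the Lowe-style ratio test is an assumption on our part, since the abstract only states that distances between feature vectors are compared:

```python
import math

def match_features(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour matching of descriptor vectors by Euclidean
    distance, keeping a match only when the best distance is clearly
    smaller than the second best (Lowe-style ratio test)."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = sorted((math.dist(da, db), j) for j, db in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
        elif len(dists) == 1:
            matches.append((i, dists[0][1]))
    return matches

# tiny 2-D stand-ins for the 4096-element SIFT vectors
print(match_features([(0.0, 0.0), (5.0, 5.0)],
                     [(0.0, 1.0), (5.0, 4.0), (10.0, 10.0)]))
# [(0, 0), (1, 1)]
```

The matched point pairs would then feed a rigid-transform estimate between the two volumes.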

  13. A Description of the Clinical Proteomic Tumor Analysis Consortium (CPTAC) Common Data Analysis Pipeline.

    PubMed

    Rudnick, Paul A; Markey, Sanford P; Roth, Jeri; Mirokhin, Yuri; Yan, Xinjian; Tchekhovskoi, Dmitrii V; Edwards, Nathan J; Thangudu, Ratna R; Ketchum, Karen A; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E

    2016-03-04

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) has produced large proteomics data sets from the mass spectrometric interrogation of tumor samples previously analyzed by The Cancer Genome Atlas (TCGA) program. The availability of the genomic and proteomic data is enabling proteogenomic study for both reference (i.e., contained in major sequence databases) and nonreference markers of cancer. The CPTAC laboratories have focused on colon, breast, and ovarian tissues in the first round of analyses; spectra from these data sets were produced from 2D liquid chromatography-tandem mass spectrometry analyses and represent deep coverage. To reduce the variability introduced by disparate data analysis platforms (e.g., software packages, versions, parameters, sequence databases, etc.), the CPTAC Common Data Analysis Platform (CDAP) was created. The CDAP produces both peptide-spectrum-match (PSM) reports and gene-level reports. The pipeline processes raw mass spectrometry data according to the following: (1) peak-picking and quantitative data extraction, (2) database searching, (3) gene-based protein parsimony, and (4) false-discovery rate-based filtering. The pipeline also produces localization scores for the phosphopeptide enrichment studies using the PhosphoRS program. Quantitative information for each of the data sets is specific to the sample processing, with PSM and protein reports containing the spectrum-level or gene-level ("rolled-up") precursor peak areas and spectral counts for label-free or reporter ion log-ratios for 4plex iTRAQ. The reports are available in simple tab-delimited formats and, for the PSM-reports, in mzIdentML. The goal of the CDAP is to provide standard, uniform reports for all of the CPTAC data to enable comparisons between different samples and cancer types as well as across the major omics fields.

  14. Spatiotemporal alignment of in utero BOLD-MRI series.

    PubMed

    Turk, Esra Abaci; Luo, Jie; Gagoski, Borjan; Pascau, Javier; Bibbo, Carolina; Robinson, Julian N; Grant, P Ellen; Adalsteinsson, Elfar; Golland, Polina; Malpica, Norberto

    2017-08-01

    We present a method for spatiotemporal alignment of in utero magnetic resonance imaging (MRI) time series acquired during maternal hyperoxia, enabling improved quantitative tracking of the blood oxygen level-dependent (BOLD) signal changes that characterize oxygen transport through the placenta to fetal organs. The proposed pipeline for spatiotemporal alignment of images acquired with single-shot gradient-echo echo-planar imaging includes 1) signal nonuniformity correction, 2) intravolume motion correction based on nonrigid registration, 3) correction of motion and nonrigid deformations across volumes, and 4) detection of outlier volumes to be discarded from subsequent analysis. BOLD MRI time series collected from 10 pregnant women during 3T scans were analyzed using this pipeline. To assess pipeline performance, signal fluctuations between consecutive timepoints were examined. In addition, volume overlap and the distance between manual region-of-interest (ROI) delineations in a subset of frames and the delineations obtained by propagating the ROIs from the reference frame were used to quantify alignment accuracy; a previously demonstrated rigid registration approach was used for comparison. The proposed pipeline improved the anatomical alignment of the placenta and fetal organs over state-of-the-art rigid motion correction methods. In particular, unexpected temporal signal fluctuations during the first normoxia period were significantly decreased (P < 0.01), and the volume overlap and boundary distance measures were significantly improved (P < 0.01). The proposed approach thus enables more accurate quantitative studies of placental function by improving spatiotemporal alignment across the placenta and fetal organs. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:403-412.

  15. Structure-preserving interpolation of temporal and spatial image sequences using an optical flow-based method.

    PubMed

    Ehrhardt, J; Säring, D; Handels, H

    2007-01-01

    Modern tomographic imaging devices enable the acquisition of spatial and temporal image sequences, but their spatial and temporal resolution is limited, so image interpolation techniques are needed to represent images at a desired level of discretization. This paper presents a method for structure-preserving interpolation between neighboring slices in temporal or spatial image sequences. In a first step, the spatiotemporal velocity field between image slices is determined using an optical flow-based registration method in order to establish spatial correspondence between adjacent slices; an iterative algorithm is applied using the spatial and temporal image derivatives together with a spatiotemporal smoothing step. Afterwards, the calculated velocity field is used to generate an interpolated image at the desired time by averaging intensities between corresponding points. Three quantitative measures are defined to evaluate the performance of the interpolation method, and the behavior and capability of the algorithm are demonstrated on synthetic images. A population of 17 temporal and spatial image sequences was used to compare the optical flow-based interpolation method with linear and shape-based interpolation. The quantitative results show that the optical flow-based method outperforms linear and shape-based interpolation with statistical significance. The interpolation method presented is able to generate image sequences with the spatial or temporal resolution needed for image comparison, analysis, or visualization tasks. Quantitative and qualitative measures extracted from synthetic phantoms and medical image data show that the new method has clear advantages over linear and shape-based interpolation.
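The interpolation step, averaging intensities between corresponding points along the velocity field, can be illustrated in one dimension (a toy version under our own simplifying assumptions; the paper computes the flow by registration and operates on full image slices):

```python
def interpolate_slice(img0, img1, flow, t=0.5):
    """Structure-preserving interpolation between two 1-D 'slices'.
    flow[x] is the displacement of the point passing through output
    position x between the two slices (defined on the output grid,
    an assumption of this toy version). The interpolated intensity is
    the weighted average of the two corresponding intensities."""
    n = len(img0)
    clamp = lambda i: min(max(i, 0), n - 1)
    out = []
    for x in range(n):
        a = img0[clamp(round(x - t * flow[x]))]        # trace back to slice 0
        b = img1[clamp(round(x + (1 - t) * flow[x]))]  # trace forward to slice 1
        out.append((1 - t) * a + t * b)
    return out

# a bright feature at index 1 in img0 moves to index 3 in img1;
# the half-way interpolation places it at index 2, undimmed
print(interpolate_slice([0, 10, 0, 0, 0], [0, 0, 0, 10, 0], [0, 2, 2, 2, 0]))
# [0.0, 0.0, 10.0, 0.0, 0.0]
```

A plain linear blend of the two slices would instead produce two half-intensity ghosts at indices 1 and 3, which is exactly the artifact the flow-based method avoids.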

  16. Visualization and quantitative analysis of extrachromosomal telomere-repeat DNA in individual human cells by Halo-FISH

    PubMed Central

    Komosa, Martin; Root, Heather; Meyn, M. Stephen

    2015-01-01

    Current methods for characterizing extrachromosomal nuclear DNA in mammalian cells do not permit single-cell analysis, are often semi-quantitative and frequently biased toward the detection of circular species. To overcome these limitations, we developed Halo-FISH to visualize and quantitatively analyze extrachromosomal DNA in single cells. We demonstrate Halo-FISH by using it to analyze extrachromosomal telomere-repeat (ECTR) in human cells that use the Alternative Lengthening of Telomeres (ALT) pathway(s) to maintain telomere lengths. We find that GM847 and VA13 ALT cells average ∼80 detectable G/C-strand ECTR DNA molecules/nucleus, while U2OS ALT cells average ∼18 molecules/nucleus. In comparison, human primary and telomerase-positive cells contain <5 ECTR DNA molecules/nucleus. ECTR DNA in ALT cells exhibit striking cell-to-cell variations in number (<20 to >300), range widely in length (<1 to >200 kb) and are composed of primarily G- or C-strand telomere-repeat DNA. Halo-FISH enables, for the first time, the simultaneous analysis of ECTR DNA and chromosomal telomeres in a single cell. We find that ECTR DNA comprises ∼15% of telomere-repeat DNA in GM847 and VA13 cells, but <4% in U2OS cells. In addition to its use in ALT cell analysis, Halo-FISH can facilitate the study of a wide variety of extrachromosomal DNA in mammalian cells. PMID:25662602

  17. Large-scale multiplex absolute protein quantification of drug-metabolizing enzymes and transporters in human intestine, liver, and kidney microsomes by SWATH-MS: Comparison with MRM/SRM and HR-MRM/PRM.

    PubMed

    Nakamura, Kenji; Hirayama-Kurogi, Mio; Ito, Shingo; Kuno, Takuya; Yoneyama, Toshihiro; Obuchi, Wataru; Terasaki, Tetsuya; Ohtsuki, Sumio

    2016-08-01

    The purpose of the present study was to examine simultaneously the absolute protein amounts of 152 membrane and membrane-associated proteins, including 30 metabolizing enzymes and 107 transporters, in pooled microsomal fractions of human liver, kidney, and intestine by means of SWATH-MS with stable isotope-labeled internal standard peptides, and to compare the results with those obtained by MRM/SRM and high-resolution (HR)-MRM/PRM. The protein expression levels of 27 metabolizing enzymes, 54 transporters, and six other membrane proteins were quantitated by SWATH-MS; the other targets were below the lower limits of quantitation. Most of the values determined by SWATH-MS differed by less than 50% from those obtained by MRM/SRM or HR-MRM/PRM. Various metabolizing enzymes were expressed more abundantly in liver microsomes than in the other microsomes. Ten, 13, and 8 transporters listed as important for drugs by the International Transporter Consortium were quantified in liver, kidney, and intestinal microsomes, respectively. Our results indicate that SWATH-MS enables large-scale multiplex absolute protein quantification while retaining quantitative capability similar to MRM/SRM or HR-MRM/PRM. SWATH-MS is expected to be a useful methodology in drug development for elucidating the molecular mechanisms of drug absorption, metabolism, and excretion in the human body based on protein profile information.

  18. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
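The standard-addition methodology introduced above reduces to a linear fit of instrument response against the spiked amount, with the endogenous level read off the x-intercept; a minimal sketch with made-up numbers (not the paper's amino-acid data):

```python
def endogenous_conc(added, response):
    """Standard addition: ordinary least-squares fit of response vs.
    spiked amount; the endogenous concentration is the magnitude of
    the x-intercept, i.e. intercept/slope."""
    n = len(added)
    mx = sum(added) / n
    my = sum(response) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(added, response))
             / sum((x - mx) ** 2 for x in added))
    intercept = my - slope * mx
    return intercept / slope

# illustrative: true endogenous level 2.0 units, response = 3*(2.0 + added)
print(endogenous_conc([0, 1, 2, 4], [6, 9, 12, 18]))  # 2.0
```

Deviation of the fitted line from linearity across the spiking range is also what a parallelism check would flag.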

  19. Validating internal controls for quantitative plant gene expression studies.

    PubMed

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-08-18

    Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that expression levels of reference genes are adequately consistent among the samples used, or compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages, and environmental conditions studied. Our results indicate that quantitative comparison of candidate reference genes is an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrate facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or that are masked when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
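As a rough illustration of screening candidate reference genes for expression stability, the sketch below ranks genes by coefficient of variation across samples; note that the paper's actual measures come from ANOVA and linear regression, and the gene names and values here are invented:

```python
from statistics import mean, pstdev

def stability_rank(expression):
    """expression: {gene: [levels across samples/conditions]}.
    Rank candidate reference genes by coefficient of variation,
    most stable first (a simpler proxy for the paper's ANOVA- and
    regression-based stability measures)."""
    cv = {g: pstdev(v) / mean(v) for g, v in expression.items()}
    return sorted(cv, key=cv.get)

genes = {
    "UBQ": [10.0, 10.5, 9.8, 10.2],   # illustrative values only
    "ACT": [8.0, 12.0, 6.0, 14.0],
}
print(stability_rank(genes))  # ['UBQ', 'ACT']
```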

  20. A comparative study of quantitative immunohistochemistry and quantum dot immunohistochemistry for mutation carrier identification in Lynch syndrome.

    PubMed

    Barrow, Emma; Evans, D Gareth; McMahon, Ray; Hill, James; Byers, Richard

    2011-03-01

    Lynch Syndrome is caused by mutations in DNA mismatch repair (MMR) genes. Mutation carrier identification is facilitated by immunohistochemical detection of the MMR proteins MLH1 and MSH2 in tumour tissue and is desirable as colonoscopic screening reduces mortality. However, protein detection by conventional immunohistochemistry (IHC) is subjective, and quantitative techniques are required. Quantum dots (QDs) are novel fluorescent labels that enable quantitative multiplex staining. This study compared their use with quantitative 3,3'-diaminobenzidine (DAB) IHC for the diagnosis of Lynch Syndrome. Tumour sections from 36 mutation carriers and six controls were obtained. These were stained with DAB on an automated platform using antibodies against MLH1 and MSH2. Multiplex QD immunofluorescent staining of the sections was performed using antibodies against MLH1, MSH2 and smooth muscle actin (SMA), and multispectral analysis of the slides was performed. The staining intensity of DAB and QDs was measured in multiple colonic crypts, and the mean intensity scores were calculated. Receiver operating characteristic (ROC) curves of staining performance for the identification of mutation carriers were evaluated. For quantitative DAB IHC, the area under the MLH1 ROC curve was 0.872 (95% CI 0.763 to 0.981), and the area under the MSH2 ROC curve was 0.832 (95% CI 0.704 to 0.960). For quantitative QD IHC, the area under the MLH1 ROC curve was 0.812 (95% CI 0.681 to 0.943), and the area under the MSH2 ROC curve was 0.598 (95% CI 0.418 to 0.777). Despite the advantage that QD staining allows several markers to be measured simultaneously, it was of lower utility than DAB IHC for the identification of MMR mutation carriers. Automated DAB IHC staining and quantitative slide analysis may enable high-throughput IHC.
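The reported areas under the ROC curves summarize how well the mean intensity scores separate carriers from controls; the AUC can be computed directly from two score lists via the Mann-Whitney formulation (the scores below are invented, and the direction, carriers scoring lower because MMR staining is lost, is our assumption):

```python
def roc_auc(carrier_scores, control_scores):
    """AUC as the probability that a randomly chosen control scores
    higher than a randomly chosen carrier; ties count half.
    Equivalent to the normalized Mann-Whitney U statistic."""
    wins = 0.0
    for c in control_scores:
        for k in carrier_scores:
            if c > k:
                wins += 1
            elif c == k:
                wins += 0.5
    return wins / (len(carrier_scores) * len(control_scores))

# illustrative mean intensity scores
print(roc_auc([1.0, 2.0, 3.0], [2.5, 3.5, 4.0]))  # 8/9 ≈ 0.889
```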

  1. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions across different samples is essential for understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes, so identification of common coexpression networks or modules may reveal the molecular mechanisms of complex disease or the relationships between biological processes. However, no quantitative comparison method existed for coexpression networks, and our examination of methods developed for other network types showed that they cannot be applied to coexpression networks. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use them to find common biological mechanisms between Huntington's disease (HD) and brain aging. We proposed two similarity measures for the quantitative comparison of coexpression networks and performed experiments using known coexpression networks, demonstrating the validity of the two measures and deriving threshold values for similar coexpression network pairs. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and identified similar Huntington's disease-aging module pairs, which are related to brain development, cell death, and immune response. This suggests that up-regulated cell signalling related to cell death and the immune/inflammation response may be a common molecular mechanism in the pathophysiology of HD and normal brain aging in the frontal cortex.
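The abstract does not specify the paper's two similarity measures; as a generic illustration of quantitative comparison between coexpression modules, a Jaccard index over edge (gene-pair) sets can serve as a stand-in (the gene pairs below are invented):

```python
def edge_jaccard(net_a, net_b):
    """One simple network-similarity measure: the Jaccard index of the
    two edge sets, treating edges as unordered gene pairs. This is an
    illustrative stand-in, not the paper's definition."""
    a = {frozenset(e) for e in net_a}
    b = {frozenset(e) for e in net_b}
    return len(a & b) / len(a | b)

# hypothetical module edges
hd_module = [("GFAP", "AQP4"), ("CASP3", "BAX"), ("IL6", "TNF")]
aging_module = [("GFAP", "AQP4"), ("IL6", "TNF"), ("BDNF", "NTRK2")]
print(edge_jaccard(hd_module, aging_module))  # 0.5
```

A threshold on such a score, calibrated on known network pairs as in the paper, would then decide which module pairs count as "similar".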

  2. On the Distinction Between Quantitative and Qualitative Research.

    ERIC Educational Resources Information Center

    Smith, P. L.

    Quantitative and qualitative research are differing modes of measurement, one using numbers and the other not. The assignment of numerals to represent properties enables a researcher to distinguish minutely between different properties. The major issue dividing these approaches to empirical research represents a philosophical dispute which has…

  3. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    ERIC Educational Resources Information Center

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  4. Quantitative single-molecule imaging by confocal laser scanning microscopy.

    PubMed

    Vukojevic, Vladana; Heidkamp, Marcus; Ming, Yu; Johansson, Björn; Terenius, Lars; Rigler, Rudolf

    2008-11-25

    A new approach to quantitative single-molecule imaging by confocal laser scanning microscopy (CLSM) is presented. It relies on fluorescence intensity distribution to analyze the molecular occurrence statistics captured by digital imaging and enables direct determination of the number of fluorescent molecules and their diffusion rates without resorting to temporal or spatial autocorrelation analyses. Digital images of fluorescent molecules were recorded by using fast scanning and avalanche photodiode detectors. In this way the signal-to-background ratio was significantly improved, enabling direct quantitative imaging by CLSM. The potential of the proposed approach is demonstrated by using standard solutions of fluorescent dyes, fluorescently labeled DNA molecules, quantum dots, and the Enhanced Green Fluorescent Protein in solution and in live cells. The method was verified by using fluorescence correlation spectroscopy. The relevance for biological applications, in particular, for live cell imaging, is discussed.

  5. ADvanced IMage Algebra (ADIMA): a novel method for depicting multiple sclerosis lesion heterogeneity, as demonstrated by quantitative MRI.

    PubMed

    Yiannakas, Marios C; Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia A M

    2013-05-01

    There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. We obtained conventional PDw and T2w images from 10 patients with relapsing-remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Our study's ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished.

  6. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, i-QTaG using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of (13)C6/(12)C6-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R² > 0.99) with the amount of the bovine fetuin glycoproteins. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.
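
    The relative-quantitation step can be sketched as computing light/heavy peak-intensity ratios for each glycan pair; the (13)C6/(12)C6 labels separate each pair by a fixed mass shift, so paired peaks are identifiable in the same spectrum. Peak intensities below are hypothetical:

```python
# Sketch of isotopic-label relative quantitation: for each N-glycan, the
# ratio of the light (12C6-2-AA) to heavy (13C6-2-AA) peak intensity
# estimates the molar ratio between the two samples. Values are made up.

def relative_ratios(pairs):
    """pairs: list of (light_intensity, heavy_intensity) per glycan."""
    return [light / heavy for light, heavy in pairs]

peak_pairs = [(1000.0, 1020.0), (480.0, 960.0), (150.0, 760.0)]
print([round(r, 2) for r in relative_ratios(peak_pairs)])  # [0.98, 0.5, 0.2]
```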

  7. ADvanced IMage Algebra (ADIMA): a novel method for depicting multiple sclerosis lesion heterogeneity, as demonstrated by quantitative MRI

    PubMed Central

    Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM

    2013-01-01

    Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: Our study’s ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. Conclusion: ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551

  8. Fitting the message to the location: engaging adults with antimicrobial resistance in a World War 2 air raid shelter.

    PubMed

    Verran, Joanna; Haigh, Carol; Brooks, Jane; Butler, Jonathan; Redfern, James

    2018-05-31

    There are many different initiatives, global and local, designed to raise awareness of antimicrobial resistance (AMR) and change audience behaviour. However, it is not possible to assess the impact of specific, small-scale events on national and international outcomes, although one might acknowledge some contribution to the individual and collective knowledge and experience-focused 'science capital'. As with any research, in preparation for a public engagement event it is important to identify aims, and appropriate methods whose results might help satisfy those aims. Therefore, the aim of this paper was to develop, deliver and evaluate an event designed to engage an adult audience with AMR. The venue was a World War 2 air raid shelter, enabling comparison of the pre- and post-antibiotic eras via three different activity stations, focusing on nursing, the search for new antibiotics, and investigations into novel antimicrobials. The use of observers released the presenters from evaluation duties, enabling them to focus on their specific activities. Qualitative measures of audience engagement were combined with quantitative data. The evaluation revealed that adult audiences can easily be absorbed into an activity, particularly if hands-on, after a brief introduction. This research demonstrates that hands-on practical engagement with AMR can enable high-level interaction and learning in an informal and enjoyable environment. This article is protected by copyright. All rights reserved.

  9. Ionization Electron Signal Processing in Single Phase LArTPCs II. Data/Simulation Comparison and Performance in MicroBooNE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, C.; et al.

    The single-phase liquid argon time projection chamber (LArTPC) provides a large amount of detailed information in the form of fine-grained drifted ionization charge from particle traces. To fully utilize this information, the deposited charge must be accurately extracted from the raw digitized waveforms via a robust signal processing chain. Enabled by the ultra-low noise levels associated with cryogenic electronics in the MicroBooNE detector, the precise extraction of ionization charge from the induction wire planes in a single-phase LArTPC is qualitatively demonstrated on MicroBooNE data with event display images, and quantitatively demonstrated via waveform-level and track-level metrics. Improved performance of induction plane calorimetry is demonstrated through the agreement of extracted ionization charge measurements across different wire planes for various event topologies. In addition to the comprehensive waveform-level comparison of data and simulation, a calibration of the cryogenic electronics response is presented and solutions to various MicroBooNE-specific TPC issues are discussed. This work presents an important improvement in LArTPC signal processing, the foundation of reconstruction and therefore physics analyses in MicroBooNE.

  10. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
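
    As background to the fitness-based scoring discussed above (this is the standard multiplicative model, not the paper's matrix approximation procedure), a genetic interaction score is the deviation of observed double-mutant fitness from the product of the single-mutant fitnesses:

```python
# Sketch of multiplicative genetic-interaction scoring: epsilon =
# f_ab - f_a * f_b. Negative scores indicate synthetic sick/lethal
# interactions; positive scores indicate alleviating/suppressive ones.
# Fitness values below are hypothetical.

def interaction_score(f_a, f_b, f_ab):
    return f_ab - f_a * f_b

print(round(interaction_score(0.9, 0.8, 0.3), 2))   # -0.42, negative interaction
print(round(interaction_score(0.9, 0.8, 0.72), 2))  # 0.0, no interaction
```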

  11. Human genomic DNA quantitation system, H-Quant: development and validation for use in forensic casework.

    PubMed

    Shewale, Jaiprakash G; Schneida, Elaine; Wilson, Jonathan; Walker, Jerilyn A; Batzer, Mark A; Sinha, Sudhir K

    2007-03-01

    The human DNA quantification (H-Quant) system, developed for use in human identification, enables quantitation of human genomic DNA in biological samples. The assay is based on real-time amplification of AluYb8 insertions in hominoid primates. The relatively high copy number of subfamily-specific Alu repeats in the human genome enables quantification of very small amounts of human DNA. The oligonucleotide primers present in H-Quant are specific for human DNA and closely related great apes. During the real-time PCR, the SYBR Green I dye binds to the DNA that is synthesized by the human-specific AluYb8 oligonucleotide primers. The fluorescence of the bound SYBR Green I dye is measured at the end of each PCR cycle. The cycle at which the fluorescence crosses the chosen threshold correlates to the quantity of amplifiable DNA in that sample. The minimal sensitivity of the H-Quant system is 7.6 pg/microL of human DNA. The amplicon generated in the H-Quant assay is 216 bp, which is within the same range of the common amplifiable short tandem repeat (STR) amplicons. This size amplicon enables quantitation of amplifiable DNA as opposed to a quantitation of degraded or nonamplifiable DNA of smaller sizes. Development and validation studies were performed on the 7500 real-time PCR system following the Quality Assurance Standards for Forensic DNA Testing Laboratories.
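
    The Ct-to-quantity step described above can be sketched as inverting a log-linear standard curve; the slope and intercept below are hypothetical illustration values, not H-Quant's actual calibration:

```python
# Sketch of real-time PCR quantitation: the threshold cycle Ct is linear
# in log10 of input DNA, Ct = slope * log10(Q) + intercept. A slope near
# -3.32 corresponds to ~100% amplification efficiency (10x input per
# ~3.32 cycles). Parameter values are hypothetical.

def quantity_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the standard curve to estimate input DNA quantity."""
    return 10 ** ((ct - intercept) / slope)

print(round(quantity_from_ct(38.0), 3))   # Ct at the intercept -> 1.0
print(round(quantity_from_ct(31.36), 3))  # 6.64 cycles earlier -> 100x more
```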

  12. Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.

    PubMed

    Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C

    2018-01-01

    Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images from more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images, acquired simultaneously from the same tissue area. For quantitative comparison of a substantial number of image pairs, we present a segmentation workflow that is applicable to both THG and fluorescence images, with precisions of 91.3% and 95.8% achieved, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9%, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80% of bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are 2 times smaller than the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect that the described quantitative comparison is applicable to other types of brain tissue and with more specific staining experiments for cell type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
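
    The cross-modality overlap measurement can be sketched as a pixelwise overlap fraction between two segmentation masks (toy binary masks below; the paper works object-wise on real segmentations):

```python
# Sketch of the overlap comparison between segmented THG and fluorescence
# images: the fraction of foreground pixels in one mask that are also
# foreground in the other. Masks below are tiny hypothetical examples.

def overlap_fraction(mask_a, mask_b):
    """Fraction of foreground pixels in mask_a also foreground in mask_b."""
    a_fg = [(i, j) for i, row in enumerate(mask_a) for j, v in enumerate(row) if v]
    hits = sum(mask_b[i][j] for i, j in a_fg)
    return hits / len(a_fg) if a_fg else 0.0

thg  = [[1, 1, 0], [0, 1, 0], [0, 0, 0]]
fluo = [[1, 1, 0], [0, 0, 0], [0, 0, 0]]
print(round(overlap_fraction(thg, fluo), 2))  # 2 of 3 THG pixels -> 0.67
```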

  13. SCOPA and META-SCOPA: software for the analysis and aggregation of genome-wide association studies of multiple correlated phenotypes.

    PubMed

    Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P

    2017-01-11

    Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1×10⁻⁸), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
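
    The "reverse regression" idea can be sketched in its simplest one-phenotype form: genotype dosage at a SNP is the outcome and the phenotype is the predictor (SCOPA itself fits a general linear model over multiple correlated phenotypes; data below are hypothetical):

```python
# Minimal sketch of reverse regression: regress genotype dosage (0..2)
# on a phenotype using ordinary least squares. This is a one-predictor
# simplification with made-up data, not SCOPA's implementation.

def ols_slope_intercept(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical data: allele dosages vs a standardized phenotype.
dosage    = [0, 1, 2, 1, 0, 2]
phenotype = [-1.0, 0.1, 1.2, 0.0, -0.9, 1.1]
beta, alpha = ols_slope_intercept(phenotype, dosage)
print(round(beta, 2))  # positive: higher phenotype -> higher dosage
```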

  14. Comparison of Models and Whole-Genome Profiling Approaches for Genomic-Enabled Prediction of Septoria Tritici Blotch, Stagonospora Nodorum Blotch, and Tan Spot Resistance in Wheat.

    PubMed

    Juliana, Philomin; Singh, Ravi P; Singh, Pawan K; Crossa, Jose; Rutkoski, Jessica E; Poland, Jesse A; Bergstrom, Gary C; Sorrells, Mark E

    2017-07-01

    The leaf spotting diseases in wheat, which include Septoria tritici blotch (STB), Stagonospora nodorum blotch (SNB), and tan spot (TS), pose challenges to breeding programs selecting for resistance. A promising approach that could enable selection prior to phenotyping is genomic selection, which uses genome-wide markers to estimate breeding values (BVs) for quantitative traits. To evaluate this approach for seedling and/or adult plant resistance (APR) to STB, SNB, and TS, we compared the predictive ability of the least-squares (LS) approach with genomic-enabled prediction models including genomic best linear unbiased predictor (GBLUP), Bayesian ridge regression (BRR), Bayes A (BA), Bayes B (BB), Bayes Cπ (BC), Bayesian least absolute shrinkage and selection operator (BL), and reproducing kernel Hilbert spaces markers (RKHS-M), a pedigree-based model (RKHS-P), and RKHS markers and pedigree (RKHS-MP). We observed that LS gave the lowest prediction accuracies and RKHS-MP the highest. The genomic-enabled prediction models and RKHS-P gave similar accuracies. The increase in accuracy using genomic prediction models over LS was 48%. The mean genomic prediction accuracies were 0.45 for STB (APR), 0.55 for SNB (seedling), 0.66 for TS (seedling) and 0.48 for TS (APR). We also compared markers from two whole-genome profiling approaches, genotyping by sequencing (GBS) and diversity arrays technology sequencing (DArTseq), for prediction. While GBS markers performed slightly better than DArTseq, combining markers from the two approaches did not improve accuracies. We conclude that implementing GS in breeding for these diseases would help to achieve higher accuracies and rapid gains from selection. Copyright © 2017 Crop Science Society of America.

  15. The Local Geometry of Multiattribute Tradeoff Preferences

    PubMed Central

    McGeachie, Michael; Doyle, Jon

    2011-01-01

    Existing representations for multiattribute ceteris paribus preference statements have provided useful treatments and clear semantics for qualitative comparisons, but have not provided similarly clear representations or semantics for comparisons involving quantitative tradeoffs. We use directional derivatives and other concepts from elementary differential geometry to interpret conditional multiattribute ceteris paribus preference comparisons that state bounds on quantitative tradeoff ratios. This semantics extends the familiar economic notion of marginal rate of substitution to multiple continuous or discrete attributes. The same geometric concepts also provide means for interpreting statements about the relative importance of different attributes. PMID:21528018

  16. Application of shift-and-add algorithms for imaging objects within biological media

    NASA Astrophysics Data System (ADS)

    Aizert, Avishai; Moshe, Tomer; Abookasis, David

    2017-01-01

    The Shift-and-Add (SAA) technique is a simple mathematical operation developed to reconstruct, at high spatial resolution, atmospherically degraded solar images obtained from stellar speckle interferometry systems. This method shifts and assembles individual degraded short-exposure images into a single average image with significantly improved contrast and detail. Since the inhomogeneous refractive indices of biological tissue causes light scattering similar to that induced by optical turbulence in the atmospheric layers, we assume that SAA methods can be successfully implemented to reconstruct the image of an object within a scattering biological medium. To test this hypothesis, five SAA algorithms were evaluated for reconstructing images acquired from multiple viewpoints. After successfully retrieving the hidden object's shape, quantitative image quality metrics were derived, enabling comparison of imaging error across a spectrum of layer thicknesses, demonstrating the relative efficacy of each SAA algorithm for biological imaging.
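
    The core SAA operation described above can be sketched in a few lines: shift each short-exposure frame so its brightest pixel lands on a common reference position, then average. Frames below are tiny hypothetical 2D arrays, not imaging data from the study:

```python
# Minimal Shift-and-Add sketch: align each degraded frame on its
# brightest pixel and accumulate the average. Real SAA variants differ
# in how the shift is estimated; this uses the simple peak-pixel rule.

def shift_and_add(frames):
    h, w = len(frames[0]), len(frames[0][0])
    acc = [[0.0] * w for _ in range(h)]
    for f in frames:
        # locate the brightest pixel of this frame
        bi, bj = max(((i, j) for i in range(h) for j in range(w)),
                     key=lambda ij: f[ij[0]][ij[1]])
        for i in range(h):
            for j in range(w):
                # shift so the peak maps to (0, 0), wrapping at the edges
                acc[(i - bi) % h][(j - bj) % w] += f[i][j] / len(frames)
    return acc

frames = [[[0, 1, 0], [0, 5, 0], [0, 0, 0]],   # peak at (1, 1)
          [[0, 0, 0], [0, 0, 1], [0, 5, 0]]]   # peak at (2, 1)
result = shift_and_add(frames)
print(result[0][0])  # co-added peak -> 5.0
```

Because every frame contributes its peak at the same position, the averaged image recovers the full peak intensity while uncorrelated blur averages down.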

  17. Data-Driven Surface Traversability Analysis for Mars 2020 Landing Site Selection

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Rothrock, Brandon; Almeida, Eduardo; Ansar, Adnan; Otero, Richard; Huertas, Andres; Heverly, Matthew

    2015-01-01

    The objective of this paper is three-fold: 1) to describe the engineering challenges in the surface mobility of the Mars 2020 Rover mission that are considered in the landing site selection process, 2) to introduce new automated traversability analysis capabilities, and 3) to present the preliminary analysis results for top candidate landing sites. The analysis capabilities presented in this paper include automated terrain classification, automated rock detection, digital elevation model (DEM) generation, and multi-ROI (region of interest) route planning. These analysis capabilities make it possible to fully utilize the vast volume of high-resolution orbiter imagery, quantitatively evaluate surface mobility requirements for each candidate site, and remove subjectivity from the comparison between sites in terms of engineering considerations. The analysis results supported the discussion in the Second Landing Site Workshop held in August 2015, which resulted in selecting eight candidate sites that will be considered in the third workshop.

  18. FireHose Streaming Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karl Anderson, Steve Plimpton

    2015-01-27

    The FireHose Streaming Benchmarks are a suite of stream-processing benchmarks defined to enable comparison of streaming software and hardware, both quantitatively vis-a-vis the rate at which they can process data, and qualitatively by judging the effort involved to implement and run the benchmarks. Each benchmark has two parts. The first is a generator which produces and outputs datums at a high rate in a specific format. The second is an analytic which reads the stream of datums and is required to perform a well-defined calculation on the collection of datums, typically to find anomalous datums that have been created in the stream by the generator. The FireHose suite provides code for the generators, sample code for the analytics (which users are free to re-implement in their own custom frameworks), and a precise definition of each benchmark calculation.
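
    The generator/analytic split can be sketched as follows (this is an illustrative toy, not the official FireHose benchmark code; the datum format and anomaly rule are invented for the example):

```python
# Toy sketch of the FireHose two-part structure: a generator emits a
# stream of keyed datums, some deliberately anomalous; an analytic
# consumes the stream and must flag the anomalous datums.
import random

def generator(n, anomaly_rate=0.01, seed=42):
    rng = random.Random(seed)
    for key in range(n):
        flag = 1 if rng.random() < anomaly_rate else 0
        yield key, flag  # datum: (key, payload)

def analytic(stream):
    """Return the keys of anomalous datums seen in the stream."""
    return [key for key, flag in stream if flag == 1]

anomalies = analytic(generator(10_000))
print(len(anomalies) > 0)  # some anomalies were injected and found
```

A real harness would additionally measure the sustained datum rate at which the analytic keeps up with the generator, which is the quantitative axis of the benchmark.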

  19. An FD-LC-MS/MS Proteomic Strategy for Revealing Cellular Protein Networks: A Conditional Superoxide Dismutase 1 Knockout Cells

    PubMed Central

    Ichibangase, Tomoko; Sugawara, Yasuhiro; Yamabe, Akio; Koshiyama, Akiyo; Yoshimura, Akari; Enomoto, Takemi; Imai, Kazuhiro

    2012-01-01

    Systems biology aims to understand biological phenomena in terms of complex biological and molecular interactions, and thus proteomics plays an important role in elucidating protein networks. However, many proteomic methods have suffered from their high variability, resulting in only showing altered protein names. Here, we propose a strategy for elucidating cellular protein networks based on an FD-LC-MS/MS proteomic method. The strategy permits reproducible relative quantitation of differences in protein levels between different cell populations and allows for integration of the data with those obtained through other methods. We demonstrate the validity of the approach through a comparison of differential protein expression in normal and conditional superoxide dismutase 1 gene knockout cells and believe that beginning with an FD-LC-MS/MS proteomic approach will enable researchers to elucidate protein networks more easily and comprehensively. PMID:23029042

  20. A method for validation of finite element forming simulation on basis of a pointwise comparison of distance and curvature

    NASA Astrophysics Data System (ADS)

    Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank

    2016-10-01

    Thermoforming of continuously fiber reinforced thermoplastics (CFRTP) is ideally suited to thin walled and complex shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process and the prediction of fiber-reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented that enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared by the proposed validation method.
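
    An illustrative sketch of the two error measures (not the paper's implementation): a pointwise distance between a simulated and a measured profile, and a discrete curvature estimate from three consecutive points via the inverse circumradius. Point coordinates are hypothetical:

```python
# Sketch: compare a simulated and a measured surface profile pointwise
# by Euclidean distance, and estimate discrete curvature from three
# consecutive points (curvature = 4 * triangle area / product of sides).
import math

def pointwise_distance(sim, meas):
    return [math.dist(p, q) for p, q in zip(sim, meas)]

def discrete_curvature(p0, p1, p2):
    a, b, c = math.dist(p1, p2), math.dist(p0, p2), math.dist(p0, p1)
    s = (a + b + c) / 2                       # semi-perimeter (Heron)
    area_sq = max(s * (s - a) * (s - b) * (s - c), 0.0)
    return 4 * math.sqrt(area_sq) / (a * b * c)

sim  = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.0)]   # slightly bulged simulation
meas = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]   # flat measurement
print(max(pointwise_distance(sim, meas)))     # worst-case mismatch
print(discrete_curvature(*meas))              # collinear points -> 0.0
```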

  1. Hybrid multiphase CFD simulation for liquid-liquid interfacial area prediction in annular centrifugal contactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, K.E.

    2013-07-01

    Liquid-liquid contacting equipment used in solvent extraction processes has the dual purpose of mixing and separating two immiscible fluids. Consequently, such devices inherently encompass a wide variety of multiphase flow regimes. A hybrid multiphase computational fluid dynamics (CFD) solver which combines the Eulerian multi-fluid method with VOF (volume of fluid) sharp interface capturing has been developed for application to annular centrifugal contactors. This solver has been extended to enable prediction of mean droplet size and liquid-liquid interfacial area through a single moment population balance method. Simulations of liquid-liquid mixing in a simplified geometry and a model annular centrifugal contactor are reported with droplet breakup/coalescence models being calibrated versus available experimental data. Quantitative comparison is made for two different housing vane geometries and it is found that the predicted droplet size is significantly smaller for vane geometries which result in higher annular liquid holdup.

  2. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.

  3. Modular analysis of the probabilistic genetic interaction network.

    PubMed

    Hou, Lin; Wang, Lin; Qian, Minping; Li, Dong; Tang, Chao; Zhu, Yunping; Deng, Minghua; Li, Fangting

    2011-03-15

    Epistatic Miniarray Profiles (EMAP) have enabled the mapping of large-scale genetic interaction networks; however, the quantitative information gained from EMAP cannot be fully exploited when the data are interpreted as a discrete network based on an arbitrary hard threshold. To address this limitation, we adopted a mixture modeling procedure to construct a probabilistic genetic interaction network and then implemented a Bayesian approach to identify densely interacting modules in the probabilistic network. Mixture modeling has been demonstrated as an effective soft-threshold technique for EMAP measures. The Bayesian approach was applied to an EMAP dataset studying the early secretory pathway in Saccharomyces cerevisiae. Twenty-seven modules were identified, and 14 of those were enriched by gold standard functional gene sets. We also conducted a detailed comparison with state-of-the-art algorithms, hierarchical clustering and Markov clustering. The experimental results show that the Bayesian approach outperforms the others in efficiently recovering biologically significant modules.
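
    The soft-threshold idea can be sketched as scoring each gene pair by the posterior probability that its EMAP score came from an "interaction" mixture component rather than the background component. Mixture parameters below are hypothetical, not fitted values from the paper:

```python
# Sketch of mixture-model soft thresholding: model EMAP scores as a
# two-component Gaussian mixture (background vs. interaction) and use
# the posterior probability of the interaction component as the
# probabilistic edge weight. All parameters are illustrative.
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_interaction(score, w_bg=0.9, mu_bg=0.0, sd_bg=1.0,
                          w_int=0.1, mu_int=-3.0, sd_int=1.5):
    p_bg = w_bg * gauss(score, mu_bg, sd_bg)
    p_int = w_int * gauss(score, mu_int, sd_int)
    return p_int / (p_bg + p_int)

print(round(posterior_interaction(-4.0), 2))  # strong negative score -> ~0.99
print(posterior_interaction(0.0) < 0.1)       # typical score -> near 0
```

Edges then carry continuous weights in [0, 1] instead of a hard in/out call, which is what the Bayesian module search above operates on.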

  4. Petri-net-based 2D design of DNA walker circuits.

    PubMed

    Gilbert, David; Heiner, Monika; Rohr, Christian

    2018-01-01

    We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits is posed by leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts, where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach for DNA walker circuits relies on coloured stochastic Petri nets, which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.
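
    A minimal Monte Carlo sketch of the leakage idea, assuming a hypothetical fixed per-step leak probability: a walker that leaks anywhere along its path ends up in the wrong branch, so circuit error grows with decision depth. This is a toy model, not the paper's Petri-net analysis:

```python
import random

random.seed(1)  # fixed seed so the simulation is reproducible

def walk_circuit(depth, p_leak):
    """Simulate one walker descending a binary decision tree of the
    given depth; at each step it may leak into the wrong branch with
    probability p_leak (an assumed, illustrative parameter)."""
    correct = True
    for _ in range(depth):
        if random.random() < p_leak:
            correct = False  # leaked; the final answer is now wrong
    return correct

def error_rate(depth, p_leak, trials=10000):
    """Empirical probability of a wrong output over many walkers."""
    wrong = sum(not walk_circuit(depth, p_leak) for _ in range(trials))
    return wrong / trials
```

    For independent leaks the analytic error is 1 - (1 - p_leak)^depth, which the simulation approaches; minimising leakage transitions directly lowers this error.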

  5. Enabling three-dimensional densitometric measurements using laboratory source X-ray micro-computed tomography

    NASA Astrophysics Data System (ADS)

    Pankhurst, M. J.; Fowler, R.; Courtois, L.; Nonni, S.; Zuddas, F.; Atwood, R. C.; Davis, G. R.; Lee, P. D.

    2018-01-01

    We present new software allowing significantly improved quantitative mapping of the three-dimensional density distribution of objects using laboratory source polychromatic X-rays via a beam characterisation approach (cf. filtering or comparison to phantoms). One key advantage is that a precise representation of the specimen material is not required. The method exploits well-established, widely available, non-destructive and increasingly accessible laboratory-source X-ray tomography. Beam characterisation is performed in two stages: (1) projection data are collected through a range of known materials utilising a novel hardware design integrated into the rotation stage; and (2) a Python code optimises a spectral response model of the system. We provide hardware designs for use with a rotation stage able to be tilted, but the concept is easily adaptable to virtually any laboratory system and sample, and implicitly corrects the image artefact known as beam hardening.
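
    Why beam characterisation matters can be seen in a toy polychromatic Beer-Lambert model (the two-bin spectrum and attenuation coefficients below are invented): softer photons are absorbed preferentially, so the apparent attenuation coefficient drifts with path length, which is the beam-hardening artefact the record mentions:

```python
import math

def transmission(thickness, spectrum):
    """Polychromatic transmission through a homogeneous slab:
    a weighted sum over energy bins of exp(-mu * t), where each
    (weight, mu) pair is one bin of the source spectrum."""
    return sum(w * math.exp(-mu * thickness) for w, mu in spectrum)

def effective_mu(thickness, spectrum):
    """Apparent attenuation coefficient a monochromatic fit would infer
    from the measured transmission at this thickness."""
    return -math.log(transmission(thickness, spectrum)) / thickness
```

    Because effective_mu falls as thickness grows, a reconstruction that assumes one fixed mu misreports density; characterising the spectral response lets the software correct for this.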

  6. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  7. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposia. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  8. Automated classification of cell morphology by coherence-controlled holographic microscopy

    NASA Astrophysics Data System (ADS)

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy enabling quantitative phase imaging for the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high-spatiotemporal phase sensitivity.
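
    The paper's comparison can be mimicked on synthetic data: below, a "morphometric" feature (area) overlaps heavily between two cell classes, while a phase-derived dry-mass proxy separates them more cleanly, and a simple nearest-centroid classifier exposes the accuracy gap. All numbers are illustrative, not taken from the study:

```python
import random

random.seed(0)  # reproducible synthetic data

def make_cells(n=100):
    """Synthetic two-class cells as (label, area, dry_mass) tuples.
    The class means and spreads are invented so that area overlaps
    between classes while dry mass separates them."""
    cells = []
    for _ in range(n):
        label = random.choice([0, 1])
        area = random.gauss(50 + 5 * label, 10)      # MO feature, noisy
        dry_mass = random.gauss(20 + 15 * label, 4)  # QPI feature, cleaner
        cells.append((label, area, dry_mass))
    return cells

def nearest_centroid_accuracy(cells, idx):
    """Classify each cell by its nearest class centroid along one
    feature column (idx) and report the resulting accuracy."""
    cents = {}
    for lbl in (0, 1):
        vals = [c[idx] for c in cells if c[0] == lbl]
        cents[lbl] = sum(vals) / len(vals)
    correct = sum(1 for c in cells
                  if min(cents, key=lambda l: abs(c[idx] - cents[l])) == c[0])
    return correct / len(cells)
```

    With these synthetic distributions the dry-mass column yields markedly higher accuracy than the area column, mirroring the paper's finding that quantitative phase features improve classification.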

  9. Automated classification of cell morphology by coherence-controlled holographic microscopy.

    PubMed

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy enabling quantitative phase imaging for the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high-spatiotemporal phase sensitivity. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  10. Investigating the Educational Value of Social Learning Networks: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Dafoulas, Georgios; Shokri, Azam

    2016-01-01

    Purpose: The emergence of Education 2.0 enabled technology-enhanced learning, necessitating new pedagogical approaches, while e-learning has evolved into an instrumental pedagogy of collaboration through affordances of social media. Social learning networks and ubiquitous learning enabled individual and group learning through social engagement and…

  11. KEY COMPARISON: Key comparison CCQM-K60: Total selenium and selenomethionine in selenised wheat flour

    NASA Astrophysics Data System (ADS)

    Goenaga Infante, Heidi; Sargent, Mike

    2010-01-01

    Key comparison CCQM-K60 was performed to assess the analytical capabilities of national metrology institutes (NMIs) to accurately quantitate the mass fraction of selenomethionine (SeMet) and total selenium (at low mg kg-1 levels) in selenised wheat flour. It was organized by the Inorganic Analysis Working Group (IAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM) as a follow-up key comparison to the previous pilot study CCQM-P86 on selenised yeast tablets. LGC Limited (Teddington, UK) and the Institute for National Measurement Standards, National Research Council Canada (NRCC, Ottawa, Canada) acted as the coordinating laboratories. CCQM-K60 was organized in parallel with a pilot study (CCQM-P86.1) involving not only NMIs but also expert laboratories worldwide, thus enabling them to assess their capabilities, discover problems and learn how to modify analytical procedures accordingly. Nine results for total Se and four results for SeMet were reported by the participating NMIs. Methods used for sample preparation were microwave-assisted acid digestion for total Se and multiple-step enzymatic hydrolysis and hydrolysis with methanesulfonic acid for SeMet. For total Se, detection techniques included inductively coupled plasma mass spectrometry (ICP-MS) with external calibration, standard additions or isotope dilution analysis (IDMS); instrumental neutron activation analysis (INAA); and graphite furnace atomic absorption spectrometry (GFAAS) with external calibration. For determination of SeMet in the wheat flour sample, the four NMIs relied upon measurements using species-specific IDMS (using 76Se-enriched SeMet) with HPLC-ICP-MS. Eight of the nine participating NMIs reported results for total Se within 3.5% deviation from the key comparison reference value (KCRV). For SeMet, the four participating NMIs reported results within 3.2% deviation from the KCRV. 
This shows that the performance of the majority of the CCQM-K60 participants was very good, illustrating their ability to obtain accurate results for such analytes in a complex food matrix containing approximately 17 mg kg-1 Se. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
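
    The reported deviations from the KCRV are plain relative errors; a small helper makes the arithmetic explicit (the 16.5 mg/kg example result is hypothetical, chosen only to sit within the reported 3.5% band around a roughly 17 mg/kg reference value):

```python
def percent_deviation(value, kcrv):
    """Relative deviation of a reported result from the key comparison
    reference value (KCRV), expressed in percent."""
    return 100.0 * abs(value - kcrv) / kcrv

# A hypothetical total-Se result of 16.5 mg/kg against a 17.0 mg/kg KCRV
# deviates by about 2.9%, inside the 3.5% band reported for most NMIs.
```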

  12. Deep Learning for Magnetic Resonance Fingerprinting: A New Approach for Predicting Quantitative Parameter Values from Time Series.

    PubMed

    Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas

    2017-01-01

    The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
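
    The dictionary-matching baseline that the CNN replaces is essentially a normalized inner-product search over simulated fingerprints; a minimal sketch (the parameter labels and signal values are invented):

```python
import math

def normalize(signal):
    """Scale a signal to unit Euclidean norm so that the inner product
    measures shape similarity rather than amplitude."""
    norm = math.sqrt(sum(s * s for s in signal))
    return [s / norm for s in signal]

def dictionary_match(measured, dictionary):
    """Return the parameter key whose simulated fingerprint has the
    largest inner product with the measured (normalized) signal.
    dictionary maps a parameter label to a simulated time series."""
    m = normalize(measured)
    best_key, best_score = None, -float("inf")
    for params, sim in dictionary.items():
        score = sum(a * b for a, b in zip(m, normalize(sim)))
        if score > best_score:
            best_key, best_score = params, score
    return best_key
```

    This per-voxel search is what becomes expensive for large dictionaries, motivating the paper's CNN that maps a time series directly to parameter values.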

  13. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area over the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. The use of numerical models enables a full set of climate variables to be generated in the downscaling process, dynamically consistent owing to the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. Studies have demonstrated the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged toward, and the conclusions are therefore contentious. Complementing a companion work that develops approaches for quantitative assessment of downscaled climate, in this study the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides objectivity of comparison. Three types of downscaling experiments were performed for a selected month. The first type serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and with nudging toward different variables in grid analysis nudging, while for spectral nudging we focus on testing the nudging coefficients and different wave numbers on different model levels.
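
    Grid (analysis) nudging adds a Newtonian relaxation term that pulls each prognostic variable toward the driving analysis; the nudging coefficient G sets how strongly, which is exactly the sensitivity these experiments probe. A scalar sketch with illustrative values (real WRF nudging acts on full 3D fields with G typically of order 1e-4 s^-1):

```python
def nudge_step(x_model, x_analysis, g_nudge, dt):
    """One explicit step of Newtonian relaxation: the model tendency
    gains a term g_nudge * (x_analysis - x_model)."""
    return x_model + dt * g_nudge * (x_analysis - x_model)

def relax(x0, x_analysis, g_nudge, dt, steps):
    """Integrate the relaxation alone to show the model value being
    drawn toward the analysis at a rate set by g_nudge."""
    x = x0
    for _ in range(steps):
        x = nudge_step(x, x_analysis, g_nudge, dt)
    return x
```

    With g_nudge = 0 the model evolves freely (boundary forcing only, as in the baseline run); larger coefficients pin the solution ever more tightly to the large-scale analysis, at the cost of suppressing the fine-scale detail downscaling is meant to add.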

  14. Extreme-phenotype genome-wide association study (XP-GWAS): a method for identifying trait-associated variants by sequencing pools of individuals selected from a diversity panel.

    PubMed

    Yang, Jinliang; Jiang, Haiying; Yeh, Cheng-Ting; Yu, Jianming; Jeddeloh, Jeffrey A; Nettleton, Dan; Schnable, Patrick S

    2015-11-01

    Although approaches for performing genome-wide association studies (GWAS) are well developed, conventional GWAS requires high-density genotyping of large numbers of individuals from a diversity panel. Here we report a method for performing GWAS that does not require genotyping of large numbers of individuals. Instead XP-GWAS (extreme-phenotype GWAS) relies on genotyping pools of individuals from a diversity panel that have extreme phenotypes. This analysis measures allele frequencies in the extreme pools, enabling discovery of associations between genetic variants and traits of interest. This method was evaluated in maize (Zea mays) using the well-characterized kernel row number trait, which was selected to enable comparisons between the results of XP-GWAS and conventional GWAS. An exome-sequencing strategy was used to focus sequencing resources on genes and their flanking regions. A total of 0.94 million variants were identified and served as evaluation markers; comparisons among pools showed that 145 of these variants were statistically associated with the kernel row number phenotype. These trait-associated variants were significantly enriched in regions identified by conventional GWAS. XP-GWAS was able to resolve several linked QTL and detect trait-associated variants within a single gene under a QTL peak. XP-GWAS is expected to be particularly valuable for detecting genes or alleles responsible for quantitative variation in species for which extensive genotyping resources are not available, such as wild progenitors of crops, orphan crops, and other poorly characterized species such as those of ecological interest. © 2015 The Authors The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
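
    The core XP-GWAS comparison contrasts allele frequencies between the extreme-phenotype pools; a standard two-proportion z statistic on pooled read counts is one simple way to sketch the test (the counts below are invented, and the paper's actual statistics may differ):

```python
import math

def allele_freq_z(count_hi, depth_hi, count_lo, depth_lo):
    """Two-proportion z statistic for an allele-frequency difference
    between the high- and low-phenotype pools, using read counts at a
    variant site as proxies for pool allele frequencies."""
    p1 = count_hi / depth_hi
    p2 = count_lo / depth_lo
    p = (count_hi + count_lo) / (depth_hi + depth_lo)  # pooled frequency
    se = math.sqrt(p * (1 - p) * (1 / depth_hi + 1 / depth_lo))
    return (p1 - p2) / se
```

    A variant whose allele is strongly enriched in one pool (say 80 of 100 reads versus 20 of 100) yields a large |z| and is a candidate trait-associated variant; equal frequencies give z = 0.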

  15. A combined impact-process evaluation of a program promoting active transport to school: understanding the factors that shaped program effectiveness.

    PubMed

    Crawford, S; Garrard, J

    2013-01-01

    This mixed methods study was a comprehensive impact-process evaluation of the Ride2School program in metropolitan and regional areas in Victoria, Australia. The program aimed to promote active transport to school for primary school children. Qualitative and quantitative data were collected at baseline and follow-up from two primary schools involved in the pilot phase of the program and two matched comparison schools, and from a further 13 primary schools that participated in the implementation phase of the program. Classroom surveys, structured and unstructured observations, and interviews with Ride2School program staff were used to evaluate the pilot program. For the 13 schools in the second phase of the program, parents and students completed questionnaires at baseline (N = 889) and follow-up (N = 761). Based on the quantitative data, there was little evidence of an overall increase in active transport to school across participating schools, although impacts varied among individual schools. Qualitative data in the form of observations, interviews, and focus group discussions with students, school staff, and program staff provided insight into the reasons for the variable program impacts. This paper highlights the benefits of a mixed methods approach to evaluating active transport to school programs, which enables both measurement and understanding of program impacts.

  16. Comparison of microfluidic digital PCR and conventional quantitative PCR for measuring copy number variation.

    PubMed

    Whale, Alexandra S; Huggett, Jim F; Cowen, Simon; Speirs, Valerie; Shaw, Jacqui; Ellison, Stephen; Foy, Carole A; Scott, Daniel J

    2012-06-01

    One of the benefits of Digital PCR (dPCR) is the potential for unparalleled precision enabling smaller fold change measurements. An example of an assessment that could benefit from such improved precision is the measurement of tumour-associated copy number variation (CNV) in the cell free DNA (cfDNA) fraction of patient blood plasma. To investigate the potential precision of dPCR and compare it with the established technique of quantitative PCR (qPCR), we used breast cancer cell lines to investigate HER2 gene amplification and modelled a range of different CNVs. We showed that, with equal experimental replication, dPCR could measure a smaller CNV than qPCR. As dPCR precision is directly dependent upon both the number of replicate measurements and the template concentration, we also developed a method to assist the design of dPCR experiments for measuring CNV. Using an existing model (based on Poisson and binomial distributions) to derive an expression for the variance inherent in dPCR, we produced a power calculation to define the experimental size required to reliably detect a given fold change at a given template concentration. This work will facilitate any future translation of dPCR to key diagnostic applications, such as cancer diagnostics and analysis of cfDNA.
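
    The Poisson correction underlying dPCR quantification, and the copy-number ratio built on it, can be written in a few lines (partition counts below are illustrative; the paper's power calculation additionally propagates the binomial sampling variance of the positive-partition count):

```python
import math

def copies_per_partition(positive, total):
    """Poisson-corrected mean target copies per partition from the
    fraction of positive partitions: lambda = -ln(1 - p). This corrects
    for partitions that received more than one template molecule."""
    return -math.log(1.0 - positive / total)

def copy_number_ratio(pos_target, pos_ref, total):
    """Estimated copy number variation as the ratio of target to
    reference concentrations measured in the same experiment."""
    return copies_per_partition(pos_target, total) / copies_per_partition(pos_ref, total)
```

    For example, 750 of 1000 positive partitions for the target against 500 of 1000 for the reference gives lambda values of ln 4 and ln 2, i.e. a copy-number ratio of 2, as in a simple gene duplication.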

  17. Optical-sectioning microscopy of protoporphyrin IX fluorescence in human gliomas: standardization and quantitative comparison with histology

    NASA Astrophysics Data System (ADS)

    Wei, Linpeng; Chen, Ye; Yin, Chengbo; Borwege, Sabine; Sanai, Nader; Liu, Jonathan T. C.

    2017-04-01

    Systemic delivery of 5-aminolevulinic acid leads to enhanced fluorescence image contrast in many tumors due to the increased accumulation of protoporphyrin IX (PpIX), a fluorescent porphyrin that is associated with tumor burden and proliferation. The value of PpIX-guided resection of malignant gliomas has been demonstrated in prospective randomized clinical studies in which a twofold greater extent of resection and improved progression-free survival have been observed. In low-grade gliomas and at the diffuse infiltrative margins of all gliomas, PpIX fluorescence is often too weak to be detected with current low-resolution surgical microscopes that are used in operating rooms. However, it has been demonstrated that high-resolution optical-sectioning microscopes are capable of detecting the sparse and punctate accumulations of PpIX that are undetectable via conventional low-power surgical fluorescence microscopes. To standardize the performance of high-resolution optical-sectioning devices for future clinical use, we have developed an imaging phantom and methods to ensure that the imaging of PpIX-expressing brain tissues can be performed reproducibly. Ex vivo imaging studies with a dual-axis confocal microscope demonstrate that these methods enable the acquisition of images from unsectioned human brain tissues that quantitatively and consistently correlate with images of histologically processed tissue sections.

  18. Functionalization of SBA-15 mesoporous silica by Cu-phosphonate units: Probing of synthesis route

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskowski, Lukasz, E-mail: lukasz.laskowski@kik.pcz.pl; Czestochowa University of Technology, Institute of Physics, Al. Armii Krajowej 19, 42-201 Czestochowa; Laskowska, Magdalena, E-mail: magdalena.laskowska@onet.pl

    2014-12-15

    Mesoporous silica SBA-15 containing propyl-copper phosphonate units was investigated. The structure of the mesoporous samples was tested by N₂ isothermal sorption (BET and BJH analysis), TEM microscopy and X-ray scattering. Quantitative EDX analysis gave information about the proportions of component atoms in the sample, and quantitative elemental analysis was carried out to support the EDX results. To examine the bonding between copper atoms and phosphonic units, Raman spectroscopy was carried out, supported by theoretical calculations based on density functional theory with the B3LYP method. By comparison of the calculated vibrational spectra of the molecule with the experimental results, the distribution of the active units inside the silica matrix was determined. - Graphical abstract: The present study is devoted to mesoporous silica SBA-15 containing propyl-copper phosphonate units. The species were investigated by the micro-Raman technique combined with DFT numerical simulations to confirm the correctness of the synthesis procedure. Complementary research was carried out to test the structure of the mesoporous samples. - Highlights: • SBA-15 silica functionalized with propyl-copper phosphonate units was synthesized. • Synthesis efficiency was probed by a Raman study supported with DFT simulations. • Homogeneous distribution of the active units was demonstrated. • The synthesis route enables precise control of the distance between copper ions.

  19. Contrast detection in fluid-saturated media with magnetic resonance poroelastography

    PubMed Central

    Perriñez, Phillip R.; Pattison, Adam J.; Kennedy, Francis E.; Weaver, John B.; Paulsen, Keith D.

    2010-01-01

    Purpose: Recent interest in the poroelastic behavior of tissues has led to the development of magnetic resonance poroelastography (MRPE) as an alternative to single-phase MR elastographic image reconstruction. In addition to the elastic parameters (i.e., Lamé’s constants) commonly associated with magnetic resonance elastography (MRE), MRPE enables estimation of the time-harmonic pore-pressure field induced by external mechanical vibration. Methods: This study presents numerical simulations that demonstrate the sensitivity of the computed displacement and pore-pressure fields to a priori estimates of the experimentally derived model parameters. In addition, experimental data collected in three poroelastic phantoms are used to assess the quantitative accuracy of MR poroelastographic imaging through comparisons with both quasistatic and dynamic mechanical tests. Results: The results indicate hydraulic conductivity to be the dominant parameter influencing the deformation behavior of poroelastic media under conditions applied during MRE. MRPE estimation of the matrix shear modulus was bracketed by the values determined from independent quasistatic and dynamic mechanical measurements as expected, whereas the contrast ratios for embedded inclusions were quantitatively similar (10%–15% difference between the reconstructed images and the mechanical tests). Conclusions: The findings suggest that the addition of hydraulic conductivity and a viscoelastic solid component as parameters in the reconstruction may be warranted. PMID:20831058

  20. A Combined Impact-Process Evaluation of a Program Promoting Active Transport to School: Understanding the Factors That Shaped Program Effectiveness

    PubMed Central

    Crawford, S.; Garrard, J.

    2013-01-01

    This mixed methods study was a comprehensive impact-process evaluation of the Ride2School program in metropolitan and regional areas in Victoria, Australia. The program aimed to promote active transport to school for primary school children. Qualitative and quantitative data were collected at baseline and follow-up from two primary schools involved in the pilot phase of the program and two matched comparison schools, and from a further 13 primary schools that participated in the implementation phase of the program. Classroom surveys, structured and unstructured observations, and interviews with Ride2School program staff were used to evaluate the pilot program. For the 13 schools in the second phase of the program, parents and students completed questionnaires at baseline (N = 889) and follow-up (N = 761). Based on the quantitative data, there was little evidence of an overall increase in active transport to school across participating schools, although impacts varied among individual schools. Qualitative data in the form of observations, interviews, and focus group discussions with students, school staff, and program staff provided insight into the reasons for the variable program impacts. This paper highlights the benefits of a mixed methods approach to evaluating active transport to school programs, which enables both measurement and understanding of program impacts. PMID:23606865

  1. PeptideDepot: Flexible Relational Database for Visual Analysis of Quantitative Proteomic Data and Integration of Existing Protein Information

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2010-01-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895

  2. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model, in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
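
    The IDT metric is described only as akin to PCR efficiency; one plausible formulation, assumed here rather than taken from the paper, is the negated least-squares slope of time-to-threshold against log2 of template input, i.e. minutes per template doubling:

```python
def isothermal_doubling_time(log2_inputs, times_to_threshold):
    """Least-squares slope of time-to-threshold versus log2(template
    input). Time falls as input rises, so the slope is negated to
    report a positive 'minutes per doubling' value (an IDT-style
    metric; the exact definition used in the paper may differ)."""
    n = len(log2_inputs)
    mx = sum(log2_inputs) / n
    my = sum(times_to_threshold) / n
    num = sum((x - mx) * (y - my) for x, y in zip(log2_inputs, times_to_threshold))
    den = sum((x - mx) ** 2 for x in log2_inputs)
    return -num / den
```

    A dilution series where each doubling of input shaves one minute off the time to threshold yields an IDT of 1.0 min; slower assays or inhibited matrices would show larger values.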

  3. Measurements in quantitative research: how to select and report on research instruments.

    PubMed

    Hagan, Teresa L

    2014-07-01

    Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.

  4. Evaluation of the clinical sensitivity for the quantification of human immunodeficiency virus type 1 RNA in plasma: Comparison of the new COBAS TaqMan HIV-1 with three current HIV-RNA assays--LCx HIV RNA quantitative, VERSANT HIV-1 RNA 3.0 (bDNA) and COBAS AMPLICOR HIV-1 Monitor v1.5.

    PubMed

    Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos

    2006-02-01

    The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
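
    The two summary statistics used throughout this comparison, overall agreement of positive/negative calls and Pearson correlation of the quantitative results, are easy to state precisely (the example data in the tests are invented):

```python
import math

def overall_agreement(calls_a, calls_b):
    """Fraction of samples given the same positive/negative call by
    two assays (the 'overall agreement' reported in the record)."""
    return sum(a == b for a, b in zip(calls_a, calls_b)) / len(calls_a)

def pearson_r(xs, ys):
    """Pearson correlation of paired quantitative viral-load results,
    the 'r' reported for samples positive by both methods."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den
```

    In practice viral loads are usually correlated on a log10 scale, so the reported r values of 0.91-0.96 refer to log-transformed copies/mL.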

  5. GeLC-MS-based proteomics of Chromobacterium violaceum: comparison of proteome changes elicited by hydrogen peroxide

    PubMed Central

    Lima, D. C.; Duarte, F. T.; Medeiros, V. K. S.; Carvalho, P. C.; Nogueira, F. C. S.; Araujo, G. D. T.; Domont, G. B.; Batistuzzo de Medeiros, S. R.

    2016-01-01

    Chromobacterium violaceum is a free-living bacillus with several genes that enable its survival in harsh environments such as oxidative and temperature stresses. Here we performed a label-free quantitative proteomic study to unravel the molecular mechanisms that enable C. violaceum to survive oxidative stress. To achieve this, total proteins extracted from control cultures and from C. violaceum cultures exposed for two hours to 8 mM hydrogen peroxide were analyzed using GeLC-MS proteomics. The analysis revealed that under the stress condition the bacterium expressed proteins that protected it from the damage caused by reactive oxygen species, while decreasing the abundance of proteins responsible for bacterial growth and catabolism. GeLC-MS proteomic analysis provided an overview of the metabolic pathways involved in the response of C. violaceum to oxidative stress, ultimately adding to knowledge of the response of this organism to environmental stress. This study identified approximately 1500 proteins, generating the largest proteomic coverage of C. violaceum so far. We also detected proteins of unknown function that we hypothesize to be part of new mechanisms of oxidative stress defense. Finally, we identified the mechanism of clustered regularly interspaced short palindromic repeats (CRISPR), which had not previously been reported for this organism. PMID:27321545

  6. GeLC-MS-based proteomics of Chromobacterium violaceum: comparison of proteome changes elicited by hydrogen peroxide.

    PubMed

    Lima, D C; Duarte, F T; Medeiros, V K S; Carvalho, P C; Nogueira, F C S; Araujo, G D T; Domont, G B; Batistuzzo de Medeiros, S R

    2016-06-20

    Chromobacterium violaceum is a free-living bacillus with several genes that enable its survival under harsh environmental conditions such as oxidative and temperature stress. Here we performed a label-free quantitative proteomic study to unravel the molecular mechanisms that enable C. violaceum to survive oxidative stress. To achieve this, total proteins extracted from control cultures and from C. violaceum cultures exposed for two hours to 8 mM hydrogen peroxide were analyzed using GeLC-MS proteomics. The analysis revealed that under the stress condition the bacterium expressed proteins that protect it from damage caused by reactive oxygen species, while decreasing the abundance of proteins responsible for bacterial growth and catabolism. GeLC-MS proteomics analysis provided an overview of the metabolic pathways involved in the response of C. violaceum to oxidative stress, ultimately adding to our knowledge of this organism's response to environmental stress. This study identified approximately 1500 proteins, generating the largest proteomic coverage of C. violaceum so far. We also detected proteins of unknown function that we hypothesize to be part of new mechanisms of oxidative stress defense. Finally, we identified the mechanism of clustered regularly interspaced short palindromic repeats (CRISPR), which had not previously been reported for this organism.

  7. Multiplexed target detection using DNA-binding dye chemistry in droplet digital PCR.

    PubMed

    McDermott, Geoffrey P; Do, Duc; Litterst, Claudia M; Maar, Dianna; Hindson, Christopher M; Steenblock, Erin R; Legler, Tina C; Jouvenot, Yann; Marrs, Samuel H; Bemis, Adam; Shah, Pallavi; Wong, Josephine; Wang, Shenglong; Sally, David; Javier, Leanne; Dinio, Theresa; Han, Chunxiao; Brackbill, Timothy P; Hodges, Shawn P; Ling, Yunfeng; Klitgord, Niels; Carman, George J; Berman, Jennifer R; Koehler, Ryan T; Hiddessen, Amy L; Walse, Pramod; Bousse, Luc; Tzonev, Svilen; Hefner, Eli; Hindson, Benjamin J; Cauly, Thomas H; Hamby, Keith; Patel, Viresh P; Regan, John F; Wyatt, Paul W; Karlin-Neumann, George A; Stumbo, David P; Lowe, Adam J

    2013-12-03

    Two years ago, we described the first droplet digital PCR (ddPCR) system aimed at empowering all researchers with a tool that removes the substantial uncertainties associated with using the analogue standard, quantitative real-time PCR (qPCR). This system enabled TaqMan hydrolysis probe-based assays for the absolute quantification of nucleic acids. Due to significant advancements in droplet chemistry and buoyed by the multiple benefits associated with dye-based target detection, we have created a "second generation" ddPCR system compatible with both TaqMan-probe and DNA-binding dye detection chemistries. Herein, we describe the operating characteristics of DNA-binding dye based ddPCR and offer a side-by-side comparison to TaqMan probe detection. By partitioning each sample prior to thermal cycling, we demonstrate that it is now possible to use a DNA-binding dye for the quantification of multiple target species from a single reaction. The increased resolution associated with partitioning also made it possible to visualize and account for signals arising from nonspecific amplification products. We expect that the ability to combine the precision of ddPCR with both DNA-binding dye and TaqMan probe detection chemistries will further enable the research community to answer complex and diverse genetic questions.
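
    Absolute quantification in ddPCR rests on Poisson statistics over the partitions: the fraction of droplets with no amplification gives the mean copies per droplet without any standard curve. A minimal sketch (the per-droplet volume is an assumed, instrument-specific value, not a figure from this paper):

```python
from math import log

def copies_per_microliter(negative, total, droplet_nl=0.85):
    """Poisson-corrected absolute quantification for partitioned PCR.

    negative/total is the fraction of droplets with no target;
    droplet_nl is the per-partition volume in nanoliters (assumed).
    """
    lam = -log(negative / total)      # mean copies per droplet
    return lam / (droplet_nl * 1e-3)  # copies per microliter of reaction

# e.g. 12,000 of 15,000 droplets negative
print(round(copies_per_microliter(12000, 15000), 1))
```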

  8. Computer-automated ABCD versus dermatologists with different degrees of experience in dermoscopy.

    PubMed

    Piccolo, Domenico; Crisman, Giuliana; Schoinas, Spyridon; Altamura, Davide; Peris, Ketty

    2014-01-01

    Dermoscopy is a very useful and non-invasive technique for in vivo observation and preoperative diagnosis of pigmented skin lesions (PSLs), inasmuch as it enables analysis of surface and subsurface structures that are not discernible to the naked eye. The authors used the ABCD rule of dermoscopy to test the accuracy of melanoma diagnosis on a panel of 165 PSLs, and the intra- and inter-observer diagnostic agreement between three dermatologists with different degrees of experience, one General Practitioner and a DDA for computer-assisted diagnosis (Nevuscreen®, Arkè s.a.s., Avezzano, Italy). 165 pigmented skin lesions from 165 patients were selected. Histopathological examination revealed 132 benign melanocytic skin lesions and 33 melanomas. The kappa statistic, sensitivity, specificity and positive and negative predictive values were calculated to measure agreement between all the human observers and in comparison with the automated DDA. Our results revealed poor reproducibility of the semi-quantitative algorithm devised by Stolz et al., independently of the observers' experience in dermoscopy. Nevuscreen® proved to be 'user friendly' to all observers, thus enabling a more critical evaluation of each lesion and representing a helpful tool for clinicians without significant experience in dermoscopy in improving and achieving more accurate diagnosis of PSLs.
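
    The kappa statistic used here for inter-observer agreement corrects raw percent agreement for the agreement expected by chance. A minimal sketch for two raters making binary (melanoma = 1 / benign = 0) calls on the same lesions; the example ratings are invented:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' binary calls on the same lesions."""
    n = len(a)
    po = sum(1 for x, y in zip(a, b) if x == y) / n  # observed agreement
    pa = sum(a) / n                                  # rater A positive rate
    pb = sum(b) / n                                  # rater B positive rate
    pe = pa * pb + (1 - pa) * (1 - pb)               # chance agreement
    return (po - pe) / (1 - pe)

print(cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0]))  # agree on 3 of 4 lesions
```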

  9. Nanofluidic transport through isolated carbon nanotube channels: Advances, controversies, and challenges

    DOE PAGES

    Guo, Shirui; Meshot, Eric R.; Kuykendall, Tevye; ...

    2015-06-02

    Owing to their simple chemistry and structure, controllable geometry, and a plethora of unusual yet exciting transport properties, carbon nanotubes (CNTs) have emerged as exceptional channels for fundamental nanofluidic studies, as well as building blocks for future fluidic devices that can outperform current technology in many applications. Leveraging the unique fluidic properties of CNTs in advanced systems requires a full understanding of their physical origin. Recent advancements in nanofabrication technology enable nanofluidic devices to be built with a single, nanometer-wide CNT as a fluidic pathway. These novel platforms with isolated CNT nanochannels offer distinct advantages for establishing quantitative structure–transport correlations in comparison with membranes containing many CNT pores. In addition, they are promising components for single-molecule sensors as well as for building nanotube-based circuits wherein fluidics and electronics can be coupled. With such advanced device architecture, molecular and ionic transport can be manipulated with vastly enhanced control for applications in sensing, separation, detection, and therapeutic delivery. Recent achievements in fabricating isolated-CNT nanofluidic platforms are highlighted, along with the most-significant findings each platform enables for water, ion, and molecular transport. Furthermore, the implications of these findings and remaining open questions on the exceptional fluidic properties of CNTs are also discussed.

  10. Infrared-optical transmission and reflection measurements on loose powders

    NASA Astrophysics Data System (ADS)

    Kuhn, J.; Korder, S.; Arduini-Schuster, M. C.; Caps, R.; Fricke, J.

    1993-09-01

    A method is described to determine quantitatively the infrared-optical properties of loose powder beds via directional-hemispherical transmission and reflection measurements. Instead of integrating the powders into a potassium bromide (KBr) or paraffin oil matrix, which would drastically alter the scattering behavior, the powders are placed onto supporting layers of polyethylene (PE) and KBr. A commercial spectrometer is supplemented by external optics that enable measurements on horizontally arranged samples. For data evaluation we use a solution of the equation of radiative transfer in the 3-flux approximation under boundary conditions adapted to the PE- or KBr-supported powder system. A comparison with Kubelka-Munk theory and Schuster's 2-flux approximation shows that the 3-flux approximation yields results closest to the exact solution. Equations are developed that correct the transmission and reflection of the samples for the influence of the supporting layer and that yield the specific extinction and the albedo of the powder, thus enabling us to separate the scattering and absorption parts of the extinction spectrum. Measurements on TiO2 powder are presented, which show the influence of preparation techniques and of data evaluation with different methods on the obtained albedo. The specific extinction of various TiO2 powders is presented.
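
    For reference, the Kubelka-Munk theory used as a comparison above links the diffuse reflectance of an optically thick powder layer to the ratio of the absorption (K) and scattering (S) coefficients via the remission function. A minimal sketch:

```python
def kubelka_munk(r_inf):
    """Kubelka-Munk remission function K/S = (1 - R)^2 / (2R) for an
    optically thick layer with diffuse reflectance R (0 < R <= 1).
    A strongly scattering, weakly absorbing powder has R near 1 and
    K/S near 0."""
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

print(kubelka_munk(0.5))  # 0.25
```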

  11. Mutual Photoluminescence Quenching and Photovoltaic Effect in Large-Area Single-Layer MoS2-Polymer Heterojunctions.

    PubMed

    Shastry, Tejas A; Balla, Itamar; Bergeron, Hadallia; Amsterdam, Samuel H; Marks, Tobin J; Hersam, Mark C

    2016-11-22

    Two-dimensional transition metal dichalcogenides (TMDCs) have recently attracted attention due to their superlative optical and electronic properties. In particular, their extraordinary optical absorption and semiconducting band gap have enabled demonstrations of photovoltaic response from heterostructures composed of TMDCs and other organic or inorganic materials. However, these early studies were limited to devices at the micrometer scale and/or failed to exploit the unique optical absorption properties of single-layer TMDCs. Here we present an experimental realization of a large-area type-II photovoltaic heterojunction using single-layer molybdenum disulfide (MoS2) as the primary absorber, by coupling it to the organic π-donor polymer PTB7. This TMDC-polymer heterojunction exhibits photoluminescence intensity that is tunable as a function of the thickness of the polymer layer, ultimately enabling complete quenching of the TMDC photoluminescence. The strong optical absorption in the TMDC-polymer heterojunction produces an internal quantum efficiency exceeding 40% for an overall cell thickness of less than 20 nm, resulting in exceptional current density per absorbing thickness in comparison to other organic and inorganic solar cells. Furthermore, this work provides insight into the recombination processes in type-II TMDC-polymer heterojunctions and thus provides quantitative guidance to ongoing efforts to realize efficient TMDC-based solar cells.

  12. COMPARISON OF GENETIC METHODS TO OPTICAL METHODS IN THE IDENTIFICATION AND ASSESSMENT OF MOLD IN THE BUILT ENVIRONMENT -- COMPARISON OF TAQMAN AND MICROSCOPIC ANALYSIS OF CLADOSPORIUM SPORES RETRIEVED FROM ZEFON AIR-O-CELL TRACES

    EPA Science Inventory

    Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.

    In this pilot study, quantitative...

  13. Understanding Knowledge-Sharing Breakdowns: A Meeting of the Quantitative and Qualitative Minds

    ERIC Educational Resources Information Center

    Soller, Amy

    2004-01-01

    The rapid advance of distance learning and networking technology has enabled universities and corporations to reach out and educate students across time and space barriers. Although this technology enables structured collaborative learning activities, online groups often do not enjoy the same benefits as face-to-face learners, and their…

  14. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  15. Quantitative prediction of phase transformations in silicon during nanoindentation

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Basak, Animesh

    2013-08-01

    This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of the pop-out size and the depth of the nanoindentation impression. This simple formula enables fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which was previously impossible.

  16. Congruent climate-related genecological responses from molecular markers and quantitative traits for western white pine (Pinus monticola)

    Treesearch

    Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim

    2009-01-01

    Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...

  17. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
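
    The core comparison that LipidQC visualizes can be sketched as a zeta-type score of an experimental concentration against the consensus mean and its uncertainty; a score within roughly ±2 suggests agreement with the benchmark. The function name and inputs below are illustrative, not the tool's actual API:

```python
def zeta_score(measured, consensus_mean, consensus_sd, measured_sd=0.0):
    """Deviation of an experimental SRM 1950 lipid concentration from the
    interlaboratory consensus mean, in combined standard uncertainties."""
    denom = (consensus_sd ** 2 + measured_sd ** 2) ** 0.5
    return (measured - consensus_mean) / denom

# Hypothetical: lab measures 12 nmol/mL against a consensus of 10 +/- 1
print(zeta_score(12.0, 10.0, 1.0))  # 2.0 -> borderline high
```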

  18. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
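
    The ratiometric comparison described above reduces to scaling a standard of known copy number by a fluorescence intensity ratio. A toy sketch with hypothetical background-subtracted intensities:

```python
def proteins_by_ratio(i_unknown, i_standard, n_standard):
    """Ratiometric counting: scale a known copy-number standard by the
    ratio of background-subtracted fluorescence intensities, assuming
    identical fluorophores and imaging conditions."""
    return n_standard * i_unknown / i_standard

# e.g. an unknown spot twice as bright as a 32-copy GFP standard
print(proteins_by_ratio(2000.0, 1000.0, 32))  # 64.0
```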

  19. Advancing Precision Nuclear Medicine and Molecular Imaging for Lymphoma.

    PubMed

    Wright, Chadwick L; Maly, Joseph J; Zhang, Jun; Knopp, Michael V

    2017-01-01

    PET with fluorodeoxyglucose F-18 (18F FDG-PET) is a meaningful biomarker for the detection, targeted biopsy, and treatment of lymphoma. This article reviews the evolution of 18F FDG-PET as a putative biomarker for lymphoma and addresses the current capabilities, challenges, and opportunities to enable precision medicine practices for lymphoma. Precision nuclear medicine is driven by new imaging technologies and methodologies to more accurately detect malignant disease. Although quantitative assessment of response is currently limited, such technologies will enable more precise metabolic mapping with much higher definition image detail and thus may make 18F FDG-PET a robust and valid quantitative response assessment methodology. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. PTMScout, a Web Resource for Analysis of High Throughput Post-translational Proteomics Studies*

    PubMed Central

    Naegle, Kristen M.; Gymrek, Melissa; Joughin, Brian A.; Wagner, Joel P.; Welsch, Roy E.; Yaffe, Michael B.; Lauffenburger, Douglas A.; White, Forest M.

    2010-01-01

    The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu. PMID:20631208
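
    Annotation-enrichment hypothesis generation of the kind PTMScout performs is typically a hypergeometric tail test: given N peptides of which K carry an annotation, how surprising is observing at least k annotated peptides in a subset of size n? A minimal sketch of that statistic (not PTMScout's actual implementation):

```python
from math import comb

def hypergeom_enrichment_p(k, n, K, N):
    """P(X >= k) when drawing n peptides without replacement from N total,
    of which K carry the annotation; small p flags enrichment."""
    total = comb(N, n)
    upper = min(K, n)
    return sum(comb(K, i) * comb(N - K, n - i) for i in range(k, upper + 1)) / total

# Hypothetical: 3 of 10 peptides annotated; a cluster of 5 contains all 3
print(hypergeom_enrichment_p(3, 5, 3, 10))
```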

  1. Qualitative and quantitative two-dimensional thin-layer chromatography/high performance liquid chromatography/diode-array/electrospray-ionization-time-of-flight mass spectrometry of cholinesterase inhibitors.

    PubMed

    Mroczek, Tomasz

    2016-09-10

    The recently launched thin-layer chromatography-mass spectrometry (TLC-MS) interface, which enables extraction of compounds directly from TLC plates into the MS ion source, was extended into a two-dimensional thin-layer chromatography/high performance liquid chromatography (2D TLC/HPLC) system by direct connection to a rapid-resolution 50×2.1 mm I.D. C18 column compartment, followed by detection with a diode array detector (DAD) and electrospray ionisation time-of-flight mass spectrometry (ESI-TOF-MS). In this way, even unseparated bands of complicated mixtures of natural compounds could be analysed structurally within 1-2 min of development of the TLC plates. In comparison to the typically applied TLC-MS interface, no ion suppression for acidic mobile phases was observed, and substantial increases in ESI-TOF-MS sensitivity and spectral quality were noticed. The system has been utilised in combination with TLC-based bioautographic assays of acetylcholinesterase (AChE) inhibitors, but it can also be applied in any other procedure related to bioactivity (e.g. the 2,2-diphenyl-1-picrylhydrazyl (DPPH) screening test for radicals). The system has also been used to determine the half maximal inhibitory concentration (IC50) of the active inhibitor galanthamine, as an example. Moreover, the AChE inhibitory potencies of some purified plant extracts, never studied before, have been quantitatively measured. This is the first report of the use of such a 2D TLC/HPLC/MS system for both qualitative and quantitative evaluation of cholinesterase inhibitors in biological matrices. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Validating internal controls for quantitative plant gene expression studies

    PubMed Central

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-01-01

    Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655
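
    A rough proxy for the stability ranking described above (the study itself uses ANOVA and linear regression) is the coefficient of variation of a candidate reference gene's normalized expression across tissues and conditions: the lower the CV, the more suitable the gene as an internal control. A simplified sketch with invented expression values:

```python
def stability_cv(expression_by_condition):
    """Coefficient of variation (sample SD / mean) of a candidate
    reference gene's expression across conditions; lower = more stable.
    A simplified stand-in for the ANOVA/regression-based measures."""
    vals = expression_by_condition
    n = len(vals)
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return (var ** 0.5) / mean

# Hypothetical normalized expression of two candidates over 3 conditions
print(stability_cv([8.0, 10.0, 12.0]))   # variable candidate
print(stability_cv([9.9, 10.0, 10.1]))   # more stable candidate
```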

  3. A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.

    PubMed

    Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui

    2017-10-01

    Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI) contaminated sites; ii) crude oil contaminated seawater collected after the Jiaozhou Bay oil spill which occurred in 2013. The chromium(VI) contaminated soils were pretreated by water extraction, and directly exposed to the bioreporter in two phases: aqueous soil extraction (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil-particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity surrounding the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as the mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples. Copyright © 2017. Published by Elsevier Ltd.
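
    Expressing a sample's response as a mitomycin C (MMC) equivalent amounts to reading the bioreporter's induction ratio back through an MMC dose-response standard curve. A minimal sketch using linear interpolation; the curve values and units are invented for illustration:

```python
def mmc_equivalent(response, standard_curve):
    """Map a bioreporter induction ratio onto a mitomycin C standard
    curve, given as (dose, response) pairs with monotonically
    increasing response, by linear interpolation between points."""
    pts = sorted(standard_curve)
    for (d0, r0), (d1, r1) in zip(pts, pts[1:]):
        if r0 <= response <= r1:
            return d0 + (d1 - d0) * (response - r0) / (r1 - r0)
    raise ValueError("response outside calibrated range")

# Hypothetical MMC calibration: dose (ng/mL) vs induction ratio
curve = [(0.0, 1.0), (50.0, 3.0), (100.0, 6.0)]
print(mmc_equivalent(4.5, curve))  # 75.0 ng/mL MMC equivalent
```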

  4. 75 FR 68468 - List of Fisheries for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...

  5. Mixed methods research.

    PubMed

    Halcomb, Elizabeth; Hickman, Louise

    2015-04-08

    Mixed methods research involves the use of qualitative and quantitative data in a single research project. It represents an alternative methodological approach, combining qualitative and quantitative research approaches, which enables nurse researchers to explore complex phenomena in detail. This article provides a practical overview of mixed methods research and its application in nursing, to guide the novice researcher considering a mixed methods research project.

  6. Quantitative evaluation of treatment related changes on multi-parametric MRI after laser interstitial thermal therapy of prostate cancer

    NASA Astrophysics Data System (ADS)

    Viswanath, Satish; Toth, Robert; Rusu, Mirabela; Sperling, Dan; Lepor, Herbert; Futterer, Jurgen; Madabhushi, Anant

    2013-03-01

    Laser interstitial thermal therapy (LITT) has recently shown great promise as a treatment strategy for localized, focal, low-grade, organ-confined prostate cancer (CaP). Additionally, LITT is compatible with multi-parametric magnetic resonance imaging (MP-MRI) which in turn enables (1) high resolution, accurate localization of ablation zones on in vivo MP-MRI prior to LITT, and (2) real-time monitoring of temperature changes in vivo via MR thermometry during LITT. In spite of rapidly increasing interest in the use of LITT for treating low grade, focal CaP, very little is known about treatment-related changes following LITT. There is thus a clear need for studying post-LITT changes via MP-MRI and consequently to attempt to (1) quantitatively identify MP-MRI markers predictive of favorable treatment response and longer term patient outcome, and (2) identify which MP-MRI markers are most sensitive to post-LITT changes in the prostate. In this work, we present the first attempt at examining focal treatment-related changes on a per-voxel basis (high resolution) via quantitative evaluation of MR parameters pre- and post-LITT. A retrospective cohort of MP-MRI data comprising both pre- and post-LITT T2-weighted (T2w) and diffusion-weighted (DWI) acquisitions was considered, where DWI MRI yielded an Apparent Diffusion Co-efficient (ADC) map. A spatially constrained affine registration scheme was implemented to first bring T2w and ADC images into alignment within each of the pre- and post-LITT acquisitions, following which the pre- and post-LITT acquisitions were aligned. Pre- and post-LITT MR parameters (T2w intensity, ADC value) were then standardized to a uniform scale (to correct for intensity drift) and quantified via the raw intensity values as well as via texture features derived from T2w MRI. In order to quantify imaging changes as a result of LITT, absolute differences were calculated between the normalized pre- and post-LITT MRI parameters. Quantitatively combining the ADC and T2w MRI parameters enabled construction of an integrated MP-MRI difference map that was highly indicative of changes specific to the LITT ablation zone. Preliminary quantitative comparison of the changes in different MR parameters indicated that T2w texture may be highly sensitive as well as specific in identifying changes within the ablation zone pre- and post-LITT. Visual evaluation of the differences in T2w texture features pre- and post-LITT also appeared to provide an indication of LITT-related effects such as edema. Our preliminary results thus indicate great potential for non-invasive MP-MRI imaging markers for determining focal treatment related changes, and hence long- and short-term patient outcome.
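
    The per-voxel pipeline described (standardize each parameter, take absolute pre/post differences, fuse the parameter maps) can be sketched on flattened voxel lists. The mean-based standardization and the equal fusion weight are simplifying assumptions, not the paper's actual standardization or fusion scheme:

```python
def difference_map(pre, post):
    """Per-voxel absolute difference of an intensity-standardized MR
    parameter (flattened voxel lists); toy standardization divides by
    each image's mean to mimic drift correction."""
    def standardize(img):
        m = sum(img) / len(img)
        return [v / m for v in img]
    p, q = standardize(pre), standardize(post)
    return [abs(a - b) for a, b in zip(p, q)]

def combined_map(adc_diff, t2w_diff, w=0.5):
    """Weighted fusion of two parameter difference maps into a single
    multi-parametric map (equal weighting is an assumption)."""
    return [w * a + (1 - w) * b for a, b in zip(adc_diff, t2w_diff)]

adc = difference_map([1.0, 3.0, 2.0], [3.0, 1.0, 2.0])
t2w = difference_map([2.0, 2.0, 2.0], [2.0, 2.0, 2.0])
print(combined_map(adc, t2w))
```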

  7. Quantitative secondary electron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    Quantitative Secondary Electron Detection (QSED) using an array of solid state device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples (FIG. 3). Methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid state detectors: the array senses secondary electrons with a plurality of solid state detectors, and a time-to-digital converter circuit in counter mode counts the number of secondary electrons.

  8. Generation of High-Quality SWATH® Acquisition Data for Label-free Quantitative Proteomics Studies Using TripleTOF® Mass Spectrometers

    PubMed Central

    Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.

    2017-01-01

    Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533

  9. High-resolution quantitative determination of dielectric function by using scattering scanning near-field optical microscopy

    PubMed Central

    Tranca, D. E.; Stanciu, S. G.; Hristu, R.; Stoichita, C.; Tofail, S. A. M.; Stanciu, G. A.

    2015-01-01

    A new method for high-resolution quantitative measurement of the dielectric function by using scattering scanning near-field optical microscopy (s-SNOM) is presented. The method is based on a calibration procedure that uses the s-SNOM oscillating dipole model of the probe-sample interaction and quantitative s-SNOM measurements. The nanoscale capabilities of the method have the potential to enable novel applications in various fields such as nano-electronics, nano-photonics, biology or medicine. PMID:26138665

  10. A comparison of three fiber tract delineation methods and their impact on white matter analysis.

    PubMed

    Sydnor, Valerie J; Rivas-Grajales, Ana María; Lyall, Amanda E; Zhang, Fan; Bouix, Sylvain; Karmacharya, Sarina; Shenton, Martha E; Westin, Carl-Fredrik; Makris, Nikos; Wassermann, Demian; O'Donnell, Lauren J; Kubicki, Marek

    2018-05-19

    Diffusion magnetic resonance imaging (dMRI) is an important method for studying white matter connectivity in the brain in vivo in both healthy and clinical populations. Improvements in dMRI tractography algorithms, which reconstruct macroscopic three-dimensional white matter fiber pathways, have allowed for methodological advances in the study of white matter; however, insufficient attention has been paid to comparing post-tractography methods that extract white matter fiber tracts of interest from whole-brain tractography. Here we conduct a comparison of three representative and conceptually distinct approaches to fiber tract delineation: 1) a manual multiple region of interest-based approach, 2) an atlas-based approach, and 3) a groupwise fiber clustering approach, by employing methods that exemplify these approaches to delineate the arcuate fasciculus, the middle longitudinal fasciculus, and the uncinate fasciculus in 10 healthy male subjects. We enable qualitative comparisons across methods, conduct quantitative evaluations of tract volume, tract length, mean fractional anisotropy, and true positive and true negative rates, and report measures of intra-method and inter-method agreement. We discuss methodological similarities and differences between the three approaches and the major advantages and drawbacks of each, and review research and clinical contexts for which each method may be most apposite. Emphasis is given to the means by which different white matter fiber tract delineation approaches may systematically produce variable results, despite utilizing the same input tractography and reliance on similar anatomical knowledge. Copyright © 2018. Published by Elsevier Inc.
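
    One common measure of the inter-method spatial agreement evaluated in studies like this is the Dice coefficient between the binary voxel masks produced by two delineation methods. A minimal sketch on flattened masks (illustrative, not the paper's exact metric set):

```python
def dice(mask_a, mask_b):
    """Dice overlap between two binary tract masks (flattened voxel
    lists of 0/1): 1.0 = identical delineations, 0.0 = no shared voxels."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    return 2.0 * inter / (sum(mask_a) + sum(mask_b))

# Hypothetical 4-voxel masks from two delineation methods
print(dice([1, 1, 0, 1], [1, 0, 0, 1]))  # 2*2/(3+2) = 0.8
```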

  11. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
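    The deflection-to-force conversion discussed here has a well-known small-amplitude limit in which the frequency shift is proportional to the tip-sample force gradient. A minimal sketch under that assumption (cantilever parameters hypothetical; the paper itself derives the globally valid formulae):

```python
def force_gradient_small_amplitude(delta_f, f0, k):
    """Small-amplitude FM-AFM limit: delta_f = -(f0 / (2*k)) * dF/dz,
    so the force gradient is recovered as dF/dz = -2*k*delta_f / f0."""
    return -2.0 * k * delta_f / f0

# Hypothetical numbers: f0 = 300 kHz, k = 40 N/m, frequency shift of -50 Hz
grad = force_gradient_small_amplitude(-50.0, 300e3, 40.0)
print(grad)  # force gradient in N/m
```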

  12. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and how learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  13. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    PubMed Central

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine and citrulline ratios [= 3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml−1/μMol ml−1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174
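    The abstract uses retention-time shifts for identification; the quantitation step itself typically rests on a calibration curve built from standard injections. A generic least-squares calibration sketch (peak areas and concentrations hypothetical, not the paper's data):

```python
def linear_calibration(areas, concs):
    """Least-squares slope and intercept for a single-analyte calibration curve."""
    n = len(areas)
    mx = sum(areas) / n
    my = sum(concs) / n
    slope = sum((a - mx) * (c - my) for a, c in zip(areas, concs)) / \
            sum((a - mx) ** 2 for a in areas)
    return slope, my - slope * mx

# Hypothetical standard injections: peak area vs known concentration (uM)
areas = [120.0, 240.0, 360.0]
concs = [25.0, 50.0, 75.0]
slope, intercept = linear_calibration(areas, concs)
unknown = slope * 200.0 + intercept   # sample with a peak area of 200
print(round(unknown, 2))
```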

  14. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine and citrulline ratios [= 3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (microMol ml(-1)/microMol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  15. Detailed comparison between DNS and wind tunnel experiment for an airfoil at Re = 20,000 with a view towards control

    NASA Astrophysics Data System (ADS)

    Tank, Joseph; Jacobs, Gustaaf; Spedding, Geoffrey

    2017-11-01

    The reduction in size and weight of electronic devices in recent years has enabled the use of small flying devices that operate at Re < 1.5 x 10^5 for a variety of applications. At these low Re, the boundary layer often separates before the trailing edge, even at low angles of attack, leading to aerodynamic behaviors that are not predicted by classical inviscid theories. There is currently no comprehensive database of airfoil data in this Re regime, where the sensitivity of the boundary layer behavior to small disturbances in the free stream often leads to discrepancies between results generated in different facilities. Here we provide experimental results generated in a wind tunnel with a low turbulence intensity for a NACA 65(1)-412 airfoil at Re = 2 x 10^4. Several unexpected phenomena are observed in force balance results and explanations are proposed based on PIV flow visualization. Qualitative and quantitative comparisons are made with results from a DNS code using higher-order discontinuous Galerkin methods. Internal acoustic forcing at locations dictated by Lagrangian Coherent Structure behavior is explored as a potential closed loop flow control strategy. Support from AFOSR Grant# FA9550-16-1-0392 under Dr Doug Smith is most gratefully acknowledged.
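    The chord-based Reynolds number that defines this regime is Re = U*c/nu. A one-line sketch (the chord length and air viscosity below are assumptions, not values stated in the abstract):

```python
def reynolds_number(velocity, chord, nu=1.46e-5):
    """Chord-based Reynolds number Re = U*c/nu (kinematic viscosity of
    air at roughly 15 C assumed by default)."""
    return velocity * chord / nu

# Hypothetical wind-tunnel condition reproducing Re = 2e4 on a 0.1 m chord
print(round(reynolds_number(2.92, 0.1)))
```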

  16. Benchmarking all-atom simulations using hydrogen exchange

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skinner, John J.; Yu, Wookyung; Gichana, Elizabeth K.

    We are now able to fold small proteins reversibly to their native structures [Lindorff-Larsen K, Piana S, Dror RO, Shaw DE (2011) Science 334(6055):517–520] using long-time molecular dynamics (MD) simulations. Our results indicate that modern force fields can reproduce the energy surface near the native structure. In this paper, to test how well the force fields recapitulate the other regions of the energy surface, MD trajectories for a variant of protein G are compared with data from site-resolved hydrogen exchange (HX) and other biophysical measurements. Because HX monitors the breaking of individual H-bonds, this experimental technique identifies the stability and H-bond content of excited states, thus enabling quantitative comparison with the simulations. Contrary to experimental findings of a cooperative, all-or-none unfolding process, the simulated denatured state ensemble, on average, is highly collapsed with some transient or persistent native 2° structure. The MD trajectories of this protein G variant and other small proteins exhibit excessive intramolecular H-bonding even for the most expanded conformations, suggesting that the force fields require improvements in describing H-bonding and backbone hydration. Finally, these comparisons provide a general protocol for validating the ability of simulations to accurately capture rare structural fluctuations.
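    The HX-to-stability conversion relied on here is commonly expressed, in the EX2 limit, as an apparent free energy of the opening reaction. A sketch of that standard relation (the rates are hypothetical and the EX2 assumption is ours, not stated in the abstract):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def hx_stability(k_int, k_obs, T=298.0):
    """Apparent opening free energy from exchange rates in the EX2 limit:
    dG = R*T*ln(k_int / k_obs), i.e. R*T times the log protection factor."""
    return R * T * math.log(k_int / k_obs)

# Hypothetical rates: intrinsic 10 /s, observed 1e-4 /s -> protection factor 1e5
dG = hx_stability(10.0, 1e-4)
print(dG / 1000.0)  # roughly 28.5 kJ/mol
```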

  17. Inter-laboratory analysis of selected genetically modified plant reference materials with digital PCR.

    PubMed

    Dobnik, David; Demšar, Tina; Huber, Ingrid; Gerdes, Lars; Broeders, Sylvia; Roosens, Nancy; Debode, Frederic; Berben, Gilbert; Žel, Jana

    2018-01-01

    Digital PCR (dPCR), as a new technology in the field of genetically modified (GM) organism (GMO) testing, enables determination of absolute target copy numbers. The purpose of our study was to test the transferability of methods designed for quantitative PCR (qPCR) to dPCR and to carry out an inter-laboratory comparison of the performance of two different dPCR platforms when determining the absolute GM copy numbers and GM copy number ratio in reference materials certified for GM content in mass fraction. Overall results in terms of measured GM% were within acceptable variation limits for both tested dPCR systems. However, the determined absolute copy numbers for individual genes or events showed higher variability between laboratories in one third of the cases, most likely due to variability in the technical work, droplet size variability, and analysis of the raw data. GMO quantification with dPCR and qPCR was comparable. As methods originally designed for qPCR performed well in dPCR systems, already validated qPCR assays can generally be used for dPCR technology with the purpose of GMO detection. Graphical abstract The output of three different PCR-based platforms was assessed in an inter-laboratory comparison.
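    Absolute copy numbers in dPCR follow from Poisson statistics on the fraction of negative partitions. A minimal sketch of that standard calculation (partition counts and volume hypothetical, not data from the study):

```python
import math

def copies_per_partition(negative, total):
    """Poisson correction: mean target copies per partition from the
    fraction of partitions with no amplification."""
    return -math.log(negative / total)

def copies_per_ul(negative, total, partition_volume_nl):
    """Absolute concentration in copies per microliter."""
    return copies_per_partition(negative, total) * 1000.0 / partition_volume_nl

# Hypothetical droplet counts for a transgene assay and a reference-gene assay
gm_conc  = copies_per_ul(12000, 15000, 0.85)
ref_conc = copies_per_ul(8000, 15000, 0.85)
print(round(100 * gm_conc / ref_conc, 1))  # GM% as a copy-number ratio
```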

  18. Extracting morphologies from third harmonic generation images of structurally normal human brain tissue.

    PubMed

    Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C

    2017-06-01

    The morphologies contained in 3D third harmonic generation (THG) images of human brain tissue can report on the pathological state of the tissue. However, the complexity of THG brain images makes the use of modern image processing tools, especially those of image filtering, segmentation and validation, to extract this information challenging. We developed a salient edge-enhancing model of anisotropic diffusion for image filtering, based on higher order statistics. We split the intrinsic 3-phase segmentation problem into two 2-phase segmentation problems, each of which we solved with a dedicated model, active contour weighted by prior extreme. We applied the novel proposed algorithms to THG images of structurally normal ex-vivo human brain tissue, revealing key tissue components (brain cells, microvessels and neuropil) and enabling statistical characterization of these components. Comprehensive comparison to manually delineated ground truth validated the proposed algorithms. Quantitative comparison to second harmonic generation/auto-fluorescence images, acquired simultaneously from the same tissue area, confirmed the correctness of the main THG features detected. The software and test datasets are available from the authors. z.zhang@vu.nl. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
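    Validation against manually delineated ground truth of the kind described here is commonly summarized with voxel-wise sensitivity and specificity. A small sketch on hypothetical voxel sets (not data from the paper):

```python
def sensitivity_specificity(predicted, truth, universe):
    """Voxel-wise validation of a segmentation against a manual ground truth,
    with all three arguments given as sets of voxel indices."""
    tp = len(predicted & truth)
    fn = len(truth - predicted)
    fp = len(predicted - truth)
    tn = len(universe - predicted - truth)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 4x4 image: ground-truth object vs algorithm output
universe  = {(x, y) for x in range(4) for y in range(4)}
truth     = {(0, 0), (0, 1), (1, 0), (1, 1)}
predicted = {(0, 0), (0, 1), (1, 0), (2, 2)}

sens, spec = sensitivity_specificity(predicted, truth, universe)
print(sens, spec)
```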

  19. Benchmarking all-atom simulations using hydrogen exchange

    DOE PAGES

    Skinner, John J.; Yu, Wookyung; Gichana, Elizabeth K.; ...

    2014-10-27

    We are now able to fold small proteins reversibly to their native structures [Lindorff-Larsen K, Piana S, Dror RO, Shaw DE (2011) Science 334(6055):517–520] using long-time molecular dynamics (MD) simulations. Our results indicate that modern force fields can reproduce the energy surface near the native structure. In this paper, to test how well the force fields recapitulate the other regions of the energy surface, MD trajectories for a variant of protein G are compared with data from site-resolved hydrogen exchange (HX) and other biophysical measurements. Because HX monitors the breaking of individual H-bonds, this experimental technique identifies the stability and H-bond content of excited states, thus enabling quantitative comparison with the simulations. Contrary to experimental findings of a cooperative, all-or-none unfolding process, the simulated denatured state ensemble, on average, is highly collapsed with some transient or persistent native 2° structure. The MD trajectories of this protein G variant and other small proteins exhibit excessive intramolecular H-bonding even for the most expanded conformations, suggesting that the force fields require improvements in describing H-bonding and backbone hydration. Finally, these comparisons provide a general protocol for validating the ability of simulations to accurately capture rare structural fluctuations.

  20. Effects of calibration methods on quantitative material decomposition in photon-counting spectral computed tomography using a maximum a posteriori estimator.

    PubMed

    Curtis, Tyler E; Roeder, Ryan K

    2017-10-01

    Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. 
Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in magnitude by comparison. The material basis matrix calibration was more sensitive to changes in the calibration methods than the scaling factor calibration. The material basis matrix calibration significantly influenced both the quantitative and spatial accuracy of material decomposition, while the scaling factor calibration influenced quantitative but not spatial accuracy. Importantly, the median RMSE of material decomposition was as low as ~1.5 mM (~0.24 mg/mL gadolinium), which was similar in magnitude to that measured by optical spectroscopy on the same samples. The accuracy of quantitative material decomposition in photon-counting spectral CT was significantly influenced by calibration methods which must therefore be carefully considered for the intended diagnostic imaging application. © 2017 American Association of Physicists in Medicine.
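    The decomposition step can be illustrated, in simplified form, as a linear inversion of the calibrated material basis matrix. The study itself uses a maximum a posteriori estimator, so the plain least-squares solve below is only a sketch, with all attenuation values hypothetical:

```python
def decompose_two_materials(mu, b1, b2):
    """Least-squares image-domain decomposition for two basis materials:
    model mu_i = c1*b1_i + c2*b2_i per energy bin and solve the 2x2
    normal equations (B^T B) c = B^T mu by hand."""
    a11 = sum(x * x for x in b1)
    a12 = sum(x * y for x, y in zip(b1, b2))
    a22 = sum(y * y for y in b2)
    r1 = sum(x * m for x, m in zip(b1, mu))
    r2 = sum(y * m for y, m in zip(b2, mu))
    det = a11 * a22 - a12 * a12
    return (a22 * r1 - a12 * r2) / det, (a11 * r2 - a12 * r1) / det

# Hypothetical basis attenuation per unit concentration over three energy bins
gado  = [0.30, 0.50, 0.20]    # gadolinium, middle bin leveraging the k-edge
water = [1.00, 0.80, 0.60]
mu    = [0.30 * 10 + 1.00, 0.50 * 10 + 0.80, 0.20 * 10 + 0.60]  # 10 mM Gd in water

c_gd, c_w = decompose_two_materials(mu, gado, water)
print(round(c_gd, 6), round(c_w, 6))
```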

  1. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the 'bridge' between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow abstracts away low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.

  2. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in challenges and differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  3. Attendance at NHS mandatory training sessions.

    PubMed

    Brand, Darren

    2015-02-17

    To identify factors that affect NHS healthcare professionals' attendance at mandatory training sessions. A quantitative approach was used, with a questionnaire sent to 400 randomly selected participants. A total of 122 responses were received, providing a mix of qualitative and quantitative data. Quantitative data were analysed using statistical methods. Open-ended responses were reviewed using thematic analysis. Clinical staff value mandatory training sessions highly. They are aware of the requirement to keep practice up-to-date and ensure patient safety remains a priority. However, changes to the delivery format of mandatory training sessions are required to enable staff to participate more easily, as staff are often unable to attend. The delivery of mandatory training should move from classroom-based sessions into the clinical area to maximise participation. Delivery should be assisted by local 'experts' who are able to customise course content to meet local requirements and the requirements of different staff groups. Improved arrangements to provide staff cover, for those attending training, would enable more staff to attend training sessions.

  4. Micro/nano-computed tomography technology for quantitative dynamic, multi-scale imaging of morphogenesis.

    PubMed

    Gregg, Chelsea L; Recknagel, Andrew K; Butcher, Jonathan T

    2015-01-01

    Tissue morphogenesis and embryonic development are dynamic events challenging to quantify, especially considering the intricate events that happen simultaneously in different locations and time. Micro- and more recently nano-computed tomography (micro/nanoCT) has been used for the past 15 years to characterize large 3D fields of tortuous geometries at high spatial resolution. We and others have advanced micro/nanoCT imaging strategies for quantifying tissue- and organ-level fate changes throughout morphogenesis. Exogenous soft tissue contrast media enables visualization of vascular lumens and tissues via extravasation. Furthermore, the emergence of antigen-specific tissue contrast enables direct quantitative visualization of protein and mRNA expression. Micro-CT X-ray doses appear to be non-embryotoxic, enabling longitudinal imaging studies in live embryos. In this chapter we present established soft tissue contrast protocols for obtaining high-quality micro/nanoCT images and the image processing techniques useful for quantifying anatomical and physiological information from the data sets.

  5. High-precision morphology: bifocal 4D-microscopy enables the comparison of detailed cell lineages of two chordate species separated for more than 525 million years.

    PubMed

    Stach, Thomas; Anselmi, Chiara

    2015-12-23

    Understanding the evolution of divergent developmental trajectories requires detailed comparisons of embryologies at appropriate levels. Cell lineages, the accurate visualization of cleavage patterns, tissue fate restrictions, and morphogenetic movements that occur during the development of individual embryos are currently available for few disparate animal taxa, encumbering evolutionarily meaningful comparisons. Tunicates, considered to be close relatives of vertebrates, are marine invertebrates whose fossil record dates back to 525 million years ago. Life-history strategies across this subphylum are radically different, and include biphasic ascidians with free swimming larvae and a sessile adult stage, and the holoplanktonic larvaceans. Despite considerable progress, notably on the molecular level, the exact extent of evolutionary conservation and innovation during embryology remain obscure. Here, using the innovative technique of bifocal 4D-microscopy, we demonstrate exactly which characteristics in the cell lineages of the ascidian Phallusia mammillata and the larvacean Oikopleura dioica were conserved and which were altered during evolution. Our accurate cell lineage trees in combination with detailed three-dimensional representations clearly identify conserved correspondence in relative cell position, cell identity, and fate restriction in several lines from all prospective larval tissues. At the same time, we precisely pinpoint differences observable at all levels of development. These differences comprise fate restrictions, tissue types, complex morphogenetic movement patterns, numerous cases of heterochronous acceleration in the larvacean embryo, and differences in bilateral symmetry. Our results demonstrate in extraordinary detail the multitude of developmental levels amenable to evolutionary innovation, including subtle changes in the timing of fate restrictions as well as dramatic alterations in complex morphogenetic movements. 
We anticipate that the precise spatial and temporal cell lineage data will moreover serve as a high-precision guide to devise experimental investigations of other levels, such as molecular interactions between cells or changes in gene expression underlying the documented structural evolutionary changes. Finally, the quantitative amount of digital high-precision morphological data will enable and necessitate software-based similarity assessments as the basis of homology hypotheses.

  6. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the criteria and indices chosen, so the result varies accordingly. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
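    One widely used quantitative index of the kind discussed is the Wang-Bovik universal image quality index, which combines correlation, luminance and contrast distortion in a single score. A sketch on hypothetical flattened pixel values (the protocols compared in the paper, QNR and Khan's, are more elaborate):

```python
def quality_index(x, y):
    """Wang-Bovik universal image quality index between two pixel sequences:
    Q = 4*cov*mx*my / ((vx + vy) * (mx^2 + my^2)), equal to 1 for identical images."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

# Hypothetical pixel values: reference band vs fused band
ref   = [52.0, 60.0, 58.0, 66.0]
fused = [50.0, 59.0, 57.0, 64.0]
print(round(quality_index(ref, fused), 4))
```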

  7. Quantitation of Staphylococcus aureus in Seawater Using CHROMagar™ SA

    PubMed Central

    Pombo, David; Hui, Jennifer; Kurano, Michelle; Bankowski, Matthew J; Seifried, Steven E

    2010-01-01

    A microbiological algorithm has been developed to analyze beach water samples for the determination of viable colony forming units (CFU) of Staphylococcus aureus (S. aureus). Membrane filtration enumeration of S. aureus from recreational beach waters using the chromogenic media CHROMagar™SA alone yields a positive predictive value (PPV) of 70%. Presumptive CHROMagar™SA colonies were confirmed as S. aureus by 24-hour tube coagulase test. Combined, these two tests yield a PPV of 100%. This algorithm enables accurate quantitation of S. aureus in seawater in 72 hours and could support risk-prediction processes for recreational waters. A more rapid protocol, utilizing a 4-hour tube coagulase confirmatory test, enables a 48-hour turnaround time with a modest false negative rate of less than 10%. PMID:20222490
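    The PPV figures quoted follow directly from counts of confirmed and unconfirmed presumptive colonies. A trivial sketch (the counts are chosen to reproduce the 70% and 100% figures, not taken from the study):

```python
def positive_predictive_value(true_pos, false_pos):
    """Fraction of presumptive positives that are confirmed S. aureus."""
    return true_pos / (true_pos + false_pos)

# Hypothetical colony counts: chromogenic agar alone vs with coagulase confirmation
print(positive_predictive_value(70, 30))  # 0.7, agar alone
print(positive_predictive_value(70, 0))   # 1.0, after the 24-hour tube coagulase test
```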

  8. High pressure rinsing system comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Sertore; M. Fusetti; P. Michelato

    2007-06-01

    High pressure rinsing (HPR) is a key process for the surface preparation of high field superconducting cavities. A portable apparatus for water jet characterization, based on the momentum transferred between the water jet and a load cell, has been used in different laboratories. This apparatus allows the collection of quantitative parameters that characterize the HPR water jet. In this paper, we present a quantitative comparison of the different water jets produced by the various nozzles routinely used in different laboratories for the HPR process.
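    The load-cell measurement rests on the jet's momentum flux: assuming full momentum transfer, the steady force is F = rho*Q*v with jet velocity v = Q/A. A sketch with hypothetical flow rate and nozzle diameter (not values from the paper):

```python
import math

def jet_force(flow_l_per_min, nozzle_diameter_mm, rho=1000.0):
    """Momentum flux of a water jet striking a load cell normal to the flow,
    assuming full momentum transfer: F = rho * Q * v, with v = Q / A."""
    q = flow_l_per_min / 1000.0 / 60.0                    # volumetric flow, m^3/s
    area = math.pi * (nozzle_diameter_mm / 2000.0) ** 2   # orifice area, m^2
    v = q / area                                          # jet velocity, m/s
    return rho * q * v

# Hypothetical HPR settings: 2 L/min through a 0.6 mm orifice
print(round(jet_force(2.0, 0.6), 1))  # force in newtons
```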

  9. Field Demonstration Report Applied Innovative Technologies for Characterization of Nitrocellulose- and Nitroglycerine Contaminated Buildings and Soils, Rev 1

    DTIC Science & Technology

    2007-01-05

    positive / false negatives. The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison. [The remaining extracted text consists of table-of-contents fragments: Conclusion; 3.2 Quantitative Analysis Using CRREL; 3.3 Quantitative Analysis for NG by GC/TID; 3.3.1 Introduction]

  10. Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2010-01-01

    This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics to enable the comparison of candidate aircraft engine gas path diagnostic methods. This Matlab (The Mathworks, Inc.) based software tool enables users to independently develop and evaluate diagnostic methods. Additionally, a set of blind test case data is also distributed as part of the software. This will enable the side-by-side comparison of diagnostic approaches developed by multiple users. The Users Guide describes the various components of ProDiMES, and provides instructions for the installation and operation of the tool.

  11. Evaluation of intensity drift correction strategies using MetaboDrift, a normalization tool for multi-batch metabolomics data.

    PubMed

    Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R

    2017-11-10

    In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or using QC-sample independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards. QC sample-based intensity drift correction significantly improved correlation with IS-normalized data, and resulted in detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve fitting strategies are discussed in the context of batch size and drift pattern complexity. 
Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.
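    QC-sample based correction of the kind MetaboDrift implements can be sketched as fitting a trend to QC intensities over injection order and dividing it out. The straight-line fit below is a simplification of the tool's curve-fitting options, and all intensities are hypothetical:

```python
def qc_drift_correction(orders, intensities, qc_orders, qc_intensities):
    """Fit a straight line to QC intensities over injection order, then
    divide each sample intensity by the fitted drift curve, rescaled so
    the mean QC intensity is preserved."""
    n = len(qc_orders)
    mx = sum(qc_orders) / n
    my = sum(qc_intensities) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(qc_orders, qc_intensities)) / \
            sum((x - mx) ** 2 for x in qc_orders)
    intercept = my - slope * mx
    return [i * my / (slope * o + intercept) for o, i in zip(orders, intensities)]

# Hypothetical batch where the signal decays linearly across injections
qc_orders = [1, 10, 20]
qc_intensities = [1000.0, 910.0, 810.0]
corrected = qc_drift_correction([5, 15], [960.0, 860.0], qc_orders, qc_intensities)
print([round(c, 1) for c in corrected])  # both samples land on the QC mean
```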

  12. A reference genetic linkage map of apomictic Hieracium species based on expressed markers derived from developing ovule transcripts

    PubMed Central

    Shirasawa, Kenta; Hand, Melanie L.; Henderson, Steven T.; Okada, Takashi; Johnson, Susan D.; Taylor, Jennifer M.; Spriggs, Andrew; Siddons, Hayley; Hirakawa, Hideki; Isobe, Sachiko; Tabata, Satoshi; Koltunow, Anna M. G.

    2015-01-01

    Background and Aims Apomixis in plants generates clonal progeny with a maternal genotype through asexual seed formation. Hieracium subgenus Pilosella (Asteraceae) contains polyploid, highly heterozygous apomictic and sexual species. Within apomictic Hieracium, dominant genetic loci independently regulate the qualitative developmental components of apomixis. In H. praealtum, LOSS OF APOMEIOSIS (LOA) enables formation of embryo sacs without meiosis and LOSS OF PARTHENOGENESIS (LOP) enables fertilization-independent seed formation. A locus required for fertilization-independent endosperm formation (AutE) has been identified in H. piloselloides. Additional quantitative loci appear to influence the penetrance of the qualitative loci, although the controlling genes remain unknown. This study aimed to develop the first genetic linkage maps for sexual and apomictic Hieracium species using simple sequence repeat (SSR) markers derived from expressed transcripts within the developing ovaries. Methods RNA from microdissected Hieracium ovule cell types and ovaries was sequenced and SSRs were identified. Two different F1 mapping populations were created to overcome difficulties associated with genome complexity and asexual reproduction. SSR markers were analysed within each mapping population to generate draft linkage maps for apomictic and sexual Hieracium species. Key Results A collection of 14 684 Hieracium expressed SSR markers were developed and linkage maps were constructed for Hieracium species using a subset of the SSR markers. Both the LOA and LOP loci were successfully assigned to linkage groups; however, AutE could not be mapped using the current populations. Comparisons with lettuce (Lactuca sativa) revealed partial macrosynteny between the two Asteraceae species. Conclusions A collection of SSR markers and draft linkage maps were developed for two apomictic and one sexual Hieracium species. 
These maps will support cloning of controlling genes at LOA and LOP loci in Hieracium and should also assist with identification of quantitative loci that affect the expressivity of apomixis. Future work will focus on mapping AutE using alternative populations. PMID:25538115

  13. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin-embedded tissues


    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
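
The precision figure reported above (a median coefficient of variation across analytes) can be computed as below; the replicate matrix is simulated, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented replicate measurements: 249 analytes x 5 process replicates
replicates = rng.normal(loc=100.0, scale=10.0, size=(249, 5))

# Per-analyte coefficient of variation, then the median across analytes
cv_percent = 100.0 * replicates.std(axis=1, ddof=1) / replicates.mean(axis=1)
median_cv = float(np.median(cv_percent))
```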

  14. Interpreting comprehensive two-dimensional gas chromatography using peak topography maps with application to petroleum forensics.

    PubMed

    Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M

    2016-01-01

    Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative, compound-cognizant interpretation beyond target compound analysis, with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds than traditional approaches, which consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers, with highest-to-lowest peak ratios within an injection ranging from 4.86 to 19.6 (precise numbers depend on the biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster, which released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit a statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. 
PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide an extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak-location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. A detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. The proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
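
A minimal sketch of the PCA-based comparison the authors benchmark against, using invented peak-height vectors for two hypothetical petroleum sources:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project rows of X onto the leading principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(1)
# Invented "peak height" vectors: 5 injections each from two sources
source_a = rng.normal(0.0, 0.1, (5, 30)) + np.linspace(0.0, 1.0, 30)
source_b = rng.normal(0.0, 0.1, (5, 30)) + np.linspace(1.0, 0.0, 30)
scores = pca_scores(np.vstack([source_a, source_b]))

# Injections from the same source sit close together on PC1;
# injections from different sources sit far apart
within = abs(scores[0, 0] - scores[1, 0])
between = abs(scores[0, 0] - scores[5, 0])
```

Unlike PTM, this treats each injection as an unstructured feature vector, which is exactly the loss of peak-level structure the paper argues against.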

  15. Coupling Reagent for UV/vis Absorbing Azobenzene-Based Quantitative Analysis of the Extent of Functional Group Immobilization on Silica.

    PubMed

    Choi, Ra-Young; Lee, Chang-Hee; Jun, Chul-Ho

    2018-05-18

    A methallylsilane coupling reagent containing both an N-hydroxysuccinimidyl (NHS) ester group and a UV/vis-absorbing azobenzene linker undergoes acid-catalyzed immobilization on silica. Analysis of the UV/vis absorption band associated with the azobenzene group in the adduct enables facile quantitative determination of the extent of loading of the NHS groups. Reaction of NHS groups on the silica surface with amine groups of GOx and rhodamine can be employed to generate enzyme- or dye-immobilized silica for quantitative analysis.
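
Quantitation from the azobenzene absorption band presumably rests on the Beer-Lambert law; a sketch with purely illustrative numbers (the molar absorptivity, volume and silica mass are assumptions, not values from the paper):

```python
# Hypothetical inputs; epsilon and lambda_max are illustrative only
epsilon = 2.5e4      # L mol^-1 cm^-1, assumed for the azobenzene band
path_cm = 1.0        # cuvette path length
absorbance = 0.50    # measured at the azobenzene lambda_max

conc_mol_per_L = absorbance / (epsilon * path_cm)   # Beer-Lambert: A = e*l*c
volume_L = 0.010                                    # 10 mL suspension
silica_g = 0.050                                    # 50 mg silica analyzed
loading_mmol_per_g = conc_mol_per_L * volume_L * 1000.0 / silica_g
```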

  16. A quantitative reconstruction software suite for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
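
The OSEM algorithm mentioned above iterates a multiplicative ML-EM update over subsets of projections. A minimal one-subset ML-EM sketch on a toy system follows; the geometry is invented and the attenuation, scatter and collimator corrections of the actual suite are omitted:

```python
import numpy as np

def mlem(A, counts, n_iter=50):
    """One-subset ML-EM; OSEM applies the same multiplicative
    update while cycling over subsets of the projections."""
    x = np.ones(A.shape[1])                      # uniform initial image
    sens = A.T @ np.ones(A.shape[0])             # sensitivity image
    for _ in range(n_iter):
        proj = A @ x                             # forward projection
        x *= (A.T @ (counts / np.maximum(proj, 1e-12))) / sens
    return x

# Toy 2-pixel, 3-measurement system (invented geometry)
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
truth = np.array([2.0, 3.0])
y = A @ truth                                    # noiseless counts
x_hat = mlem(A, y, n_iter=100)
```

With noiseless, consistent data the iteration converges to the true activity; quantitative SPECT then scales such voxel values by a phantom-derived calibration factor.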

  17. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
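
For the designs without a reference standard described above, agreement and precision statistics such as Bland-Altman limits and the repeatability coefficient are typical tools. A sketch on simulated measurements (all numbers invented, and this is generic methodology, not the paper's specific framework):

```python
import numpy as np

rng = np.random.default_rng(2)
truth = rng.uniform(10.0, 50.0, 100)            # phantom "true" values
algo_a = truth + rng.normal(0.5, 1.0, 100)      # algorithm with small bias
algo_b = truth + rng.normal(0.0, 2.0, 100)      # unbiased but noisier

# Agreement between two algorithms (no reference standard needed)
diff = algo_a - algo_b
bias = diff.mean()                              # Bland-Altman bias
loa = (bias - 1.96 * diff.std(ddof=1),          # 95% limits of agreement
       bias + 1.96 * diff.std(ddof=1))

# Precision of one algorithm from a test-retest (zero-change) design
retest = truth + rng.normal(0.5, 1.0, 100)
wsd = np.sqrt(np.mean((algo_a - retest) ** 2) / 2.0)  # within-subject SD
rc = 2.77 * wsd                                 # repeatability coefficient
```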

  18. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were performed. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate task efficiency during each stage. These analysis methods were evaluated in an experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
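
The task-efficiency quantities described (mean velocity, acceleration, and the approximate ellipse of the location log distribution) might be computed along these lines; the trajectory below is a toy example, not navigation data from the study:

```python
import numpy as np

def instrument_metrics(positions, dt):
    """Speed/acceleration statistics and the area of the 95% covariance
    ellipse for an instrument-tip trajectory (positions: N x 2 array)."""
    v = np.diff(positions, axis=0) / dt
    speed = np.linalg.norm(v, axis=1)
    accel = np.linalg.norm(np.diff(v, axis=0) / dt, axis=1)
    cov = np.cov(positions.T)
    eigvals = np.linalg.eigvalsh(cov)
    ellipse_area = np.pi * 5.991 * np.sqrt(eigvals.prod())  # chi2(2), 95%
    return speed.mean(), accel.mean(), ellipse_area

# Toy trajectory: straight line at a constant 2 mm/s
t = np.arange(0.0, 1.0, 0.1)
path = np.column_stack([2.0 * t, np.zeros_like(t)])
mean_speed, mean_accel, area = instrument_metrics(path, dt=0.1)
```

A straight constant-speed path gives zero acceleration and a degenerate (zero-area) ellipse; a skilled surgeon's log would show low but nonzero values of all three.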

  19. Automation of the ELISpot assay for high-throughput detection of antigen-specific T-cell responses.

    PubMed

    Almeida, Coral-Ann M; Roberts, Steven G; Laird, Rebecca; McKinnon, Elizabeth; Ahmed, Imran; Pfafferott, Katja; Turley, Joanne; Keane, Niamh M; Lucas, Andrew; Rushton, Ben; Chopra, Abha; Mallal, Simon; John, Mina

    2009-05-15

    The enzyme-linked immunospot (ELISpot) assay is a fundamental tool in cellular immunology, providing both quantitative and qualitative information on cellular cytokine responses to defined antigens. It enables the comprehensive screening of patient-derived peripheral blood mononuclear cells to reveal the antigenic restriction of T-cell responses and is an emerging technique in clinical laboratory investigation of certain infectious diseases. As with all cellular-based assays, the final results of the assay depend on a number of technical variables that may impact precision if not highly standardised between operators. When large-scale studies or studies using multiple antigens are set up manually, these assays may be labour intensive, involve many manual handling steps, be subject to data and sample integrity failures, and show large inter-operator variability. Here we describe the successful automated performance of the interferon (IFN)-gamma ELISpot assay, from cell counting through to electronic capture of cytokine quantitation, and present the results of a comparison between automated and manual performance of the ELISpot assay. The mean numbers of spot-forming units enumerated by both methods for limiting dilutions of CMV-, EBV- and influenza (CEF)-derived peptides in six healthy individuals were highly correlated (r>0.83, p<0.05). The precision results from the automated system compared favourably with the manual ELISpot and further ensured electronic tracking, increased throughput and reduced turnaround time.

  20. PET guidance for liver radiofrequency ablation: an evaluation

    NASA Astrophysics Data System (ADS)

    Lei, Peng; Dandekar, Omkar; Mahmoud, Faaiza; Widlus, David; Malloy, Patrick; Shekhar, Raj

    2007-03-01

    Radiofrequency ablation (RFA) is emerging as the primary mode of treatment of unresectable malignant liver tumors. With current intraoperative imaging modalities, quick, precise, and complete localization of lesions remains a challenge for liver RFA. Fusion of intraoperative CT and preoperative PET images, which relies on PET and CT registration, can produce a new image with complementary metabolic and anatomic data and thus greatly improve targeting accuracy. Unlike neurological images, alignment of abdominal images by a combined PET/CT scanner is prone to errors as a result of large nonrigid misalignment in abdominal images. Our use of a normalized mutual information-based 3D nonrigid registration technique has proven powerful for whole-body PET and CT registration. We demonstrate here that this technique is capable of acceptable abdominal PET and CT registration as well. In five clinical cases, both qualitative and quantitative validation showed that the registration is robust and accurate. Quantitative accuracy was evaluated by comparing the algorithm's results with assessments by clinical experts. The registration error is well within the allowable margin for liver RFA. Study findings show the technique's potential to enable the augmentation of intraoperative CT with preoperative PET to reduce procedure time, avoid repeated procedures, provide clinicians with complementary functional/anatomic maps, avoid omitting dispersed small lesions, and improve the accuracy of tumor targeting in liver RFA.
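
Normalized mutual information, the similarity measure driving the registration, can be estimated from a joint intensity histogram. A sketch (the bin count and the test images are arbitrary, and a real registration would optimize this score over a deformation model):

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI(a, b) = (H(a) + H(b)) / H(a, b), from a joint histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

rng = np.random.default_rng(3)
img = rng.random((64, 64))
nmi_self = normalized_mutual_information(img, img)            # perfect match
nmi_noise = normalized_mutual_information(img, rng.random((64, 64)))
```

Identical images give the maximal value of 2, while statistically independent images give values near the minimum of 1, which is why maximizing NMI aligns multi-modal PET and CT volumes.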

  1. Fast analysis of wood preservers using laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Uhl, A.; Loebe, K.; Kreuchwig, L.

    2001-06-01

    Laser-induced breakdown spectroscopy (LIBS) is used for the investigation of wood preservers in timber and in furniture. Both laboratory experiments and practical applications in recycling facilities and on a building site demonstrate the new possibilities for fast detection of harmful agents in wood. A commercial system was developed for mobile laser-plasma analysis as well as for industrial use in sorting plants. The universal measuring principle, in combination with Echelle optics, permits truly simultaneous multi-element analysis in the range of 200-780 nm with a resolution of a few picometers. It enables the user to detect major and trace elements in wood within a few seconds, nearly independent of the matrix, given that different kinds of wood show an equal elemental composition. Sample preparation is not required. The quantitative analysis of inorganic wood preservers (containing, e.g. Cu, Cr, B, As, Pb, Hg) was performed using carbon as the reference element. It can be shown that the detection limits for heavy metals in wood are in the ppm range. Additional information is given concerning the quantitative analysis. Statistical data, e.g. the standard deviation (S.D.), were determined and calibration curves were used for each particular element. A comparison between ICP-AES and LIBS is given using depth-profile correction factors that account for the different penetration depths, and thus the different volumes of wood, analyzed by the two analytical methods.
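
Internal-standard calibration with carbon as the reference line reduces to a linear fit of intensity ratio versus concentration. The standards and intensity ratios below are invented, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards: a Cu line intensity ratioed to the
# C reference line suppresses shot-to-shot and matrix variation
cu_ppm = np.array([10.0, 50.0, 100.0, 250.0, 500.0])
icu_over_ic = np.array([0.021, 0.102, 0.199, 0.502, 1.001])  # illustrative

slope, intercept = np.polyfit(cu_ppm, icu_over_ic, 1)

# Quantify an unknown sample from its measured intensity ratio
unknown_ratio = 0.30
unknown_ppm = (unknown_ratio - intercept) / slope
```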

  2. Phase-contrast Hounsfield units of fixated and non-fixated soft-tissue samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willner, Marian; Fior, Gabriel; Marschner, Mathias

    X-ray phase-contrast imaging is a novel technology that achieves high soft-tissue contrast. Although its clinical impact is still under investigation, the technique may potentially improve clinical diagnostics. In conventional attenuation-based X-ray computed tomography, radiological diagnostics are quantified by Hounsfield units. Corresponding Hounsfield units for phase-contrast imaging have been recently introduced, enabling a setup-independent comparison and standardized interpretation of imaging results. Thus far, the experimental values of few tissue types have been reported; these values have been determined from fixated tissue samples. This study presents phase-contrast Hounsfield units for various types of non-fixated human soft tissues. A large variety of tissue specimens ranging from adipose, muscle and connective tissues to liver, kidney and pancreas tissues were imaged by a grating interferometer with a rotating-anode X-ray tube and a photon-counting detector. In addition, we investigated the effects of formalin fixation on the quantitative phase-contrast imaging results.
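
Phase-contrast Hounsfield units are, as commonly defined in the literature (this definition is not quoted from the abstract itself), constructed in analogy to attenuation-based HU, with the refractive index decrement δ taking the place of the attenuation coefficient μ:

```latex
\mathrm{HU}_{\mu} = \frac{\mu - \mu_{\mathrm{water}}}{\mu_{\mathrm{water}}} \times 1000,
\qquad
\mathrm{HU}_{\mathrm{p}} = \frac{\delta - \delta_{\mathrm{water}}}{\delta_{\mathrm{water}}} \times 1000
```

Because both scales are referenced to water, values measured on different phase-contrast setups become directly comparable, which is the setup independence the abstract refers to.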

  3. A biomechanical modeling guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2017-03-01

    Four-dimensional (4D) cone-beam computed tomography (CBCT) enables motion tracking of anatomical structures and removes artifacts introduced by motion. However, the imaging time/dose of 4D-CBCT is substantially longer/higher than that of traditional 3D-CBCT. We previously developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm to reconstruct high-quality 4D-CBCT from a limited number of projections to reduce the imaging time/dose. However, the accuracy of SMEIR is limited in reconstructing low-contrast regions with fine structure details. In this study, we incorporate biomechanical modeling into the SMEIR algorithm (SMEIR-Bio) to improve the reconstruction accuracy at low-contrast regions with fine details. The efficacy of SMEIR-Bio is evaluated using 11 lung patient cases and compared to that of the original SMEIR algorithm. Qualitative and quantitative comparisons showed that SMEIR-Bio greatly enhances the accuracy of the reconstructed 4D-CBCT volume in low-contrast regions, which can potentially benefit multiple clinical applications including treatment outcome analysis.

  4. Design of an Electric Propulsion System for SCEPTOR

    NASA Technical Reports Server (NTRS)

    Dubois, Arthur; van der Geest, Martin; Bevirt, JoeBen; Clarke, Sean; Christie, Robert J.; Borer, Nicholas K.

    2016-01-01

    The rise of electric propulsion systems has pushed aircraft designers towards new and potentially transformative concepts. As part of this effort, NASA is leading the SCEPTOR program which aims at designing a fully electric distributed propulsion general aviation aircraft. This article highlights critical aspects of the design of SCEPTOR's propulsion system conceived at Joby Aviation in partnership with NASA, including motor electromagnetic design and optimization as well as cooling system integration. The motor is designed with a finite element based multi-objective optimization approach. This provides insight into important design tradeoffs such as mass versus efficiency, and enables a detailed quantitative comparison between different motor topologies. Secondly, a complete design and Computational Fluid Dynamics analysis of the air breathing cooling system is presented. The cooling system is fully integrated into the nacelle, contains little to no moving parts and only incurs a small drag penalty. Several concepts are considered and compared over a range of operating conditions. The study presents trade-offs between various parameters such as cooling efficiency, drag, mechanical simplicity and robustness.

  5. Relationship of Interplanetary Shock Micro and Macro Characteristics: A Wind Study

    NASA Technical Reports Server (NTRS)

    Szabo, Adam; Koval, A

    2008-01-01

    The non-linear least-squares MHD fitting technique of Szabo [1994] has recently been further refined to provide realistic confidence regions for interplanetary shock normal directions and speeds. Analyzing Wind-observed interplanetary shocks from 1995 to 2001, macro characteristics such as shock strength, theta-Bn and Mach numbers can be compared to the details of shock micro or kinetic structures. The now commonly available very high time resolution (11 or 22 vectors/sec) Wind magnetic field data allows the precise characterization of shock kinetic structures, such as the size of the foot, ramp, overshoot and the duration of damped oscillations on either side of the shock. A detailed comparison of the shock micro and macro characteristics will be given. This enables the elucidation of shock kinetic features, relevant for particle energization processes, for observations where high time resolution data are not available. Moreover, establishing a quantitative relationship between the shock micro and macro structures will improve the confidence level of shock fitting techniques during disturbed solar wind conditions.

  6. A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data

    DOE PAGES

    Fan, Ya Ju; Kamath, Chandrika

    2016-09-01

    The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near-perfect reconstruction over a range of data with varying sparsity.
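
One common sparse-recovery baseline of the kind compared in such studies is orthogonal matching pursuit. A self-contained sketch on a synthetic sparse signal (the matrix sizes, sparsity level and seed are arbitrary, and this is generic CS machinery, not the paper's pipeline):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily build the support of a
    k-sparse x with y ~= A @ x, re-solving least squares each step."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        cols = A[:, support]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
        residual = y - cols @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(4)
A = rng.normal(size=(64, 128)) / np.sqrt(64)  # random sensing matrix
x_true = np.zeros(128)
x_true[[5, 40, 77]] = [1.0, -2.0, 1.5]        # 3-sparse "simulation" signal
y = A @ x_true                                # compressed measurements
x_rec = omp(A, y, k=3)
```

The 64 measurements stand in for the reduced data written to disk; recovery quality degrades as the sufficient-sparsity condition is violated, which is the trade-off the paper investigates.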

  8. Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.

    PubMed

    Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N

    2015-04-01

    Automated assessment of hepatic fat-fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat-fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources from which the images were acquired. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses single statistical atlas registration to initialize a robust deformable model to obtain a fine segmentation. A fat-fraction map is computed by using a chemical shift-based method in the delineated region of the liver. The proposed method is validated on 14 abdominal magnetic resonance (MR) volumetric scans. Qualitative and quantitative comparisons show that our proposed method can achieve better segmentation accuracy with less variance compared with two other atlas-based methods. Experimental results demonstrate the promise of our assessment framework. Copyright © 2014 Elsevier Ltd. All rights reserved.
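
In its simplest two-point form, the chemical shift-based fat-fraction reduces to a voxel-wise ratio of the fat signal to the total signal. A toy sketch (the water/fat images and mask are invented, and confounders such as T1 bias and T2* decay that a real pipeline corrects for are ignored):

```python
import numpy as np

# Invented water/fat images from a two-point chemical shift (Dixon) scan
water = np.array([[80.0, 60.0],
                  [90.0, 20.0]])
fat = np.array([[20.0, 40.0],
                [10.0, 80.0]])

fat_fraction = fat / (water + fat)          # voxel-wise fat-fraction map

# Mean fat-fraction inside a (toy) liver segmentation mask
liver_mask = np.array([[True, True],
                       [False, False]])
mean_ff = float(fat_fraction[liver_mask].mean())
```

The segmentation quality matters precisely because `liver_mask` gates which voxels enter this average.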

  9. Imaging of oxygen and hypoxia in cell and tissue samples.

    PubMed

    Papkovsky, Dmitri B; Dmitriev, Ruslan I

    2018-05-14

    Molecular oxygen (O2) is a key player in cell mitochondrial function, redox balance and oxidative stress, normal tissue function and many common disease states. Various chemical, physical and biological methods have been proposed for measurement, real-time monitoring and imaging of O2 concentration, states of decreased O2 (hypoxia) and related parameters in cells and tissue. Here, we review the established and emerging optical microscopy techniques that allow visualization of O2 levels in cells and tissue samples, mostly under in vitro and ex vivo, but also under in vivo settings. Particular examples include fluorescent hypoxia stains, fluorescent protein reporter systems, and phosphorescent probes and nanosensors of different types. These techniques allow high-resolution mapping of O2 gradients in live or post-mortem tissue, in 2D or 3D, qualitatively or quantitatively. They enable control and monitoring of oxygenation conditions and their correlation with other biomarkers of cell and tissue function. A comparison of these techniques and the corresponding imaging setups, their analytical capabilities and typical applications is given.
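
Phosphorescent O2 probes of the kind reviewed are typically read out through the Stern-Volmer relation between lifetime and oxygen concentration. A sketch with illustrative calibration constants (tau0 and the quenching constant below are assumptions for illustration, not values from the review):

```python
# Phosphorescence-lifetime oximetry via the Stern-Volmer relation:
#   tau0 / tau = 1 + kq * tau0 * [O2]
tau0_us = 60.0        # lifetime at zero oxygen, microseconds (assumed)
kq = 3.0e-4           # quenching constant, 1/(uM * us) (assumed)
tau_us = 30.0         # measured lifetime in the sample

o2_uM = (tau0_us / tau_us - 1.0) / (kq * tau0_us)
```

Applying this per pixel to a lifetime image yields the quantitative O2 maps the review describes.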

  10. High-Speed and Scalable Whole-Brain Imaging in Rodents and Primates.

    PubMed

    Seiriki, Kaoru; Kasai, Atsushi; Hashimoto, Takeshi; Schulze, Wiebke; Niu, Misaki; Yamaguchi, Shun; Nakazawa, Takanobu; Inoue, Ken-Ichi; Uezono, Shiori; Takada, Masahiko; Naka, Yuichiro; Igarashi, Hisato; Tanuma, Masato; Waschek, James A; Ago, Yukio; Tanaka, Kenji F; Hayata-Takano, Atsuko; Nagayasu, Kazuki; Shintani, Norihito; Hashimoto, Ryota; Kunii, Yasuto; Hino, Mizuki; Matsumoto, Junya; Yabe, Hirooki; Nagai, Takeharu; Fujita, Katsumasa; Matsuda, Toshio; Takuma, Kazuhiro; Baba, Akemichi; Hashimoto, Hitoshi

    2017-06-21

    Subcellular resolution imaging of the whole brain and subsequent image analysis are prerequisites for understanding anatomical and functional brain networks. Here, we have developed a very high-speed serial-sectioning imaging system named FAST (block-face serial microscopy tomography), which acquires high-resolution images of a whole mouse brain in a speed range comparable to that of light-sheet fluorescence microscopy. FAST enables complete visualization of the brain at a resolution sufficient to resolve all cells and their subcellular structures. FAST renders unbiased quantitative group comparisons of normal and disease model brain cells for the whole brain at a high spatial resolution. Furthermore, FAST is highly scalable to non-human primate brains and human postmortem brain tissues, and can visualize neuronal projections in a whole adult marmoset brain. Thus, FAST provides new opportunities for global approaches that will allow for a better understanding of brain systems in multiple animal models and in human diseases. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Adapting CALIPSO Climate Measurements for Near Real Time Analyses and Forecasting

    NASA Technical Reports Server (NTRS)

    Vaughan, Mark A.; Trepte, Charles R.; Winker, David M.; Avery, Melody A.; Campbell, James; Hoff, Ray; Young, Stuart; Getzewich, Brian J.; Tackett, Jason L.; Kar, Jayanta

    2011-01-01

    The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission was originally conceived and designed as a climate measurements mission, with considerable latency between data acquisition and the release of the level 1 and level 2 data products. However, the unique nature of the CALIPSO lidar backscatter profiles quickly led to the qualitative use of CALIPSO's near-real-time (i.e., "expedited") lidar data imagery in several different forecasting applications. To enable quantitative use of the near-real-time analyses, the CALIPSO project recently expanded its expedited data catalog to include all of the standard level 1 and level 2 lidar data products. Also included is a new cloud-cleared level 1.5 profile product developed for use by operational forecast centers for verification of aerosol predictions. This paper describes the architecture and content of the CALIPSO expedited data products. The fidelity and accuracy of the expedited products are assessed via comparisons to the standard CALIPSO data products.

  12. Phase-Contrast Hounsfield Units of Fixated and Non-Fixated Soft-Tissue Samples

    PubMed Central

    Willner, Marian; Fior, Gabriel; Marschner, Mathias; Birnbacher, Lorenz; Schock, Jonathan; Braun, Christian; Fingerle, Alexander A.; Noël, Peter B.; Rummeny, Ernst J.; Pfeiffer, Franz; Herzen, Julia

    2015-01-01

    X-ray phase-contrast imaging is a novel technology that achieves high soft-tissue contrast. Although its clinical impact is still under investigation, the technique may potentially improve clinical diagnostics. In conventional attenuation-based X-ray computed tomography, radiological diagnostics are quantified by Hounsfield units. Corresponding Hounsfield units for phase-contrast imaging have recently been introduced, enabling a setup-independent comparison and standardized interpretation of imaging results. Thus far, the experimental values of only a few tissue types have been reported, and these values were determined from fixated tissue samples. This study presents phase-contrast Hounsfield units for various types of non-fixated human soft tissues. A large variety of tissue specimens ranging from adipose, muscle and connective tissues to liver, kidney and pancreas tissues were imaged by a grating interferometer with a rotating-anode X-ray tube and a photon-counting detector. Furthermore, we investigated the effects of formalin fixation on the quantitative phase-contrast imaging results. PMID:26322638

  13. Magnon spectrum of the helimagnetic insulator Cu2OSeO3

    DOE PAGES

    Portnichenko, P. Y.; Romhányi, J.; Onykiienko, Y. A.; ...

    2016-02-25

    We report that complex low-temperature-ordered states in chiral magnets are typically governed by a competition between multiple magnetic interactions. The chiral-lattice multiferroic Cu2OSeO3 became the first insulating helimagnetic material in which a long-range order of topologically stable spin vortices known as skyrmions was established. Here we employ state-of-the-art inelastic neutron scattering to comprehend the full three-dimensional spin-excitation spectrum of Cu2OSeO3 over a broad range of energies. Distinct types of high- and low-energy dispersive magnon modes separated by an extensive energy gap are observed in excellent agreement with the previously suggested microscopic theory based on a model of entangled Cu4 tetrahedra. The comparison of our neutron spectroscopy data with model spin-dynamical calculations based on these theoretical proposals enables an accurate quantitative verification of the fundamental magnetic interactions in Cu2OSeO3 that are essential for understanding its abundant low-temperature magnetically ordered phases.

  14. Magnon spectrum of the helimagnetic insulator Cu2OSeO3

    PubMed Central

    Portnichenko, P. Y.; Romhányi, J.; Onykiienko, Y. A.; Henschel, A.; Schmidt, M.; Cameron, A. S.; Surmach, M. A.; Lim, J. A.; Park, J. T.; Schneidewind, A.; Abernathy, D. L.; Rosner, H.; van den Brink, Jeroen; Inosov, D. S.

    2016-01-01

    Complex low-temperature-ordered states in chiral magnets are typically governed by a competition between multiple magnetic interactions. The chiral-lattice multiferroic Cu2OSeO3 became the first insulating helimagnetic material in which a long-range order of topologically stable spin vortices known as skyrmions was established. Here we employ state-of-the-art inelastic neutron scattering to comprehend the full three-dimensional spin-excitation spectrum of Cu2OSeO3 over a broad range of energies. Distinct types of high- and low-energy dispersive magnon modes separated by an extensive energy gap are observed in excellent agreement with the previously suggested microscopic theory based on a model of entangled Cu4 tetrahedra. The comparison of our neutron spectroscopy data with model spin-dynamical calculations based on these theoretical proposals enables an accurate quantitative verification of the fundamental magnetic interactions in Cu2OSeO3 that are essential for understanding its abundant low-temperature magnetically ordered phases. PMID:26911567

  15. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region

    PubMed Central

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-01

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytical Hierarchy Process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity enables analysis of the causes of that vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management. PMID:29342852
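
    The ESA aggregation described above can be illustrated with a minimal sketch, assuming min-max-normalized sub-indices; the weights and city values below are hypothetical stand-ins for the AHP- and fuzzy-evaluation-derived quantities in the study.

```python
# Hypothetical ESA composite index; weights and sub-index values are
# illustrative stand-ins, not the study's AHP-derived quantities.
def vulnerability(exposure, sensitivity, adaptive_capacity,
                  weights=(0.4, 0.3, 0.3)):
    """Weighted ESA score on [0, 1] inputs: exposure and sensitivity raise
    vulnerability, adaptive capacity lowers it."""
    we, ws, wa = weights
    return we * exposure + ws * sensitivity + wa * (1.0 - adaptive_capacity)

# Three hypothetical cities: (exposure, sensitivity, adaptive capacity).
cities = {"A": (0.8, 0.6, 0.3), "B": (0.4, 0.5, 0.7), "C": (0.6, 0.9, 0.2)}
scores = {name: vulnerability(*esa) for name, esa in cities.items()}
ranked = sorted(scores, key=scores.get, reverse=True)  # most vulnerable first
```

    Higher exposure and sensitivity push the score up while higher adaptive capacity pulls it down, mirroring the vulnerability logic of the ESA framework.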

  16. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources.

    PubMed

    Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.

  17. Phase-contrast Hounsfield units of fixated and non-fixated soft-tissue samples

    DOE PAGES

    Willner, Marian; Fior, Gabriel; Marschner, Mathias; ...

    2015-08-31

    X-ray phase-contrast imaging is a novel technology that achieves high soft-tissue contrast. Although its clinical impact is still under investigation, the technique may potentially improve clinical diagnostics. In conventional attenuation-based X-ray computed tomography, radiological diagnostics are quantified by Hounsfield units. Corresponding Hounsfield units for phase-contrast imaging have recently been introduced, enabling a setup-independent comparison and standardized interpretation of imaging results. Thus far, the experimental values of only a few tissue types have been reported, and these values were determined from fixated tissue samples. This study presents phase-contrast Hounsfield units for various types of non-fixated human soft tissues. A large variety of tissue specimens ranging from adipose, muscle and connective tissues to liver, kidney and pancreas tissues were imaged by a grating interferometer with a rotating-anode X-ray tube and a photon-counting detector. In addition, we investigated the effects of formalin fixation on the quantitative phase-contrast imaging results.

  18. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely

    PubMed Central

    Widaman, Keith F.; Grimm, Kevin J.; Early, Dawnté R.; Robins, Richard W.; Conger, Rand D.

    2013-01-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group. PMID:24019738
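
    A minimal sketch of the random-deviate solution, assuming a NumPy-based workflow; the function name and the zero-loading note are illustrative, not the authors' implementation.

```python
# Sketch (assumed implementation details) of the random-deviate idea: a
# manifest variable that is missing completely in one group is filled with
# standard-normal pseudo-random deviates so that every group presents the
# same set of variables to the modeling program; the group's model must
# then be specified so the placeholder column carries no information
# (e.g., its factor loading fixed at zero).
import numpy as np

rng = np.random.default_rng(0)

def fill_missing_variables(data, missing_cols):
    """Return a copy of an (n_obs, n_vars) matrix with completely missing
    columns replaced by pseudo-random normal deviates."""
    filled = data.copy()
    for j in missing_cols:
        filled[:, j] = rng.standard_normal(data.shape[0])
    return filled

group = np.full((100, 4), np.nan)
group[:, :3] = rng.standard_normal((100, 3))  # variables 0-2 observed
filled = fill_missing_variables(group, missing_cols=[3])
```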

  19. Magnon spectrum of the helimagnetic insulator Cu2OSeO3.

    PubMed

    Portnichenko, P Y; Romhányi, J; Onykiienko, Y A; Henschel, A; Schmidt, M; Cameron, A S; Surmach, M A; Lim, J A; Park, J T; Schneidewind, A; Abernathy, D L; Rosner, H; van den Brink, Jeroen; Inosov, D S

    2016-02-25

    Complex low-temperature-ordered states in chiral magnets are typically governed by a competition between multiple magnetic interactions. The chiral-lattice multiferroic Cu2OSeO3 became the first insulating helimagnetic material in which a long-range order of topologically stable spin vortices known as skyrmions was established. Here we employ state-of-the-art inelastic neutron scattering to comprehend the full three-dimensional spin-excitation spectrum of Cu2OSeO3 over a broad range of energies. Distinct types of high- and low-energy dispersive magnon modes separated by an extensive energy gap are observed in excellent agreement with the previously suggested microscopic theory based on a model of entangled Cu4 tetrahedra. The comparison of our neutron spectroscopy data with model spin-dynamical calculations based on these theoretical proposals enables an accurate quantitative verification of the fundamental magnetic interactions in Cu2OSeO3 that are essential for understanding its abundant low-temperature magnetically ordered phases.

  20. Automated frame selection process for high-resolution microendoscopy

    NASA Astrophysics Data System (ADS)

    Ishijima, Ayumu; Schwarz, Richard A.; Shin, Dongsuk; Mondrik, Sharon; Vigneswaran, Nadarajah; Gillenwater, Ann M.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2015-04-01

    We developed an automated frame selection algorithm for high-resolution microendoscopy video sequences. The algorithm rapidly selects a representative frame with minimal motion artifact from a short video sequence, enabling fully automated image analysis at the point-of-care. The algorithm was evaluated by quantitative comparison of diagnostically relevant image features and diagnostic classification results obtained using automated frame selection versus manual frame selection. A data set consisting of video sequences collected in vivo from 100 oral sites and 167 esophageal sites was used in the analysis. The area under the receiver operating characteristic curve was 0.78 (automated selection) versus 0.82 (manual selection) for oral sites, and 0.93 (automated selection) versus 0.92 (manual selection) for esophageal sites. The implementation of fully automated high-resolution microendoscopy at the point-of-care has the potential to reduce the number of biopsies needed for accurate diagnosis of precancer and cancer in low-resource settings where there may be limited infrastructure and personnel for standard histologic analysis.
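
    A hedged sketch of frame selection: here each candidate frame is scored by its summed absolute difference from its neighbors, a simple motion-artifact proxy that stands in for the paper's actual criterion.

```python
# Hypothetical frame-selection sketch: score frames by summed absolute
# difference from both neighbors and keep the frame with the lowest score
# (a simple motion proxy, not necessarily the paper's criterion).
def select_frame(frames):
    """frames: list of 2-D grayscale images as nested lists of numbers."""
    def sad(a, b):  # sum of absolute differences between two frames
        return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    best_idx, best_score = 0, float("inf")
    for i in range(1, len(frames) - 1):  # interior frames have two neighbors
        score = sad(frames[i], frames[i - 1]) + sad(frames[i], frames[i + 1])
        if score < best_score:
            best_idx, best_score = i, score
    return best_idx

# Toy sequence: frame 3 barely differs from its neighbors (least motion).
frames = [
    [[0, 0], [0, 0]],
    [[9, 9], [9, 9]],
    [[5, 5], [5, 5]],
    [[5, 5], [5, 6]],
    [[5, 5], [5, 5]],
]
best = select_frame(frames)  # -> 3
```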

  1. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region.

    PubMed

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-13

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytical Hierarchy Process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity enables analysis of the causes of that vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management.

  2. Development of soft scaffolding strategy to improve student’s creative thinking ability in physics

    NASA Astrophysics Data System (ADS)

    Nurulsari, Novinta; Abdurrahman; Suyatna, Agus

    2017-11-01

    Student’s creative thinking ability in physics learning can be developed through learning experiences. However, many students fail to gain such experience because of the limited role teachers play in assisting students when they face learning difficulties. In this study, a soft scaffolding strategy was developed to improve students’ creative thinking ability in physics, particularly on optical instruments. Both qualitative and quantitative methods were used. The developed strategy, called the 6E Soft Scaffolding Strategy, comprises Explore real-life problems, Engage students with web technology, Enable experiment using analogies, Elaborate data through multiple representations, Encourage questioning, and Ensure the feedback. The strategy was applied to 60 students in a secondary school through cooperative learning. As a comparison, conventional strategies were applied to 60 students in the same school and grade. The results showed that the soft scaffolding strategy was effective in improving students’ creative thinking ability.

  3. Development of a Rational Design Space for Optimizing Mixing Conditions for Formation of Adhesive Mixtures for Dry-Powder Inhaler Formulations.

    PubMed

    Sarkar, Saurabh; Minatovicz, Bruna; Thalberg, Kyrre; Chaudhuri, Bodhisattwa

    2017-01-01

    The purpose of the present study was to develop guidance toward rational choice of blenders and processing conditions to make robust and high performing adhesive mixtures for dry-powder inhalers and to develop quantitative experimental approaches for optimizing the process. Mixing behavior of carrier (LH100) and AstraZeneca fine lactose in high-shear and low-shear double cone blenders was systematically investigated. Process variables impacting the mixing performance were evaluated for both blenders. The performance of the blenders with respect to the mixing time, press-on forces, static charging, and abrasion of carrier fines was monitored, and for some of the parameters, distinct differences could be detected. A comparison table is presented, which can be used as a guidance to enable rational choice of blender and process parameters based on the user requirements. Segregation of adhesive mixtures during hopper discharge was also investigated.

  4. Communication: Development of standing evanescent-wave fluorescence correlation spectroscopy and its application to the lateral diffusion of lipids in a supported lipid bilayer

    NASA Astrophysics Data System (ADS)

    Otosu, Takuhiro; Yamaguchi, Shoichi

    2017-07-01

    We present standing evanescent-wave fluorescence correlation spectroscopy (SEW-FCS). This technique utilizes the interference of two evanescent waves which generates a standing evanescent-wave. Fringe-pattern illumination created by a standing evanescent-wave enables us to measure the diffusion coefficients of molecules with a super-resolution corresponding to one fringe width. Because the fringe width can be reliably estimated by a simple procedure, utilization of fringes is beneficial to quantitatively analyze the slow diffusion of molecules in a supported lipid bilayer (SLB), a model biomembrane formed on a solid substrate, with the timescale relevant for reliable FCS analysis. Furthermore, comparison of the data between SEW-FCS and conventional total-internal reflection FCS, which can also be performed by the SEW-FCS instrument, effectively eliminates the artifact due to afterpulsing of the photodiode detector. The versatility of SEW-FCS is demonstrated by its application to various SLBs.

  5. The developmental proteome of Drosophila melanogaster

    PubMed Central

    Casas-Vila, Nuria; Bluhm, Alina; Sayols, Sergi; Dinges, Nadja; Dejung, Mario; Altenhein, Tina; Kappei, Dennis; Altenhein, Benjamin; Roignant, Jean-Yves; Butter, Falk

    2017-01-01

    Drosophila melanogaster is a widely used genetic model organism in developmental biology. While this model organism has been intensively studied at the RNA level, a comprehensive proteomic study covering the complete life cycle is still missing. Here, we apply label-free quantitative proteomics to explore proteome remodeling across Drosophila’s life cycle, resulting in 7952 proteins, and provide a high temporal-resolution embryogenesis proteome of 5458 proteins. Our proteome data enabled us to monitor isoform-specific expression of 34 genes during development, to identify the pseudogene Cyp9f3Ψ as a protein-coding gene, and to obtain evidence of 268 small proteins. Moreover, the comparison with available transcriptomic data uncovered examples of poor correlation between mRNA and protein, underscoring the importance of proteomics to study developmental progression. Data integration of our embryogenesis proteome with tissue-specific data revealed spatial and temporal information for further functional studies of yet uncharacterized proteins. Overall, our high-resolution proteomes provide a powerful resource and can be explored in detail in our interactive web interface. PMID:28381612

  6. Temperature dependence of pre-edge features in Ti K-edge XANES spectra for ATiO₃ (A = Ca and Sr), A₂TiO₄ (A = Mg and Fe), TiO₂ rutile and TiO₂ anatase.

    PubMed

    Hiratoko, Tatsuya; Yoshiasa, Akira; Nakatani, Tomotaka; Okube, Maki; Nakatsuka, Akihiko; Sugiyama, Kazumasa

    2013-07-01

    XANES (X-ray absorption near-edge structure) spectra of the Ti K-edges of ATiO3 (A = Ca and Sr), A2TiO4 (A = Mg and Fe), TiO2 rutile and TiO2 anatase were measured in the temperature range 20-900 K. Ti atoms in all samples were located in TiO6 octahedral sites. The absorption intensity invariant point (AIIP) was found to lie between the pre-edge and post-edge regions. Above the AIIP, amplitudes were damped with increasing temperature due to Debye-Waller factor effects. Amplitudes in the pre-edge region increased normally with temperature through thermal vibration. Using the AIIP intensity as a standard point enables a quantitative comparison of pre-edge peak intensities in various titanium compounds over a wide temperature range.
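
    The AIIP-based normalization can be sketched as follows; the intensity values are hypothetical and merely show how dividing by the AIIP intensity puts pre-edge peaks from different spectra on a common scale.

```python
# Hypothetical intensities illustrating AIIP-based normalization; the
# numbers are invented, not taken from the measured spectra.
def normalized_pre_edge(pre_edge_height, aiip_intensity):
    """Express a pre-edge peak height relative to the AIIP intensity."""
    return pre_edge_height / aiip_intensity

# Same (hypothetical) sample at two temperatures: the AIIP intensity is
# invariant, so the normalized values are directly comparable.
spec_20k = {"pre_edge": 0.12, "aiip": 0.60}
spec_900k = {"pre_edge": 0.18, "aiip": 0.60}
r20 = normalized_pre_edge(spec_20k["pre_edge"], spec_20k["aiip"])
r900 = normalized_pre_edge(spec_900k["pre_edge"], spec_900k["aiip"])
# r900 > r20, consistent with pre-edge amplitudes growing with temperature.
```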

  7. Recommendations for Benchmarking Preclinical Studies of Nanomedicines.

    PubMed

    Dawidczyk, Charlene M; Russell, Luisa M; Searson, Peter C

    2015-10-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small-molecule drug therapy for cancer and to achieve both therapeutic and diagnostic functions in the same platform. Preclinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of preclinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of preclinical trials and propose a protocol for benchmarking that we recommend be included in in vivo preclinical studies of drug-delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies.

  8. Perspective: Recommendations for benchmarking pre-clinical studies of nanomedicines

    PubMed Central

    Dawidczyk, Charlene M.; Russell, Luisa M.; Searson, Peter C.

    2015-01-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small molecule drug therapy for cancer, and to achieve both therapeutic and diagnostic functions in the same platform. Pre-clinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of pre-clinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of pre-clinical trials and propose a protocol for benchmarking that we recommend be included in in vivo pre-clinical studies of drug delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. PMID:26249177

  9. Nanoscale Magnetism in Next Generation Magnetic Nanoparticles

    DTIC Science & Technology

    2018-03-17

    as dextran-coated SPIONs were studied. From the measured T1 and T2 relaxation times, a new method called Quantitative Ultra-Short Time-to-Echo...angiograms with high clarity and definition, and enabled quantitative MRI in biological samples. At UCL, the work included (i) fabricating multi-element...distribution unlimited. I. Introduction Compared to flat biosensor devices, 3D engineered biosensors achieve more intimate and conformal interfaces with cells

  10. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    EPA Science Inventory

    Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  11. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    NASA Astrophysics Data System (ADS)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr05839b

  12. The circadian rhythm of core temperature: effects of physical activity and aging.

    PubMed

    Weinert, Dietmar; Waterhouse, Jim

    2007-02-28

    The circadian rhythm of core temperature depends upon several interacting rhythms, of both endogenous and exogenous origin, but an understanding of the process requires these two components to be separated. Constant routines remove the exogenous (masking) component at source, but they are severely limited in their application. By contrast, several purification methods have successfully reduced the masking component of overt circadian rhythms measured in field circumstances. One important, but incidental, outcome from these methods is that they enable a quantitative estimate of masking effects to be obtained. It has been shown that these effects of activity upon the temperature rhythm show circadian rhythmicity, and more detailed investigations of this have aided our understanding of thermoregulation and the genesis of the circadian rhythm of core temperature itself. The observed circadian rhythm of body temperature varies with age; in comparison with adults, it is poorly developed in the neonate and deteriorates in the aged subject. Comparing masked and purified data enables the reasons for these differences--whether due to the body clock, the effector pathways or organs, or irregularities due to the individual's lifestyle--to begin to be understood. Such investigations stress the immaturity of the circadian rhythm in the human neonate and its deterioration in elderly compared with younger subjects, but they also indicate the robustness of the body clock itself into advanced age, at least in mice.

  13. Mapping the Binding Interface of VEGF and a Monoclonal Antibody Fab-1 Fragment with Fast Photochemical Oxidation of Proteins (FPOP) and Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Wecksler, Aaron T.; Molina, Patricia; Deperalta, Galahad; Gross, Michael L.

    2017-05-01

    We previously analyzed the Fab-1:VEGF (vascular endothelial growth factor) system described in this work, with both native top-down mass spectrometry and bottom-up mass spectrometry (carboxyl-group or GEE footprinting) techniques. This work continues bottom-up mass spectrometry analysis using a fast photochemical oxidation of proteins (FPOP) platform to map the solution binding interface of VEGF and a fragment antigen binding region of an antibody (Fab-1). In this study, we use FPOP to compare the changes in solvent accessibility by quantitating the extent of oxidative modification in the unbound versus bound states. Determining the changes in solvent accessibility enables the inference of the protein binding sites (epitope and paratopes) and a comparison to the previously published Fab-1:VEGF crystal structure, adding to the top-down and bottom-up data. Using this method, we investigated peptide-level and residue-level changes in solvent accessibility between the unbound proteins and bound complex. Mapping these data onto the Fab-1:VEGF crystal structure enabled successful characterization of both the binding region and regions of remote conformation changes. These data, coupled with our previous higher order structure (HOS) studies, demonstrate the value of a comprehensive toolbox of methods for identifying the putative epitopes and paratopes for biotherapeutic antibodies.

  14. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

    A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents for quantitation of cytomegalovirus (CMV) were tested on two digital PCR systems (Bio-Rad and RainDance). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
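
    Digital PCR quantitation itself rests on Poisson statistics: because several copies can co-occupy one partition, the mean copies per partition is lambda = -ln(1 - f), where f is the fraction of positive partitions. The partition counts and volume below are illustrative, not from the study.

```python
# Standard Poisson correction for digital PCR; the example partition
# counts and the 0.85 nL partition volume are illustrative values.
import math

def copies_per_microliter(n_positive, n_total, partition_volume_ul):
    """Convert positive-partition counts to a concentration estimate."""
    lam = -math.log(1.0 - n_positive / n_total)  # mean copies per partition
    return lam / partition_volume_ul

# Illustrative run: 5,000 of 20,000 partitions positive, 0.00085 uL each.
conc = copies_per_microliter(5000, 20000, 0.00085)  # roughly 338 copies/uL
```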

  15. In-depth Qualitative and Quantitative Profiling of Tyrosine Phosphorylation Using a Combination of Phosphopeptide Immunoaffinity Purification and Stable Isotope Dimethyl Labeling*

    PubMed Central

    Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.

    2010-01-01

    Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. 
Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167

  16. Comparison of propidium monoazide-quantitative PCR and reverse transcription quantitative PCR for viability detection of fresh Cryptosporidium oocysts following disinfection and after long-term storage in water samples

    EPA Science Inventory

    Purified oocysts of Cryptosporidium parvum were used to evaluate the applicability of two quantitative PCR (qPCR) viability detection methods in raw surface water and disinfection-treated water. Propidium monoazide-qPCR targeting the hsp70 gene was compared to reverse transcription (RT)-...

  17. Acoustic Facies Analysis of Side-Scan Sonar Data

    NASA Astrophysics Data System (ADS)

    Dwan, Fa Shu

    Acoustic facies analysis methods have allowed the generation of system-independent values for the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU) ^2 side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area contribution, and grazing angle effects. In the sonar equation, backscattering strength is the sonar parameter which is directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor signal returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU) ^2 data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values have been compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB.
A modified Lambert relationship, which consists of both intercept and slope terms, appears to represent the BSS versus grazing angle profiles better based on chi^2 testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing angle span of the BSS profiles.
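
    The fitting described above can be sketched numerically. Below is a minimal Python illustration (assumed angles and constants, synthetic noise-free data; not the author's GLORIA/(TAMU)^2 processing code) of fitting a modified Lambert relationship, with both intercept and slope terms, to backscattering strength (BSS) versus grazing angle by least squares.

```python
import numpy as np

# Hypothetical sketch: fit the modified Lambert relationship
#   BSS(theta) = a + b * 10*log10(sin(theta))
# to backscattering strength (dB) versus grazing angle (deg).
# With b = 2 this reduces to classical Lambert's Law,
#   BSS(theta) = mu + 20*log10(sin(theta)).

def fit_modified_lambert(theta_deg, bss_db):
    """Least-squares estimate of intercept a and slope b."""
    x = 10.0 * np.log10(np.sin(np.radians(theta_deg)))
    A = np.column_stack([np.ones_like(x), x])
    (a, b), *_ = np.linalg.lstsq(A, bss_db, rcond=None)
    return a, b

# Synthetic profile: intercept -30 dB (inside the -26 to -36 dB
# Lambertian-constant range reported above), Lambertian slope b = 2.
theta = np.linspace(10.0, 80.0, 50)
bss = -30.0 + 2.0 * 10.0 * np.log10(np.sin(np.radians(theta)))
a_hat, b_hat = fit_modified_lambert(theta, bss)
print(round(a_hat, 3), round(b_hat, 3))
```

    Fixing the slope at 2 recovers classical Lambert's Law; letting it vary is what permits the chi^2-based comparison of candidate angular-response functions.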

  18. Discrepancies between qualitative and quantitative evaluation of randomised controlled trial results: achieving clarity through mixed methods triangulation.

    PubMed

    Tonkin-Crine, Sarah; Anthierens, Sibyl; Hood, Kerenza; Yardley, Lucy; Cals, Jochen W L; Francis, Nick A; Coenen, Samuel; van der Velden, Alike W; Godycki-Cwirko, Maciek; Llor, Carl; Butler, Chris C; Verheij, Theo J M; Goossens, Herman; Little, Paul

    2016-05-12

    Mixed methods are commonly used in health services research; however, data are not often integrated to explore complementarity of findings. A triangulation protocol is one approach to integrating such data. A retrospective triangulation protocol was carried out on mixed methods data collected as part of a process evaluation of a trial. The multi-country randomised controlled trial found that a web-based training in communication skills (including use of a patient booklet) and the use of a C-reactive protein (CRP) point-of-care test decreased antibiotic prescribing by general practitioners (GPs) for acute cough. The process evaluation investigated GPs' and patients' experiences of taking part in the trial. Three analysts independently compared findings across four data sets: qualitative data collected via semi-structured interviews with (1) 62 patients and (2) 66 GPs and quantitative data collected via questionnaires with (3) 2886 patients and (4) 346 GPs. Pairwise comparisons were made between data sets and were categorised as agreement, partial agreement, dissonance or silence. Three instances of dissonance occurred in 39 independent findings. GPs and patients reported different views on the use of a CRP test. GPs felt that the test was useful in convincing patients to accept a no-antibiotic decision, but patient data suggested that this was unnecessary if a full explanation was given. Whilst qualitative data indicated all patients were generally satisfied with their consultation, quantitative data indicated highest levels of satisfaction for those receiving a detailed explanation from their GP with a booklet giving advice on self-care. Both qualitative and quantitative data sets indicated higher patient enablement for those in the communication groups who had received a booklet. Use of CRP tests does not appear to engage patients or influence illness perceptions and its effect is more centred on changing clinician behaviour.
Communication skills and the patient booklet were relevant and useful for all patients and associated with increased patient satisfaction. A triangulation protocol to integrate qualitative and quantitative data can reveal findings that need further interpretation and also highlight areas of dissonance that lead to a deeper insight than separate analyses.
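
    The bookkeeping behind such a triangulation protocol can be sketched in a few lines of Python. The findings, data-set names, and stances below are invented for illustration, and the graded "partial agreement" category is omitted for brevity:

```python
from collections import Counter
from itertools import combinations

# Invented findings matrix: finding -> {data set: stance}, where
# +1 = supports, -1 = contradicts, None = that data set is silent.
findings = {
    "CRP test convinces patients":  {"GP_qual": +1, "Pt_qual": -1},
    "booklet increases enablement": {"Pt_qual": +1, "Pt_quant": +1},
    "patients satisfied":           {"Pt_qual": +1, "Pt_quant": +1,
                                     "GP_qual": None},
}

def code_pair(a, b):
    # Graded "partial agreement" would require non-binary stances;
    # omitted here to keep the sketch minimal.
    if a is None or b is None:
        return "silence"
    return "agreement" if a == b else "dissonance"

tally = Counter()
for stances in findings.values():
    for (_, a), (_, b) in combinations(stances.items(), 2):
        tally[code_pair(a, b)] += 1
print(dict(tally))
```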

  19. Quantitative Profiling of Feruloylated Arabinoxylan Side-Chains from Graminaceous Cell Walls

    PubMed Central

    Schendel, Rachel R.; Meyer, Marleen R.; Bunzel, Mirko

    2016-01-01

    Graminaceous arabinoxylans are distinguished by decoration with feruloylated monosaccharidic and oligosaccharidic side-chains. Although it is hypothesized that structural complexity and abundance of these feruloylated arabinoxylan side-chains may contribute, among other factors, to resistance of plant cell walls to enzymatic degradation, quantitative profiling approaches for these structural units in plant cell wall materials have not been described yet. Here we report the development and application of a rapid and robust method enabling the quantitative comparison of feruloylated side-chain profiles in cell wall materials following mildly acidic hydrolysis, C18-solid phase extraction (SPE), reduction under aprotic conditions, and liquid chromatography with diode-array detection/mass spectrometry (LC-DAD/MS) separation and detection. The method was applied to the insoluble fiber/cell wall materials isolated from 12 whole grains: wild rice (Zizania aquatica L.), long-grain brown rice (Oryza sativa L.), rye (Secale cereale L.), kamut (Triticum turanicum Jakubz.), wheat (Triticum aestivum L.), spelt (Triticum spelta L.), intermediate wheatgrass (Thinopyrum intermedium), maize (Zea mays L.), popcorn (Zea mays L. var. everta), oat (Avena sativa L.) (dehulled), barley (Hordeum vulgare L.) (dehulled), and proso millet (Panicum miliaceum L.). Between 51 and 96% of the total esterified monomeric ferulates were represented in the quantified compounds captured in the feruloylated side-chain profiles, which confirms the significance of these structures to the global arabinoxylan structure in terms of quantity. The method provided new structural insights into cereal grain arabinoxylans, in particular, that the structural moiety α-l-galactopyranosyl-(1→2)-β-d-xylopyranosyl-(1→2)-5-O-trans-feruloyl-l-arabinofuranose (FAXG), which had previously only been described in maize, is ubiquitous to cereal grains. PMID:26834763

  20. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
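
    The two model-free metrics favoured above are straightforward to compute. A hypothetical Python sketch (synthetic signal curves, not data from the study):

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal integration (kept explicit for portability)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def auc_ratio(t, lactate, pyruvate):
    """Lactate-to-pyruvate area-under-the-curve ratio."""
    return trapezoid(lactate, t) / trapezoid(pyruvate, t)

def time_to_peak(t, lactate):
    """Time at which the lactate signal is maximal."""
    return t[np.argmax(lactate)]

# Synthetic stand-in curves (seconds); shapes are illustrative only.
t = np.linspace(0.0, 60.0, 121)
pyr = np.exp(-t / 30.0)               # decaying pyruvate signal
lac = (t / 20.0) * np.exp(-t / 20.0)  # rising-then-falling lactate

print(round(auc_ratio(t, lac, pyr), 3), time_to_peak(t, lac))
```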

  1. Characterization of E 471 food emulsifiers by high-performance thin-layer chromatography-fluorescence detection.

    PubMed

    Oellig, Claudia; Brändle, Klara; Schwack, Wolfgang

    2018-07-13

    Mono- and diacylglycerol (MAG and DAG) emulsifiers, also known as food additive E 471, are widely used to adjust techno-functional properties in various foods. Besides MAGs and DAGs, E 471 emulsifiers additionally comprise different amounts of triacylglycerols (TAGs) and free fatty acids (FFAs). MAGs, DAGs, TAGs and FFAs are generally determined by high-performance liquid chromatography (HPLC) or gas chromatography (GC) coupled to mass selective detection, analyzing the individual representatives of the lipid classes. In this work we present a rapid and sensitive method for the determination of MAGs, DAGs, TAGs and FFAs in E 471 emulsifiers by high-performance thin-layer chromatography with fluorescence detection (HPTLC-FLD), including a response factor system for quantitation. Samples were simply dissolved and diluted with t-butyl methyl ether before a two-fold development was performed on primuline pre-impregnated LiChrospher silica gel plates with diethyl ether and n-pentane/n-hexane/diethyl ether (52:20:28, v/v/v) as the mobile phases to 18 and 75 mm, respectively. For quantitation, the plate was scanned in the fluorescence mode at UV 366/>400 nm, when the cumulative signal for each lipid class was used. Calibration was done with 1,2-distearin and amounts of lipid classes were calculated with response factors and expressed as monostearin, distearin, tristearin and stearic acid. Limits of detection and quantitation were 1 and 4 ng/zone, respectively, for 1,2-distearin. Thus, the HPTLC-FLD approach represents a simple, rapid and convenient screening alternative to HPLC and GC analysis of the individual compounds. Visual detection additionally enables an easy characterization and the direct comparison of emulsifiers through the lipid class pattern, when utilized as a fingerprint. Copyright © 2018 Elsevier B.V. All rights reserved.
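
    A response-factor system of the kind described can be sketched as follows (Python; the calibration slope and response factors are invented placeholders, not values from the paper):

```python
# Invented calibration slope (signal per ng of 1,2-distearin) and
# hypothetical per-class response factors relative to the standard.
CAL_SLOPE = 120.0
RESPONSE_FACTORS = {"MAG": 1.15, "DAG": 1.00, "TAG": 0.85, "FFA": 1.30}

def quantify(signals):
    """Map {lipid class: cumulative peak signal} to amounts in ng."""
    return {cls: RESPONSE_FACTORS[cls] * s / CAL_SLOPE
            for cls, s in signals.items()}

amounts = quantify({"MAG": 6000.0, "DAG": 4800.0,
                    "TAG": 2550.0, "FFA": 1300.0})
print({k: round(v, 2) for k, v in amounts.items()})
```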

  2. Whole-body PET parametric imaging employing direct 4D nested reconstruction and a generalized non-linear Patlak model

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Nicolas A.; Rahmim, Arman

    2014-03-01

    Graphical analysis is employed in the research setting to provide quantitative estimation of PET tracer kinetics from dynamic images at a single bed. Recently, we proposed a multi-bed dynamic acquisition framework enabling clinically feasible whole-body parametric PET imaging by employing post-reconstruction parameter estimation. In addition, by incorporating linear Patlak modeling within the system matrix, we enabled direct 4D reconstruction in order to effectively circumvent noise amplification in dynamic whole-body imaging. However, direct 4D Patlak reconstruction exhibits a relatively slow convergence due to the presence of non-sparse spatial correlations in temporal kinetic analysis. In addition, the standard Patlak model does not account for reversible uptake, thus underestimating the influx rate Ki. We have developed a novel whole-body PET parametric reconstruction framework in the STIR platform, a widely employed open-source reconstruction toolkit, a) enabling accelerated convergence of direct 4D multi-bed reconstruction, by employing a nested algorithm to decouple the temporal parameter estimation from the spatial image update process, and b) enhancing the quantitative performance particularly in regions with reversible uptake, by pursuing a non-linear generalized Patlak 4D nested reconstruction algorithm. A set of published kinetic parameters and the XCAT phantom were employed for the simulation of dynamic multi-bed acquisitions. Quantitative analysis on the Ki images demonstrated considerable acceleration in the convergence of the nested 4D whole-body Patlak algorithm. In addition, our simulated and patient whole-body data in the post-reconstruction domain indicated the quantitative benefits of our extended generalized Patlak 4D nested reconstruction for tumor diagnosis and treatment response monitoring.
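
    For readers unfamiliar with graphical analysis, the standard linear Patlak step that the direct 4D method builds on can be sketched as below (an illustrative Python implementation with synthetic data, not the STIR code; the generalized model additionally includes an efflux term that makes the estimation non-linear):

```python
import numpy as np

def cumtrapz0(y, x):
    """Cumulative trapezoidal integral starting at zero."""
    return np.concatenate([[0.0],
                           np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))])

def patlak_fit(t, tissue, plasma, t_star=10.0):
    """Linear Patlak plot: slope Ki (influx rate), intercept V."""
    x = cumtrapz0(plasma, t) / plasma
    y = tissue / plasma
    mask = t >= t_star            # use only the quasi-linear tail
    Ki, V = np.polyfit(x[mask], y[mask], 1)
    return Ki, V

# Synthetic data generated from known parameters Ki = 0.05, V = 0.3.
t = np.linspace(0.0, 60.0, 241)              # minutes
plasma = np.exp(-t / 40.0) + 0.1             # illustrative input function
tissue = 0.05 * cumtrapz0(plasma, t) + 0.3 * plasma
Ki, V = patlak_fit(t, tissue, plasma)
print(round(Ki, 4), round(V, 4))
```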

  3. A quantitative comparison of leading-edge vortices in incompressible and supersonic flows

    DOT National Transportation Integrated Search

    2002-01-14

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...

  4. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
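
    A minimal PCA of a samples-by-spots abundance matrix can be written with NumPy alone. The 8-sample, 5-spot matrix below is invented: two groups of four biological replicates differ on one spot, and the first principal component should capture that group signal:

```python
import numpy as np

# Invented 8-sample x 5-spot abundance matrix: two groups of four
# biological replicates differing by an offset on spot 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))
X[:, 0] += np.repeat([0.0, 8.0], 4)

Xc = X - X.mean(axis=0)                 # mean-center each spot
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                          # sample coordinates on the PCs
explained = S**2 / np.sum(S**2)         # fraction of variance per PC

# PC1 should dominate the variance and separate the two groups
# (opposite-signed group means along PC1).
print(explained[0] > 0.5,
      scores[:4, 0].mean() * scores[4:, 0].mean() < 0)
```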

  5. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE PAGES

    Verbeke, Jerome M.

    2016-03-09

    From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.
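
    As a toy analogue of the sequential Bayesian processor, the sketch below updates a one-dimensional grid posterior over a Poisson count rate one observation at a time and extracts a highest-density credible interval (Python; the real processor operates on fissile mass and multiplication, and all numbers here are invented):

```python
import numpy as np

def sequential_posterior(counts, rate_grid):
    """Grid posterior over a Poisson rate, updated one count at a time."""
    log_post = np.zeros_like(rate_grid)          # flat prior
    for k in counts:
        log_post += k * np.log(rate_grid) - rate_grid
        log_post -= log_post.max()               # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

def credible_interval(rate_grid, post, level=0.90):
    """Bounds of the (grid-based) highest-density credible region."""
    order = np.argsort(post)[::-1]
    cum = np.cumsum(post[order])
    kept = order[: np.searchsorted(cum, level) + 1]
    return rate_grid[kept].min(), rate_grid[kept].max()

grid = np.linspace(0.1, 20.0, 400)
counts = [4, 6, 5, 7, 5, 4, 6]          # invented counts per time interval
post = sequential_posterior(counts, grid)
lo, hi = credible_interval(grid, post)
print(lo < sum(counts) / len(counts) < hi)
```

    Because the update is one observation at a time, the interval can be monitored as data arrive, which is the property exploited for portal monitoring above.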

  6. Specificity and non-specificity in RNA–protein interactions

    PubMed Central

    Jankowsky, Eckhard; Harris, Michael E.

    2016-01-01

    Gene expression is regulated by complex networks of interactions between RNAs and proteins. Proteins that interact with RNA have been traditionally viewed as either specific or non-specific; specific proteins interact preferentially with defined RNA sequence or structure motifs, whereas non-specific proteins interact with RNA sites devoid of such characteristics. Recent studies indicate that the binary “specific vs. non-specific” classification is insufficient to describe the full spectrum of RNA–protein interactions. Here, we review new methods that enable quantitative measurements of protein binding to large numbers of RNA variants, and the concepts aimed at describing the resulting binding spectra: affinity distributions, comprehensive binding models and free energy landscapes. We discuss how these new methodologies and associated concepts enable work towards inclusive, quantitative models for specific and non-specific RNA–protein interactions. PMID:26285679

  7. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
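
    One of the calibration approaches mentioned, standard addition, can be illustrated in a few lines (Python; spike levels and signals are invented, assuming a linear detector response):

```python
import numpy as np

# Invented standard-addition series: known spikes added to aliquots of
# the same sample; extrapolating the line to zero signal gives the
# native analyte concentration, compensating for matrix effects.
added = np.array([0.0, 5.0, 10.0, 20.0])          # spike, ug/L
signal = np.array([300.0, 550.0, 800.0, 1300.0])  # detector counts/s

slope, intercept = np.polyfit(added, signal, 1)
native_conc = intercept / slope                   # ug/L in the sample
print(round(native_conc, 2))
```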

  8. Social Comparison and Body Image in Adolescence: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Krayer, A.; Ingledew, D. K.; Iphofen, R.

    2008-01-01

    This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…

  9. Target Scattering Metrics: Model-Model and Model-Data Comparisons

    DTIC Science & Technology

    2017-12-13

    measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons for signals...candidate metrics for model-model comparisons are examined here with a goal to consider raw data prior to its reduction to data products, which may...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. INTRODUCTION Metrics for

  11. Specific energy contributions from competing hydrogen-bonded structures in six polymorphs of phenobarbital.

    PubMed

    Gelbrich, Thomas; Braun, Doris E; Griesser, Ulrich J

    2016-01-01

    In solid state structures of organic molecules, identical sets of H-bond donor and acceptor functions can result in a range of distinct H-bond connectivity modes. Specifically, competing H-bond structures (HBSs) may differ in the quantitative proportion between one-point and multiple-point H-bond connections. For an assessment of such HBSs, the effects of their internal as well as external (packing) interactions need to be taken into consideration. The semi-classical density sums (SCDS-PIXEL) method, which enables the calculation of interaction energies for molecule-molecule pairs, was used to investigate six polymorphs of phenobarbital (Pbtl) with different quantitative proportions of one-point and two-point H-bond connections. The structures of polymorphs V and VI of Pbtl were determined from single crystal data. Two-point H-bond connections are inherently inflexible in their geometry and lie within a small PIXEL energy range (-45.7 to -49.7 kJ mol(-1)). One-point H-bond connections are geometrically less restricted and subsequently show large variations in their dispersion terms and total energies (-23.1 to -40.5 kJ mol(-1)). The comparison of sums of interaction energies in small clusters containing only the strongest intermolecular interactions showed an advantage for compact HBSs with multiple-point connections, whereas alternative HBSs based on one-point connections may enable more favourable overall packing interactions (i.e. V vs. III). Energy penalties associated with experimental intramolecular geometries relative to the global conformational energy minimum were calculated and used to correct total PIXEL energies. The estimated order of stabilities (based on PIXEL energies) is III > I > II > VI > X > V, with a difference of just 1.7 kJ mol(-1) between the three most stable forms. 
    For an analysis of competing HBSs, one has to consider the contributions from internal H-bond and non-H-bond interactions, from the packing of multiple HBS instances and intramolecular energy penalties. A compact HBS based on multiple-point H-bond connections should typically lead to more packing alternatives and ultimately to a larger number of viable low-energy structures than a competing one-point HBS (i.e. dimer vs. catemer). Coulombic interaction energies associated with typical short intermolecular C-H···O contact geometries are small in comparison with dispersion effects associated with the packing of complementary molecular shapes. Graphical abstract: Competing H-bond motifs can differ markedly in their energy contributions.

  12. A 100-Year Review: Methods and impact of genetic selection in dairy cattle-From daughter-dam comparisons to deep learning algorithms.

    PubMed

    Weigel, K A; VanRaden, P M; Norman, H D; Grosu, H

    2017-12-01

    In the early 1900s, breed society herdbooks had been established and milk-recording programs were in their infancy. Farmers wanted to improve the productivity of their cattle, but the foundations of population genetics, quantitative genetics, and animal breeding had not been laid. Early animal breeders struggled to identify genetically superior families using performance records that were influenced by local environmental conditions and herd-specific management practices. Daughter-dam comparisons were used for more than 30 yr and, although genetic progress was minimal, the attention given to performance recording, genetic theory, and statistical methods paid off in future years. Contemporary (herdmate) comparison methods allowed more accurate accounting for environmental factors and genetic progress began to accelerate when these methods were coupled with artificial insemination and progeny testing. Advances in computing facilitated the implementation of mixed linear models that used pedigree and performance data optimally and enabled accurate selection decisions. Sequencing of the bovine genome led to a revolution in dairy cattle breeding, and the pace of scientific discovery and genetic progress accelerated rapidly. Pedigree-based models have given way to whole-genome prediction, and Bayesian regression models and machine learning algorithms have joined mixed linear models in the toolbox of modern animal breeders. Future developments will likely include elucidation of the mechanisms of genetic inheritance and epigenetic modification in key biological pathways, and genomic data will be used with data from on-farm sensors to facilitate precision management on modern dairy farms. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  13. Evolution of Precipitation Structure During the November DYNAMO MJO Event: Cloud-Resolving Model Intercomparison and Cross Validation Using Radar Observations

    NASA Astrophysics Data System (ADS)

    Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong

    2018-04-01

    The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three ground-based, ship-borne, and spaceborne precipitation radars and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar and multi-model framework allows for more stringent model validations. The emphasis is on testing models' ability to simulate subtle differences observed at different radar sites when the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only common features in cloud populations but also subtle variations observed by different radars. The comparisons also revealed common deficiencies in CRM simulations where they underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validations with multiple radars and models also enable quantitative comparisons in CRM sensitivity studies using different large-scale forcing, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests have better correlations than radar/model comparisons, indicating robustness in model performance on this aspect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when the low-resolution surveillance scanning strategy is used.

  14. Quantitative 3D comparison of biofilm imaged by X-ray micro-tomography and two-photon laser scanning microscopy.

    PubMed

    Larue, A E; Swider, P; Duru, P; Daviaud, D; Quintard, M; Davit, Y

    2018-06-21

    Optical imaging techniques for biofilm observation, like laser scanning microscopy, are not applicable when investigating biofilm formation in opaque porous media. X-ray micro-tomography (X-ray CMT) might be an alternative but is limited by the similarity of the X-ray absorption coefficients of the biofilm and aqueous phases. To overcome this difficulty, barium sulphate was used in Davit et al. (2011) to enable high-resolution 3D imaging of biofilm via X-ray CMT. However, this approach lacks comparison with well-established imaging methods, which are known to capture the fine structures of biofilms, as well as uncertainty quantification. Here, we compare two-photon laser scanning microscopy (TPLSM) images of Pseudomonas aeruginosa biofilm grown in glass capillaries against X-ray CMT using an improved protocol where barium sulphate is combined with low-gelling temperature agarose to avoid sedimentation. Calibrated phantoms consisting of mono-dispersed fluorescent and X-ray absorbent beads were used to evaluate the uncertainty associated with our protocol along with three different segmentation techniques, namely hysteresis, watershed and region growing, to determine the bias relative to image binarization. Metrics such as volume, 3D surface area and thickness were measured and comparison of both imaging modalities shows that X-ray CMT of biofilm using our protocol yields an accuracy that is comparable and even better in certain respects than TPLSM, even in a nonporous system that is largely favourable to TPLSM. © 2018 The Authors Journal of Microscopy © 2018 Royal Microscopical Society.

  15. Stereological analysis of bacterial load and lung lesions in nonhuman primates (rhesus macaques) experimentally infected with Mycobacterium tuberculosis.

    PubMed

    Luciw, Paul A; Oslund, Karen L; Yang, Xiao-Wei; Adamson, Lourdes; Ravindran, Resmi; Canfield, Don R; Tarara, Ross; Hirst, Linda; Christensen, Miles; Lerche, Nicholas W; Offenstein, Heather; Lewinsohn, David; Ventimiglia, Frank; Brignolo, Laurie; Wisner, Erik R; Hyde, Dallas M

    2011-11-01

    Infection with Mycobacterium tuberculosis primarily produces a multifocal distribution of pulmonary granulomas in which the pathogen resides. Accordingly, quantitative assessment of the bacterial load and pathology is a substantial challenge in tuberculosis. Such assessments are critical for studies of the pathogenesis and for the development of vaccines and drugs in animal models of experimental M. tuberculosis infection. Stereology enables unbiased quantitation of three-dimensional objects from two-dimensional sections and thus is suited to quantify histological lesions. We have developed a protocol for stereological analysis of the lung in rhesus macaques inoculated with a pathogenic clinical strain of M. tuberculosis (Erdman strain). These animals exhibit a pattern of infection and tuberculosis similar to that of naturally infected humans. Conditions were optimized for collecting lung samples in a nonbiased, random manner. Bacterial load in these samples was assessed by a standard plating assay, and granulomas were graded and enumerated microscopically. Stereological analysis provided quantitative data that supported a significant correlation between bacterial load and lung granulomas. Thus this stereological approach enables a quantitative, statistically valid analysis of the impact of M. tuberculosis infection in the lung and will serve as an essential tool for objectively comparing the efficacy of drugs and vaccines.
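
    The unbiased-estimation principle underlying such stereology can be illustrated with point counting (Delesse principle): under systematic uniform random sampling, the fraction of test points hitting granuloma profiles on 2-D sections estimates the granuloma volume fraction. A hypothetical Python example with invented counts:

```python
# Invented point counts on five systematically sampled lung sections.
points_on_granuloma = [12, 9, 15, 7, 11]
points_on_lung      = [200, 180, 220, 190, 210]

# Delesse principle: area (hence volume) fraction = point fraction.
vv = sum(points_on_granuloma) / sum(points_on_lung)
print(round(vv, 4))
```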

  16. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  17. Apparatus and method for identification of matrix materials in which transuranic elements are embedded using thermal neutron capture gamma-ray emission

    DOEpatents

    Close, D.A.; Franks, L.A.; Kocimski, S.M.

    1984-08-16

    An invention is described that enables simultaneous quantitative identification of the matrix materials in which fertile and fissile nuclides are embedded, along with quantitative assay of the fertile and fissile materials. The invention also enables corrections for any absorption of neutrons by the matrix materials and by the measurement apparatus, through measurement of the prompt and delayed neutron flux emerging from a sample after the sample is interrogated by simultaneously applied neutrons and gamma radiation. High energy electrons are directed at a first target to produce gamma radiation. A second target receives the resulting pulsed gamma radiation and produces neutrons from the interaction with the gamma radiation. These neutrons are slowed by a moderator surrounding the sample and bathe the sample uniformly, generating second gamma radiation in the interaction. The gamma radiation is then resolved and quantitatively detected, providing a spectroscopic signature of the constituent elements contained in the matrix and in the materials within the vicinity of the sample. (LEW)

  18. Self-powered integrated microfluidic point-of-care low-cost enabling (SIMPLE) chip

    PubMed Central

    Yeh, Erh-Chia; Fu, Chi-Cheng; Hu, Lucy; Thakur, Rohan; Feng, Jeffrey; Lee, Luke P.

    2017-01-01

    Portable, low-cost, and quantitative nucleic acid detection is desirable for point-of-care diagnostics; however, current polymerase chain reaction testing often requires time-consuming multiple steps and costly equipment. We report an integrated microfluidic diagnostic device capable of on-site quantitative nucleic acid detection directly from the blood without separate sample preparation steps. First, we prepatterned the amplification initiator [magnesium acetate (MgOAc)] on the chip to enable digital nucleic acid amplification. Second, a simplified sample preparation step is demonstrated, where the plasma is separated autonomously into 224 microwells (100 nl per well) without any hemolysis. Furthermore, self-powered microfluidic pumping without any external pumps, controllers, or power sources is accomplished by an integrated vacuum battery on the chip. This simple chip allows rapid quantitative digital nucleic acid detection directly from human blood samples (10 to 10⁵ copies of methicillin-resistant Staphylococcus aureus DNA per microliter, ~30 min, via isothermal recombinase polymerase amplification). These autonomous, portable, lab-on-chip technologies provide promising foundations for future low-cost molecular diagnostic assays. PMID:28345028
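Digital assays of this kind typically convert the fraction of positive partitions into a template concentration via a Poisson (most probable number) correction. The sketch below applies that standard correction assuming 224 wells of 100 nl each, as described in the record; it illustrates the general principle, not necessarily the authors' exact analysis:

```python
import math

def copies_per_ul(positive_wells, total_wells=224, well_volume_ul=0.1):
    """Poisson (most probable number) estimate of template concentration
    from the fraction of positive partitions in a digital assay."""
    frac = positive_wells / total_wells
    if frac >= 1.0:
        raise ValueError("all wells positive: above the dynamic range")
    mean_copies_per_well = -math.log(1.0 - frac)  # lambda of the Poisson
    return mean_copies_per_well / well_volume_ul

# Half of the 224 wells positive -> ln(2) / 0.1 uL, about 6.93 copies/uL
print(round(copies_per_ul(112), 2))  # 6.93
```

The correction matters because at higher concentrations a single well often receives more than one template copy, so simply counting positive wells would underestimate the load.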

  19. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  20. MRI-guided attenuation correction in whole-body PET/MR: assessment of the effect of bone attenuation.

    PubMed

    Akbarzadeh, A; Ay, M R; Ahmadian, A; Alam, N Riahi; Zaidi, H

    2013-02-01

    Hybrid PET/MRI presents many advantages in comparison with its counterpart PET/CT in terms of improved soft-tissue contrast, decrease in radiation exposure, and truly simultaneous and multi-parametric imaging capabilities. However, the lack of well-established methodology for MR-based attenuation correction is hampering further development and wider acceptance of this technology. We assess the impact of ignoring bone attenuation and using different tissue classes for generation of the attenuation map on the accuracy of attenuation correction of PET data. This work was performed using simulation studies based on the XCAT phantom and clinical input data. For the latter, PET and CT images of patients were used as input for the analytic simulation model using realistic activity distributions where CT-based attenuation correction was utilized as reference for comparison. For both phantom and clinical studies, the reference attenuation map was classified into various numbers of tissue classes to produce three (air, soft tissue and lung), four (air, lungs, soft tissue and cortical bones) and five (air, lungs, soft tissue, cortical bones and spongeous bones) class attenuation maps. The phantom studies demonstrated that ignoring bone increases the relative error by up to 6.8% in the body and up to 31.0% for bony regions. Likewise, the simulated clinical studies showed that the mean relative error reached 15% for lesions located in the body and 30.7% for lesions located in bones, when neglecting bones. These results demonstrate an underestimation of about 30% of tracer uptake when neglecting bone, which in turn imposes substantial loss of quantitative accuracy for PET images produced by hybrid PET/MRI systems. Considering bones in the attenuation map will considerably improve the accuracy of MR-guided attenuation correction in hybrid PET/MR to enable quantitative PET imaging on hybrid PET/MR technologies.
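The tissue-class attenuation maps described above can be illustrated with a simple lookup that assigns one linear attenuation coefficient per class. The HU thresholds and 511 keV coefficients below are illustrative assumptions for this sketch, not values taken from the study:

```python
def tissue_class_mu(hu, n_classes=4):
    """Map a CT value (HU) to a 511 keV linear attenuation
    coefficient (cm^-1) using 3, 4, or 5 tissue classes."""
    if hu < -950:
        return 0.0                    # air
    if hu < -300:
        return 0.0224                 # lung
    if n_classes >= 4 and hu > 300:   # bone exists only in 4/5-class maps
        if n_classes == 5 and hu < 700:
            return 0.110              # spongeous bone
        return 0.151                  # cortical bone
    return 0.096                      # soft tissue

hu_samples = (-1000, -500, 0, 500, 1000)
print([tissue_class_mu(h, 3) for h in hu_samples])  # bone collapses to soft tissue
print([tissue_class_mu(h, 5) for h in hu_samples])  # bone kept as two classes
```

Comparing the two printed maps makes the source of the reported bias concrete: in the 3-class map every bone voxel is assigned the soft-tissue coefficient, underestimating attenuation and hence tracer uptake in and near bone.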

  1. Novel graphene-based biosensor for early detection of Zika virus infection.

    PubMed

    Afsahi, Savannah; Lerner, Mitchell B; Goldstein, Jason M; Lee, Joo; Tang, Xiaoling; Bagarozzi, Dennis A; Pan, Deng; Locascio, Lauren; Walker, Amy; Barron, Francie; Goldsmith, Brett R

    2018-02-15

    We have developed a cost-effective and portable graphene-enabled biosensor to detect Zika virus with a highly specific immobilized monoclonal antibody. Field Effect Biosensing (FEB) with monoclonal antibodies covalently linked to graphene enables real-time, quantitative detection of native Zika viral (ZIKV) antigens. The percent change in capacitance in response to doses of antigen (ZIKV NS1) coincides with levels of clinical significance with detection of antigen in buffer at concentrations as low as 450pM. Potential diagnostic applications were demonstrated by measuring Zika antigen in a simulated human serum. Selectivity was validated using Japanese Encephalitis NS1, a homologous and potentially cross-reactive viral antigen. Further, the graphene platform can simultaneously provide the advanced quantitative data of nonclinical biophysical kinetics tools, making it adaptable to both clinical research and possible diagnostic applications. The speed, sensitivity, and selectivity of this first-of-its-kind graphene-enabled Zika biosensor make it an ideal candidate for development as a medical diagnostic test. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  3. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Evaluating HDR photos using Web 2.0 technology

    NASA Astrophysics Data System (ADS)

    Qiu, Guoping; Mei, Yujie; Duan, Jiang

    2011-01-01

    High dynamic range (HDR) photography is an emerging technology that has the potential to dramatically enhance the visual quality and realism of digital photos. One of the key technical challenges of HDR photography is displaying HDR photos on conventional devices through tone mapping or dynamic range compression. Although many different tone mapping techniques have been developed in recent years, evaluating tone mapping operators proves to be extremely difficult. Web 2.0, social media, and crowd-sourcing are emerging Internet technologies which can be harnessed to harvest the brain power of the masses to solve difficult problems in science, engineering, and business. Paired comparison is used in the scientific study of preferences and attitudes and has been shown to be capable of obtaining an interval-scale ordering of items along a psychometric dimension such as preference or importance. In this paper, we exploit these technologies for evaluating HDR tone mapping algorithms. We have developed a Web 2.0 style system that enables Internet users from anywhere to evaluate tone mapped HDR photos at any time. We adopt a simple paired comparison protocol: Internet users are presented with a pair of tone mapped images and are simply asked to select the one that they think is better or click a "no difference" button. These user inputs are collected on the web server and analyzed by a rank aggregation algorithm which ranks the tone mapped photos according to the votes they received. We present experimental results which demonstrate that the emerging Internet technologies can be exploited as a new paradigm for evaluating HDR tone mapping algorithms. The advantages of this approach include the potential of collecting large user inputs under a variety of viewing environments rather than limited user participation under controlled laboratory environments, thus enabling more robust and reliable quality assessment. We also present data analysis to correlate user-generated qualitative indices with quantitative image statistics, which may provide useful guidance for developing better tone mapping operators.
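The voting-and-ranking step described above can be sketched as a simple win-fraction aggregation. This is a minimal illustration with hypothetical image names; the paper's actual rank aggregation algorithm may differ:

```python
from collections import defaultdict

def rank_from_pairs(votes):
    """Rank items from pairwise votes.
    votes: list of (image_a, image_b, winner) tuples, where winner is
    image_a, image_b, or None for a "no difference" click."""
    wins = defaultdict(float)
    games = defaultdict(int)
    for a, b, winner in votes:
        games[a] += 1
        games[b] += 1
        if winner is None:        # split a tie evenly between the pair
            wins[a] += 0.5
            wins[b] += 0.5
        else:
            wins[winner] += 1.0
    # Score each item by the fraction of its comparisons it won
    return sorted(games, key=lambda x: wins[x] / games[x], reverse=True)

votes = [("tm1", "tm2", "tm1"), ("tm1", "tm3", "tm1"),
         ("tm2", "tm3", None), ("tm2", "tm1", "tm1")]
print(rank_from_pairs(votes))  # ['tm1', 'tm3', 'tm2']
```

Win-fraction scoring is the simplest aggregation; probabilistic models such as Bradley-Terry or Thurstone scaling would additionally place the items on an interval scale, as the abstract's reference to psychometric scaling suggests.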

  5. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  6. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    PubMed

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing a significant difference (p = .009 for minimal vs. mild, p < 0.001 for other comparisons). Spearman's correlation test showed a strong, significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% for those in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not different in other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
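The per-voxel BPE equation quoted in the record is straightforward to apply; a minimal pure-Python sketch with hypothetical signal intensities:

```python
def bpe_percent(post_si, baseline_si):
    """Per-voxel BPE (%): [(SI at ~1 min 30 s post-contrast - baseline SI)
    / baseline SI] * 100, as in the quoted equation."""
    return [100.0 * (p - b) / b for p, b in zip(post_si, baseline_si)]

baseline = [200.0, 150.0, 180.0]   # hypothetical baseline signal intensities
post = [260.0, 210.0, 234.0]       # hypothetical post-contrast intensities
voxel_bpe = bpe_percent(post, baseline)
mean_bpe = sum(voxel_bpe) / len(voxel_bpe)
print(round(mean_bpe, 1))  # 33.3
```

Averaging (or taking percentiles of) the per-voxel values over the whole breast yields summary statistics like the group means and the 10th/90th percentile values reported in the abstract.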

  7. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
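As one concrete instance of the agreement studies mentioned above (a design without a reference standard), the bias and 95% limits of agreement between two algorithms can be computed in a Bland-Altman style. The measurements below are hypothetical:

```python
import statistics as st

def limits_of_agreement(x, y):
    """Bland-Altman-style agreement between two QIB algorithms measuring
    the same cases: mean difference (bias) and 95% limits of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

algo_a = [10.2, 11.0, 9.8, 10.5, 10.9]   # hypothetical tumor volumes, mL
algo_b = [10.0, 10.6, 9.9, 10.2, 10.4]
bias, (lo, hi) = limits_of_agreement(algo_a, algo_b)
print(round(bias, 2))  # 0.26
```

In practice one would also plot the differences against the case means to check that agreement does not vary with measurement magnitude, which is part of what the disaggregate approaches in the paper address.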

  8. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  9. A maize map standard with sequenced core markers, grass genome reference points and 932 expressed sequence tagged sites (ESTs) in a 1736-locus map.

    PubMed Central

    Davis, G L; McMullen, M D; Baysdorfer, C; Musket, T; Grant, D; Staebell, M; Xu, G; Polacco, M; Koster, L; Melia-Hancock, S; Houchins, K; Chao, S; Coe, E H

    1999-01-01

    We have constructed a 1736-locus maize genome map containing 1156 loci probed by cDNAs, 545 probed by random genomic clones, 16 by simple sequence repeats (SSRs), 14 by isozymes, and 5 by anonymous clones. Sequence information is available for 56% of the loci with 66% of the sequenced loci assigned functions. A total of 596 new ESTs were mapped from a B73 library of 5-wk-old shoots. The map contains 237 loci probed by barley, oat, wheat, rice, or tripsacum clones, which serve as grass genome reference points in comparisons between maize and other grass maps. Ninety core markers selected for low copy number, high polymorphism, and even spacing along the chromosome delineate the 100 bins on the map. The average bin size is 17 cM. Use of bin assignments enables comparison among different maize mapping populations and experiments including those involving cytogenetic stocks, mutants, or quantitative trait loci. Integration of nonmaize markers in the map extends the resources available for gene discovery beyond the boundaries of maize mapping information into the expanse of map, sequence, and phenotype information from other grass species. This map provides a foundation for numerous basic and applied investigations including studies of gene organization, gene and genome evolution, targeted cloning, and dissection of complex traits. PMID:10388831

  10. Analytical model of rotor wake aerodynamics in ground effect

    NASA Technical Reports Server (NTRS)

    Saberi, H. A.

    1983-01-01

    The model and the computer program developed provide the velocity, location, and circulation of the tip vortices of a two-blade helicopter in and out of ground effect. Comparison of the theoretical results with some experimental measurements for the location of the wake indicates excellent accuracy in the vicinity of the rotor and a fair amount of accuracy far from it. Having the location of the wake at all times enables us to compute the history of the velocity and the location of any point in the flow. The main goal of our study, induced velocity at the rotor, can also be calculated in addition to streamlines and streak lines. Since the wake location close to the rotor is known more accurately than at other places, the calculated induced velocity over the disc should be a good estimate of the real induced velocity, with the exception of the blade location, because each blade was replaced only by a vortex line. Because no experimental measurements of the wake close to the ground were available to us, quantitative evaluation of the theoretical wake was not possible; qualitatively, however, we have been able to show excellent agreement. Comparison with flow visualization indicates that the location of the ground vortex is estimated very well. Also, the flow field in hover is well represented.

  11. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.

  12. Global observations of tropospheric BrO columns using GOME-2 satellite data

    NASA Astrophysics Data System (ADS)

    Theys, N.; van Roozendael, M.; Hendrick, F.; Yang, X.; de Smedt, I.; Richter, A.; Begoin, M.; Errera, Q.; Johnston, P. V.; Kreher, K.; de Mazière, M.

    2011-02-01

    Measurements from the GOME-2 satellite instrument have been analyzed for tropospheric BrO using a residual technique that combines measured BrO columns and estimates of the stratospheric BrO content from a climatological approach driven by O3 and NO2 observations. Comparisons between the GOME-2 results and BrO vertical columns derived from correlative ground-based and SCIAMACHY nadir observations present a good level of consistency. We show that the adopted technique enables separation of stratospheric and tropospheric fractions of the measured total BrO columns and allows quantitative study of the BrO plumes in polar regions. While some satellite observed plumes of enhanced BrO can be explained by stratospheric descending air, we show that most BrO hotspots are of tropospheric origin, although they are often associated with regions with low tropopause heights as well. Elaborating on simulations using the p-TOMCAT tropospheric chemical transport model, this result is found to be consistent with the mechanism of bromine release through sea salt aerosol production during blowing snow events. No definitive conclusion can be drawn however on the importance of blowing snow sources in comparison to other bromine release mechanisms. Outside polar regions, evidence is provided for a global tropospheric BrO background with columns of 1-3 × 10¹³ molec cm⁻², consistent with previous estimates.

  13. A Brief Review of Facial Emotion Recognition Based on Visual Information.

    PubMed

    Ko, Byoung Chul

    2018-01-30

    Facial emotion recognition (FER) is an important topic in the fields of computer vision and artificial intelligence owing to its significant academic and commercial potential. Although FER can be conducted using multiple sensors, this review focuses on studies that exclusively use facial images, because visual expressions are one of the main information channels in interpersonal communication. This paper provides a brief review of research in the field of FER conducted over the past decades. First, conventional FER approaches are described along with a summary of the representative categories of FER systems and their main algorithms. Deep-learning-based FER approaches using deep networks enabling "end-to-end" learning are then presented. This review also focuses on an up-to-date hybrid deep-learning approach combining a convolutional neural network (CNN) for the spatial features of an individual frame and long short-term memory (LSTM) for temporal features of consecutive frames. In the later part of this paper, a brief review of publicly available evaluation metrics is given, and a comparison with benchmark results, which are a standard for a quantitative comparison of FER research, is described. This review can serve as a brief guidebook to newcomers in the field of FER, providing basic knowledge and a general understanding of the latest state-of-the-art studies, as well as to experienced researchers looking for productive directions for future work.

  14. Kinematic Analysis of a Six-Degrees-of-Freedom Model Based on ISB Recommendation: A Repeatability Analysis and Comparison with Conventional Gait Model.

    PubMed

    Żuk, Magdalena; Pezowicz, Celina

    2015-01-01

    Objective. The purpose of the present work was to assess the validity of a six-degrees-of-freedom gait analysis model based on the ISB recommendation on definitions of joint coordinate systems (ISB 6DOF) through a quantitative comparison with the Helen Hayes model (HH) and repeatability assessment. Methods. Four healthy subjects were analysed with both marker sets: an HH marker set and four marker clusters in ISB 6DOF. A navigated pointer was used to indicate the anatomical landmark position in the cluster reference system according to the ISB recommendation. Three gait cycles were selected from the data collected simultaneously for the two marker sets. Results. Two protocols showed good intertrial repeatability, which, apart from pelvic rotation, did not exceed 2°. The greatest differences between protocols were observed in the transverse plane as well as for knee angles. Knee internal/external rotation revealed the lowest subject-to-subject and interprotocol repeatability and inconsistent patterns for both protocols. Knee range of movement in the transverse plane was overestimated for the HH set (the mean is 34°), which could indicate the cross-talk effect. Conclusions. The ISB 6DOF anatomically based protocol enabled full 3D kinematic description of joints according to the current standard with clinically acceptable intertrial repeatability and minimal equipment requirements.

  15. Using Public Data for Comparative Proteome Analysis in Precision Medicine Programs.

    PubMed

    Hughes, Christopher S; Morin, Gregg B

    2018-03-01

    Maximizing the clinical utility of information obtained in longitudinal precision medicine programs would benefit from robust comparative analyses to known information to assess biological features of patient material toward identifying the underlying features driving their disease phenotype. Herein, the potential for utilizing publicly deposited mass-spectrometry-based proteomics data to perform inter-study comparisons of cell-line or tumor-tissue materials is investigated. To investigate the robustness of comparison between MS-based proteomics studies carried out with different methodologies, deposited data representative of label-free (MS1) and isobaric tagging (MS2 and MS3 quantification) are utilized. In-depth quantitative proteomics data acquired from analysis of ovarian cancer cell lines revealed the robust recapitulation of observable gene expression dynamics between individual studies carried out using significantly different methodologies. The observed signatures enable robust inter-study clustering of cell line samples. In addition, the ability to classify and cluster tumor samples based on observed gene expression trends when using a single patient sample is established. With this analysis, relevant gene expression dynamics are obtained from a single patient tumor, in the context of a precision medicine analysis, by leveraging a large cohort of repository data as a comparator. Together, these data establish the potential for state-of-the-art MS-based proteomics data to serve as resources for robust comparative analyses in precision medicine applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. A Brief Review of Facial Emotion Recognition Based on Visual Information

    PubMed Central

    2018-01-01

    Facial emotion recognition (FER) is an important topic in the fields of computer vision and artificial intelligence owing to its significant academic and commercial potential. Although FER can be conducted using multiple sensors, this review focuses on studies that exclusively use facial images, because visual expressions are one of the main information channels in interpersonal communication. This paper provides a brief review of research in the field of FER conducted over the past decades. First, conventional FER approaches are described along with a summary of the representative categories of FER systems and their main algorithms. Deep-learning-based FER approaches using deep networks enabling “end-to-end” learning are then presented. This review also focuses on an up-to-date hybrid deep-learning approach combining a convolutional neural network (CNN) for the spatial features of an individual frame and long short-term memory (LSTM) for temporal features of consecutive frames. In the later part of this paper, a brief review of publicly available evaluation metrics is given, and a comparison with benchmark results, which are a standard for a quantitative comparison of FER research, is described. This review can serve as a brief guidebook to newcomers in the field of FER, providing basic knowledge and a general understanding of the latest state-of-the-art studies, as well as to experienced researchers looking for productive directions for future work. PMID:29385749

  17. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  18. Dissecting the pathobiology of altered MRI signal in amyotrophic lateral sclerosis: A post mortem whole brain sampling strategy for the integration of ultra-high-field MRI and quantitative neuropathology.

    PubMed

    Pallebage-Gamarallage, Menuka; Foxley, Sean; Menke, Ricarda A L; Huszar, Istvan N; Jenkinson, Mark; Tendler, Benjamin C; Wang, Chaoyue; Jbabdi, Saad; Turner, Martin R; Miller, Karla L; Ansorge, Olaf

    2018-03-13

    Amyotrophic lateral sclerosis (ALS) is a clinically and histopathologically heterogeneous neurodegenerative disorder, in which therapy is hindered by the rapid progression of disease and lack of biomarkers. Magnetic resonance imaging (MRI) has demonstrated its potential for detecting the pathological signature and tracking disease progression in ALS. However, the microstructural and molecular pathological substrate is poorly understood and generally defined histologically. One route to understanding and validating the pathophysiological correlates of MRI signal changes in ALS is to directly compare MRI to histology in post mortem human brains. The article delineates a universal whole brain sampling strategy of pathologically relevant grey matter (cortical and subcortical) and white matter tracts of interest suitable for histological evaluation and direct correlation with MRI. A standardised systematic sampling strategy that was compatible with co-registration of images across modalities was established for regions representing phosphorylated 43-kDa TAR DNA-binding protein (pTDP-43) patterns that were topographically recognisable with defined neuroanatomical landmarks. Moreover, tractography-guided sampling facilitated accurate delineation of white matter tracts of interest. A digital photography pipeline at various stages of sampling and histological processing was established to account for structural deformations that might impact alignment and registration of histological images to MRI volumes. Combined with quantitative digital histology image analysis, the proposed sampling strategy is suitable for routine implementation in a high-throughput manner for acquisition of large-scale histology datasets. Proof of concept was determined in the spinal cord of an ALS patient where multiple MRI modalities (T1, T2, FA and MD) demonstrated sensitivity to axonal degeneration and associated heightened inflammatory changes in the lateral corticospinal tract. 
Furthermore, qualitative comparison of R2* and susceptibility maps in the motor cortex of 2 ALS patients demonstrated varying degrees of hyperintense signal changes compared to a control. Upon histological evaluation of the same region, intensity of signal changes in both modalities appeared to correspond primarily to the degree of microglial activation. The proposed post mortem whole brain sampling methodology enables the accurate intraindividual study of pathological propagation and comparison with quantitative MRI data, to more fully understand the relationship of imaging signal changes with underlying pathophysiology in ALS.

  19. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  20. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. DMD-based quantitative phase microscopy and optical diffraction tomography

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie

    2018-02-01

    Digital micromirror devices (DMDs), which offer high speed and a high degree of freedom in steering light illuminations, have been increasingly applied to optical microscopy systems in recent years. Lately, we introduced DMDs into digital holography to enable new imaging modalities and break existing imaging limitations. In this paper, we will first present our progress in using DMDs for demonstrating laser-illumination Fourier ptychographic microscopy (FPM) with shot-noise-limited detection. After that, we will present a novel common-path quantitative phase microscopy (QPM) system based on using a DMD. Building on those early developments, a DMD-based high speed optical diffraction tomography (ODT) system has been recently demonstrated, and the results will also be presented. This ODT system is able to achieve video-rate 3D refractive-index imaging, which can potentially enable observations of high-speed 3D sample structural changes.

  2. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  3. Electromagnetic Induction: A Computer-Assisted Experiment

    ERIC Educational Resources Information Center

    Fredrickson, J. E.; Moreland, L.

    1972-01-01

    By using minimal equipment it is possible to demonstrate Faraday's Law. An electronic desk calculator enables sophomore students to solve a difficult mathematical expression for the induced EMF. Polaroid pictures of the plot of induced EMF, together with the computer facility, enable students to make comparisons. (PS)

  4. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the available data for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
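The index-system idea described above can be sketched as a weighted combination of sub-indices. This is a hedged illustration only: the sub-index names follow the abstract, but the weights and the 0-10 scoring scale are invented for the example, not the authors' published values.

```python
# Illustrative sketch of a qualitative risk index: weighted sum of a
# causation index, an inherent risk index, and a consequence index.
# Weights (0.3, 0.3, 0.4) and the 0-10 scale are assumptions for the demo.

def qualitative_risk(causation, inherent, consequence,
                     weights=(0.3, 0.3, 0.4)):
    """Combine sub-index scores into a single qualitative risk value."""
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must be normalized
    scores = (causation, inherent, consequence)
    return sum(w * s for w, s in zip(weights, scores))

# Example pipeline segment scored on a hypothetical 0-10 scale.
risk = qualitative_risk(6.0, 5.0, 8.0)
```

A real index system would derive the weights from expert elicitation (e.g. an analytic hierarchy process) rather than fixing them by hand.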

  5. Comparative study of the dynamics of lipid membrane phase decomposition in experiment and simulation.

    PubMed

    Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas

    2013-06-25

    Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained from numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
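The comparison metric named in the abstract, the structure factor of the domain pattern, can be computed directly from the composition field. The following is a minimal one-dimensional sketch (the study works with two-dimensional domain images): the structure factor S(k) = |φ̂(k)|²/N of a mean-centered field, with the dominant mode indicating the characteristic domain size. The square-wave test pattern is an invented toy input.

```python
import cmath

def structure_factor(field):
    """Discrete 1-D structure factor S(k) = |phi_k|^2 / N of a field,
    computed from the mean-centered values via a direct DFT."""
    n = len(field)
    mean = sum(field) / n
    centered = [x - mean for x in field]
    S = []
    for k in range(n):
        phi_k = sum(centered[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                    for j in range(n))
        S.append(abs(phi_k) ** 2 / n)
    return S

# Toy two-phase pattern: one "ordered" and one "disordered" domain.
# Its characteristic length is the full box, so the spectrum should
# peak at the lowest nonzero wavenumber k = 1.
pattern = [1.0] * 8 + [-1.0] * 8
S = structure_factor(pattern)
k_star = max(range(1, len(S) // 2 + 1), key=lambda k: S[k])
```

Tracking how `k_star` (equivalently, the mean domain diameter ∝ 1/k_star) evolves in time is what allows the growth regimes of simulation and experiment to be compared quantitatively.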

  6. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method ...

  7. Employment from Solar Energy: A Bright but Partly Cloudy Future.

    ERIC Educational Resources Information Center

    Smeltzer, K. K.; Santini, D. J.

    A comparison of quantitative and qualitative employment effects of solar and conventional systems can prove the increased employment postulated as one of the significant secondary benefits of a shift from conventional to solar energy use. Current quantitative employment estimates show solar technology-induced employment to be generally greater…

  8. Using Facebook as a LMS?

    ERIC Educational Resources Information Center

    Arabacioglu, Taner; Akar-Vural, Ruken

    2014-01-01

    The main purpose of this research was to compare the communication media according to effective teaching. For this purpose, in the research, the mixed method, including quantitative and qualitative data collecting techniques, was applied. For the quantitative part of the research, the static group comparison design was implemented as one of the…

  9. Single Laboratory Comparison of Quantitative Real-time PCR Assays for the Detection of Fecal Pollution

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) assays available to detect and enumerate fecal pollution in ambient waters. Each assay employs distinct primers and probes that target different rRNA genes and microorganisms leading to potential variations in concentration es...

  10. Comparison of genetic diversity and population structure of Pacific Coast whitebark pine across multiple markers

    Treesearch

    Andrew D. Bower; Bryce A. Richardson; Valerie Hipkins; Regina Rochefort; Carol Aubry

    2011-01-01

    Analysis of "neutral" molecular markers and "adaptive" quantitative traits are common methods of assessing genetic diversity and population structure. Molecular markers typically reflect the effects of demographic and stochastic processes but are generally assumed to not reflect natural selection. Conversely, quantitative (or "adaptive")...

  11. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  12. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Owing to the large amount of data, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data, using a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusions; and comparison of different scintigrams for the same patient. It comprises four sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We introduced into the computer system appropriate information, "logical paths", that use "IF ... THEN" rules. The software executes these rules to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), applying our uptake cutoffs to reach the automatic conclusions. For these reasons, our computer-based system can be considered a true "expert system".
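The "IF ... THEN" rule style described can be sketched as follows. This is a hypothetical illustration of the rule-based approach only: the 50%/80% uptake cutoffs and the category labels are invented for the example, not the cutoffs used by the authors' system.

```python
# Hypothetical sketch of rule-based classification of a myocardial region
# from relative thallium uptake (% of the maximal regional count) at stress
# and redistribution. Thresholds are illustrative assumptions.

def classify_region(stress_uptake: float, redistribution_uptake: float) -> str:
    # IF stress uptake is high THEN perfusion is normal.
    if stress_uptake >= 80:
        return "normal perfusion"
    # IF uptake stays low in both phases THEN the defect is fixed (scar).
    if stress_uptake < 50 and redistribution_uptake < 50:
        return "fixed defect (probable scar)"
    # IF uptake recovers at redistribution THEN the defect is reversible.
    if redistribution_uptake - stress_uptake >= 10:
        return "reversible defect (ischemia)"
    return "partially reversible defect"
```

A full system would apply such rules per region, per projection, and per phase (including re-injection), then aggregate the regional labels into the automatic conclusion.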

  13. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    PubMed

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
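The difference between the two calculation models reduces to which residues are counted as dye-binding sites. A minimal sketch, using an arbitrary toy sequence rather than any real protein standard:

```python
# Sketch of the two residue-counting models discussed: M1 counts Arg and Lys
# as Coomassie G-250 binding residues; M2 additionally counts His.
# The example sequence is an invented toy, not a real standard protein.

def dye_binding_residues(sequence: str, include_his: bool) -> int:
    """Count residues expected to bind the anionic dye (one-letter codes)."""
    targets = set("RK") | ({"H"} if include_his else set())
    return sum(1 for aa in sequence.upper() if aa in targets)

example = "MKHLRRAAKHKGL"  # arbitrary toy sequence
m1 = dye_binding_residues(example, include_his=False)  # Arg + Lys
m2 = dye_binding_residues(example, include_his=True)   # Arg + Lys + His
```

Because M2 counts a superset of M1's residues, His-rich proteins are where the two models diverge most, which is consistent with the reported overestimation behavior of M1 against standards.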

  14. Risk manager formula for success: Influencing decision making.

    PubMed

    Midgley, Mike

    2017-10-01

    Providing the ultimate decision makers with a quantitative risk analysis based on thoughtful assessment by the organization's experts enables an efficient decision. © 2017 American Society for Healthcare Risk Management of the American Hospital Association.

  15. Reasoning strategies with rational numbers revealed by eye tracking.

    PubMed

    Plummer, Patrick; DeWolf, Melissa; Bassok, Miriam; Gordon, Peter C; Holyoak, Keith J

    2017-07-01

    Recent research has begun to investigate the impact of different formats for rational numbers on the processes by which people make relational judgments about quantitative relations. DeWolf, Bassok, and Holyoak (Journal of Experimental Psychology: General, 144(1), 127-150, 2015) found that accuracy on a relation identification task was highest when fractions were presented with countable sets, whereas accuracy was relatively low for all conditions where decimals were presented. However, it is unclear what processing strategies underlie these disparities in accuracy. We report an experiment that used eye-tracking methods to externalize the strategies that are evoked by different types of rational numbers for different types of quantities (discrete vs. continuous). Results showed that eye-movement behavior during the task was jointly determined by image and number format. Discrete images elicited a counting strategy for both fractions and decimals, but this strategy led to higher accuracy only for fractions. Continuous images encouraged magnitude estimation and comparison, but to a greater degree for decimals than fractions. This strategy led to decreased accuracy for both number formats. By analyzing participants' eye movements when they viewed a relational context and made decisions, we were able to obtain an externalized representation of the strategic choices evoked by different ontological types of entities and different types of rational numbers. Our findings using eye-tracking measures enable us to go beyond previous studies based on accuracy data alone, demonstrating that quantitative properties of images and the different formats for rational numbers jointly influence strategies that generate eye-movement behavior.

  16. Quantification of Peptides from Immunoglobulin Constant and Variable Regions by Liquid Chromatography-Multiple Reaction Monitoring Mass Spectrometry for Assessment of Multiple Myeloma Patients

    PubMed Central

    Remily-Wood, Elizabeth R.; Benson, Kaaron; Baz, Rachid C.; Chen, Y. Ann; Hussein, Mohamad; Hartley-Brown, Monique A.; Sprung, Robert W.; Perez, Brianna; Liu, Richard Z.; Yoder, Sean; Teer, Jamie; Eschrich, Steven A.; Koomen, John M.

    2014-01-01

    Purpose Quantitative mass spectrometry assays for immunoglobulins (Igs) are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, e.g. multiple myeloma. Experimental design Using LC-MS/MS data, Ig constant region peptides and transitions were selected for liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM). Quantitative assays were used to assess Igs in serum from 83 patients. Results LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1–4, IgA1–2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 multiple myeloma cell line and two MM patients. Conclusions and Clinical Relevance LC-MRM assays targeting constant region peptides determine the type and isoform of the involved immunoglobulin and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher interassay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. PMID:24723328

  17. Quantification of peptides from immunoglobulin constant and variable regions by LC-MRM MS for assessment of multiple myeloma patients.

    PubMed

    Remily-Wood, Elizabeth R; Benson, Kaaron; Baz, Rachid C; Chen, Y Ann; Hussein, Mohamad; Hartley-Brown, Monique A; Sprung, Robert W; Perez, Brianna; Liu, Richard Z; Yoder, Sean J; Teer, Jamie K; Eschrich, Steven A; Koomen, John M

    2014-10-01

    Quantitative MS assays for Igs are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, for example, multiple myeloma (MM). Using LC-MS/MS data, Ig constant region peptides, and transitions were selected for LC-MRM MS. Quantitative assays were used to assess Igs in serum from 83 patients. RNA sequencing and peptide-based LC-MRM are used to define peptides for quantification of the disease-specific Ig. LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 MM cell line and two MM patients. LC-MRM assays targeting constant region peptides determine the type and isoform of the involved Ig and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher inter-assay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Analysis of the quantitative dermatoglyphics of the digito-palmar complex in patients with multiple sclerosis.

    PubMed

    Supe, S; Milicić, J; Pavićević, R

    1997-06-01

    Recent studies on the etiopathogenesis of multiple sclerosis (MS) all point to a polygenic predisposition for this illness. The so-called "MS trait" determines the reactivity of the immunological system to ecological factors. The development of glyphological science and the study of the characteristics of the digito-palmar dermatoglyphic complex (established to be polygenically determined characteristics) enable a better insight into genetic development during early embryogenesis. The aim of this study was to identify differences in the dermatoglyphics of the digito-palmar complex between a group with multiple sclerosis and comparable, phenotypically healthy groups of both sexes. The study is based on the analysis of 18 quantitative characteristics of the digito-palmar complex in 125 patients with multiple sclerosis (41 males and 84 females) in comparison to a group of 400 phenotypically healthy subjects (200 males and 200 females). The analysis revealed a statistically significant decrease in the number of digital and palmar ridges, as well as lower values of atd angles, in the MS patients of both sexes. The main discriminators were the characteristic palmar dermatoglyphics; discriminant analysis correctly classified over 80% of the examinees, exceeding statistical significance. The results of this study suggest that patients with MS may be discriminated from the phenotypically healthy population through analysis of dermatoglyphic status, and therefore that multiple sclerosis is a genetically predisposed disease.

  19. Cortical maturation and myelination in healthy toddlers and young children.

    PubMed

    Deoni, Sean C L; Dean, Douglas C; Remer, Justin; Dirks, Holly; O'Muircheartaigh, Jonathan

    2015-07-15

    The maturation of cortical structures, and the establishment of their connectivity, are critical neurodevelopmental processes that support and enable cognitive and behavioral functioning. Measures of cortical development, including thickness, curvature, and gyrification have been extensively studied in older children, adolescents, and adults, revealing regional associations with cognitive performance, and alterations with disease or pathology. In addition to these gross morphometric measures, increased attention has recently focused on quantifying more specific indices of cortical structure, in particular intracortical myelination, and their relationship to cognitive skills, including IQ, executive functioning, and language performance. Here we analyze the progression of cortical myelination across early childhood, from 1 to 6 years of age, in vivo for the first time. Using two quantitative imaging techniques, namely T1 relaxation time and myelin water fraction (MWF) imaging, we characterize myelination throughout the cortex, examine developmental trends, and investigate hemispheric and gender-based differences. We present a pattern of cortical myelination that broadly mirrors established histological timelines, with somatosensory, motor and visual cortices myelinating by 1 year of age; and frontal and temporal cortices exhibiting more protracted myelination. Developmental trajectories, defined by logarithmic functions (increasing for MWF, decreasing for T1), were characterized for each of 68 cortical regions. Comparisons of trajectories between hemispheres and gender revealed no significant differences. Results illustrate the ability to quantitatively map cortical myelination throughout early neurodevelopment, and may provide an important new tool for investigating typical and atypical development. Copyright © 2015. Published by Elsevier Inc.

  20. Quantitative Image Informatics for Cancer Research (QIICR) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.

  1. The application of time series models to cloud field morphology analysis

    NASA Technical Reports Server (NTRS)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and that synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.
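The core time-series idea can be illustrated in one dimension. The sketch below, a simplification of the 2-D scheme described, synthesizes an AR(1) profile and recovers its parameter from the lag-1 autocorrelation; the value φ = 0.8 and the use of AR(1) rather than a full seasonal ARMA model are assumptions made for brevity.

```python
import random

def synthesize_ar1(phi, n, seed=0):
    """Generate an AR(1) series x_t = phi * x_{t-1} + eps_t with unit-variance
    Gaussian innovations; a 1-D stand-in for a texture profile."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        series.append(x)
    return series

def lag1_autocorr(series):
    """Sample lag-1 autocorrelation; for AR(1) this estimates phi."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[i] - mean) * (series[i - 1] - mean) for i in range(1, n))
    den = sum((s - mean) ** 2 for s in series)
    return num / den

profile = synthesize_ar1(0.8, 5000)
phi_hat = lag1_autocorr(profile)  # should be close to 0.8
```

In the 2-D texture setting the analogous fitted coefficients form the "small set of parameters" from which surrogate cloud fields can be resynthesized.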

  2. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
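The basic culturomics measurement is the relative frequency of a word within each year's slice of the corpus, tracked over time. A minimal sketch, using tiny invented corpora as stand-ins for the year-binned book text:

```python
from collections import Counter

def relative_frequency(word, tokens):
    """Fraction of tokens in a corpus slice equal to the given word."""
    counts = Counter(tokens)
    return counts[word] / len(tokens)

# Invented toy corpora standing in for year-binned digitized text.
corpus_by_year = {
    1900: "the telegraph the the wire".split(),
    2000: "the internet the web internet".split(),
}
trend = {year: relative_frequency("internet", toks)
         for year, toks in corpus_by_year.items()}
```

Plotting such per-year frequencies across two centuries is what exposes trends in lexicography, technology adoption, censorship, and the other phenomena surveyed.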

  3. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  4. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; the analysis must therefore be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed, including microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Qualitative analysis of scanning electron microscope images was improved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques, and combinations of selected techniques, were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program with existing digital image processing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
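Otsu thresholding, one of the binarization steps named above, is a standard technique; as an illustration of how such a binarization can feed simple stereological counts, here is a minimal pure-Python sketch. The `crack_fraction` helper and the dark-pixels-are-cracks convention are hypothetical illustrations, not taken from the paper.

```python
def otsu_threshold(pixels):
    """Otsu's method: pick the 8-bit threshold that maximizes the
    between-class variance of the grayscale histogram."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(256):
        w0 += hist[t]          # weight of the "dark" class
        w1 = total - w0        # weight of the "bright" class
        if w0 == 0:
            continue
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def crack_fraction(pixels, threshold):
    """Area fraction of pixels at or below the threshold, treating
    dark pixels as crack (a hypothetical convention)."""
    return sum(1 for p in pixels if p <= threshold) / len(pixels)
```

Given the binary mask, parameters such as total crack length per unit area follow from counting foreground pixels after skeletonization.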

  5. Single molecule quantitation and sequencing of rare translocations using microfluidic nested digital PCR.

    PubMed

    Shuga, Joe; Zeng, Yong; Novak, Richard; Lan, Qing; Tang, Xiaojiang; Rothman, Nathaniel; Vermeulen, Roel; Li, Laiyu; Hubbard, Alan; Zhang, Luoping; Mathies, Richard A; Smith, Martyn T

    2013-09-01

Cancers are heterogeneous and genetically unstable. New methods are needed that provide the sensitivity and specificity to query single cells at the genetic loci that drive cancer progression, thereby enabling researchers to study the progression of individual tumors. Here, we report the development and application of a bead-based hemi-nested microfluidic droplet digital PCR (dPCR) technology to achieve 'quantitative' measurement and single-molecule sequencing of somatically acquired carcinogenic translocations at extremely low levels (<10⁻⁶) in healthy subjects. We use this technique in our healthy study population to determine the overall concentration of the t(14;18) translocation, which is strongly associated with follicular lymphoma. The nested dPCR approach improves the detection limit to 1×10⁻⁷ or lower while maintaining the analysis efficiency and specificity. Further, the bead-based dPCR enabled us to isolate and quantify the relative amounts of the various clonal forms of t(14;18) translocation in these subjects, and the single-molecule sensitivity and resolution of dPCR led to the discovery of new clonal forms of t(14;18) that were otherwise masked by the conventional quantitative PCR measurements. In this manner, we created a quantitative map for this carcinogenic mutation in this healthy population and identified the positions on chromosomes 14 and 18 where the vast majority of these t(14;18) events occur.
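Digital PCR quantitation in general rests on Poisson statistics over partitions; as background to concentrations like those quoted above, here is a minimal sketch of the standard Poisson correction. The function names and the partition-volume parameter are illustrative, not taken from the paper's bead-based assay.

```python
import math

def copies_per_partition(positive, total):
    """Mean template copies per partition, lambda = -ln(1 - p),
    where p is the fraction of positive partitions."""
    p = positive / total
    return -math.log(1.0 - p)

def concentration(positive, total, partition_volume_nl):
    """Copies per nanolitre, given a (hypothetical) partition volume."""
    return copies_per_partition(positive, total) / partition_volume_nl
```

For example, if 63.21% of partitions are positive, lambda is approximately 1 copy per partition, since 1 - e⁻¹ ≈ 0.6321.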

  6. Reverse Fluorescence Enhancement and Colorimetric Bimodal Signal Readout Immunochromatography Test Strip for Ultrasensitive Large-Scale Screening and Postoperative Monitoring.

    PubMed

    Yao, Yingyi; Guo, Weisheng; Zhang, Jian; Wu, Yudong; Fu, Weihua; Liu, Tingting; Wu, Xiaoli; Wang, Hanjie; Gong, Xiaoqun; Liang, Xing-Jie; Chang, Jin

    2016-09-07

Ultrasensitive, quantitative, and fast screening of cancer biomarkers by immunochromatography test strip (ICTS) remains challenging in the clinic. Gold nanoparticle (NP)-based ICTS with colorimetric readout enables quick spectrum screening but is nonquantitative; although ICTS with fluorescence readout (FICTS) allows quantitative detection, its sensitivity still leaves room for improvement. In this work, by taking advantage of both colorimetric ICTS and FICTS, we describe a reverse fluorescence enhancement ICTS (rFICTS) with bimodal signal readout for ultrasensitive and quantitative fast screening of carcinoembryonic antigen (CEA). In the presence of target, gold NP aggregation in the T line induces a colorimetric readout, allowing on-the-spot spectrum screening within 10 min by the naked eye. Meanwhile, the reverse fluorescence enhancement signal enables more accurate quantitative detection with better sensitivity (5.89 pg/mL for CEA), more than 2 orders of magnitude lower than that of conventional FICTS. The accuracy and stability of the rFICTS were investigated with more than 100 clinical serum samples for large-scale screening. Furthermore, the rFICTS also realized postoperative monitoring by detecting CEA in a patient with colon cancer and comparing the results with CT imaging diagnosis. These results indicate that the rFICTS is particularly suitable for point-of-care (POC) diagnostics in both resource-rich and resource-limited settings.

  7. Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.

    PubMed

    Hamlet, Stephen M

    2010-01-01

The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years, however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibodies with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10² (using rtPCR) to 10⁴ (using ELISA) periodontopathogens in dental plaque samples.
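Quantitation by rtPCR is typically done against a standard curve relating the quantification cycle (Cq) to the log of the starting copy number. The sketch below fits such a curve by least squares and inverts it; the data layout and function names are generic illustrations, not taken from the chapter.

```python
def fit_standard_curve(log10_copies, cq_values):
    """Least-squares fit of Cq = slope * log10(N0) + intercept
    from a serial-dilution standard series."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve to estimate the starting copy number."""
    return 10 ** ((cq - intercept) / slope)
```

An ideal assay has a slope near -3.32, i.e. one cycle per doubling (100% amplification efficiency).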

  8. Synchrotron Bragg diffraction imaging characterization of synthetic diamond crystals for optical and electronic power device applications

    PubMed Central

    Tran Thi, Thu Nhi; Morse, J.; Caliste, D.; Fernandez, B.; Eon, D.; Härtwig, J.; Mer-Calfati, C.; Tranchant, N.; Arnault, J. C.; Lafford, T. A.; Baruchel, J.

    2017-01-01

    Bragg diffraction imaging enables the quality of synthetic single-crystal diamond substrates and their overgrown, mostly doped, diamond layers to be characterized. This is very important for improving diamond-based devices produced for X-ray optics and power electronics applications. The usual first step for this characterization is white-beam X-ray diffraction topography, which is a simple and fast method to identify the extended defects (dislocations, growth sectors, boundaries, stacking faults, overall curvature etc.) within the crystal. This allows easy and quick comparison of the crystal quality of diamond plates available from various commercial suppliers. When needed, rocking curve imaging (RCI) is also employed, which is the quantitative counterpart of monochromatic Bragg diffraction imaging. RCI enables the local determination of both the effective misorientation, which results from lattice parameter variation and the local lattice tilt, and the local Bragg position. Maps derived from these parameters are used to measure the magnitude of the distortions associated with polishing damage and the depth of this damage within the volume of the crystal. For overgrown layers, these maps also reveal the distortion induced by the incorporation of impurities such as boron, or the lattice parameter variations associated with the presence of growth-incorporated nitrogen. These techniques are described, and their capabilities for studying the quality of diamond substrates and overgrown layers, and the surface damage caused by mechanical polishing, are illustrated by examples. PMID:28381981

  9. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git ), and a live demo is available at http://democlmsvault.tyerslab.com/ .

  10. Through the Looking GLASS: A JWST Exploration of Galaxy Formation and Evolution from Cosmic Dawn to Present Day

    NASA Astrophysics Data System (ADS)

    Treu, Tommaso; Abramson, L.; Bradac, M.; Brammer, G.; Fontana, A.; Henry, A.; Hoag, A.; Huang, K.; Mason, C.; Morishita, T.; Pentericci, L.; Wang, X.

    2017-11-01

We propose a carefully designed set of observations of the lensing cluster Abell 2744 to study intrinsically faint magnified galaxies from the epoch of reionization to redshift of 1, demonstrating and characterizing complementary spectroscopic modes with NIRSPEC and NIRISS. The observations are designed to address the questions: 1) when did reionization happen and what were the sources of reionizing photons? 2) How do baryons cycle in and out of galaxies? This dataset, with deep spectroscopy on the cluster and deep multiband NIRCAM imaging in parallel, will enable a wealth of investigations and will thus be of interest to a broad section of the astronomical community. The dataset will illustrate the power and challenges of: 1) combining rest frame UV and optical NIRSPEC spectroscopy for galaxies at the epoch of reionization, 2) obtaining spatially resolved emission line maps with NIRISS, 3) combining NIRISS and NIRSPEC spectroscopy. Building on our extensive experience with HST slitless spectroscopy and imaging in clusters of galaxies as part of the GLASS, WISP, SURFSUP, and ASTRODEEP projects, we will provide the following science-enabling products to the community: 1) quantitative comparison of spatially resolved (NIRISS) and spectrally resolved (NIRSPEC) spectroscopy, 2) object-based interactive exploration tools for multi-instrument datasets, 3) an interface for easy forced extraction of slitless spectra based on coordinates, 4) UV-optical spectroscopic templates of high-redshift galaxies, 5) NIRCAM parallel catalogs and a list of 26 z>=9 dropouts for spectroscopic follow-up in Cycle-2.

  11. Dynamics of zebrafish fin regeneration using a pulsed SILAC approach.

    PubMed

    Nolte, Hendrik; Hölper, Soraya; Housley, Michael P; Islam, Shariful; Piller, Tanja; Konzer, Anne; Stainier, Didier Y R; Braun, Thomas; Krüger, Marcus

    2015-02-01

The zebrafish possesses remarkable regenerative capacities, allowing regeneration of several tissues, including the heart, liver, and brain. To identify protein dynamics during fin regeneration, we used a pulsed SILAC approach that enabled us to detect the incorporation of ¹³C₆-lysine (Lys6) into newly synthesized proteins. Samples were taken at four different time points from noninjured and regrowing fins, and incorporation rates were monitored using a combination of single-shot 4-h gradients and high-resolution tandem MS. We identified more than 5000 labeled proteins during the first 3 weeks of fin regeneration and were able to monitor proteins that are responsible for initializing and restoring the shape of these appendages. The comparison of Lys6 incorporation rates between noninjured and regrowing fins enabled us to identify proteins that are directly involved in regeneration. For example, we observed increased incorporation rates of two actinodin family members at the actinotrichia, a hairlike fiber structure at the tip of regrowing fins. Moreover, we used quantitative real-time RNA measurements of several candidate genes, including osteoglycin, si:ch211-288h17.3, and prostaglandin reductase 1, to correlate mRNA expression with Lys6 incorporation data. This novel pulsed SILAC methodology in fish can be used as a versatile tool to monitor newly synthesized proteins and will help to characterize protein dynamics during regenerative processes in zebrafish beyond fin regeneration. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
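In pulsed SILAC generally, the newly synthesized fraction of a protein follows from the ratio of heavy (Lys6) to total signal, and an apparent turnover rate can be derived under a first-order model. The sketch below illustrates both steps; the intensity inputs and the first-order assumption are hypothetical, not details from this study.

```python
import math

def incorporation_fraction(heavy, light):
    """Fraction of MS signal from Lys6-labeled (newly synthesized) protein."""
    return heavy / (heavy + light)

def turnover_rate(fraction_new, hours):
    """Apparent synthesis rate constant k (per hour), assuming
    first-order turnover: fraction_new = 1 - exp(-k * t)."""
    return -math.log(1.0 - fraction_new) / hours
```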

  12. Discriminatory validity of the Aspects of Wheelchair Mobility Test as demonstrated by a comparison of four wheelchair types designed for use in low-resource areas

    PubMed Central

    Hamm, Elisa; Wee, Joy

    2017-01-01

Background Comparative effectiveness research on wheelchairs available in low-resource areas is needed to enable effective use of limited funds. Mobility on commonly encountered rolling environments is a key aspect of function. High variation in capacity among wheelchair users can mask mobility changes attributable to wheelchair design. A repeated measures protocol, in which the participants use one type of wheelchair and then another, minimises the impact of individual variation. Objectives The Aspects of Wheelchair Mobility Test (AWMT) was designed to be used in repeated measures studies in low-resource areas. It measures the impact of different wheelchair types on physical performance in commonly encountered rolling environments and provides an opportunity for qualitative and quantitative participant response. This study sought to confirm the ability of the AWMT to discern differences in mobility attributable to wheelchair design. Method Participants were wheelchair users at a boarding school for students with disabilities in a low-resource area. Each participant completed timed tests on measured tracks on rough and smooth surfaces, in tight spaces and over curbs. Four types of wheelchairs designed for use in low-resource areas were included. Results The protocol demonstrated the ability to discriminate changes in individuals' mobility attributable to wheelchair type. Conclusion Comparative effectiveness studies with this protocol can enable beneficial change, as illustrated by design alterations made by wheelchair manufacturers in response to results. PMID:28936413

  13. Advanced Elemental and Isotopic Characterization of Atmospheric Aerosols

    NASA Astrophysics Data System (ADS)

    Shafer, M. M.; Schauer, J. J.; Park, J.

    2001-12-01

Recent sampling and analytical developments advanced by the project team enable the detailed elemental and isotopic fingerprinting of extremely small masses of atmospheric aerosols. Historically, this type of characterization was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. However, with the introduction of 3rd and 4th generation ICP-MS instrumentation and the application of state-of-the-art "clean-techniques", quantitative analysis of over 40 elements in sub-milligram samples can be realized. When coupled with an efficient and validated solubilization method, ICP-MS approaches provide distinct advantages in comparison with traditional methods; greatly enhanced detection limits, improved accuracy, and isotope resolution capability, to name a few. Importantly, the ICP-MS approach can readily be integrated with techniques which enable phase differentiation and chemical speciation information to be acquired. For example, selective chemical leaching can provide data on the association of metals with major phase-components, and oxidation state of certain metals. Critical information on metal-ligand stability can be obtained when electrochemical techniques, such as adsorptive cathodic stripping voltammetry (ACSV), are applied to these same extracts. Our research group is applying these techniques in a broad range of research projects to better understand the sources and distribution of trace metals in particulate matter in the atmosphere. Using examples from our research, including recent Pb and Sr isotope ratio work on Asian aerosols, we will illustrate the capabilities and applications of these new methods.

  14. Descriptive survey of the contextual support for nursing research in 15 countries.

    PubMed

    Uys, Leana R; Newhouse, Robin P; Oweis, Arwa; Liang, Xiaokun

    2013-01-01

Global research productivity depends on the presence of contextual factors, such as a doctorally prepared faculty, graduate programmes, and publication options, that enable the conduct and publication of studies to generate knowledge to inform nursing practice. The current study aimed to develop and test an instrument that measures the level of contextual support for nursing research within a specific country, allowing comparisons between countries. After development of a 20-item survey with seven factors and 11 criteria based on a literature review, a quantitative descriptive e-mail survey design was used. Nurse researchers (N=100) from 22 countries were invited to participate. The response rate was 39% from 15 countries. Ethics approval was obtained by investigators in their country of origin. Results showed wide variation in the level of contextual support. The average total level of support across all countries was 26.8% (standard deviation [SD]=14.97). The greatest variability was in the area of availability of publishing opportunities (ranging between no suitable journals in a country to over 100). The least variability was in the area of availability of local enabling support (SD=7.22). This research showed wide differences in the level of contextual support for nursing research. The survey instrument can be utilised as a country assessment to strategically plan the building of infrastructure needed to support nursing research. Contextual support for nursing research is an antecedent of strong science. Building infrastructure for nursing science is a priority for global health.

  15. Nanoelectronics enabled chronic multimodal neural platform in a mouse ischemic model.

    PubMed

    Luan, Lan; Sullender, Colin T; Li, Xue; Zhao, Zhengtuo; Zhu, Hanlin; Wei, Xiaoling; Xie, Chong; Dunn, Andrew K

    2018-02-01

Despite significant advancements of optical imaging techniques for mapping hemodynamics in small animal models, it remains challenging to combine imaging with spatially resolved electrical recording of individual neurons, especially for longitudinal studies. This is largely due to the invasiveness of penetrating electrodes in the living brain and their limited compatibility with longitudinal imaging. We implant arrays of ultraflexible nanoelectronic threads (NETs) in mice for neural recording both at the brain surface and intracortically, which maintain excellent tissue compatibility chronically. By mounting a cranial window atop the NET arrays to allow chronic optical access, we establish a multimodal platform that combines spatially resolved electrical recording of neural activity and laser speckle contrast imaging (LSCI) of cerebral blood flow (CBF) for longitudinal studies. We induce peri-infarct depolarizations (PIDs) by targeted photothrombosis, and show the ability to detect their occurrence and propagation through spatiotemporal variations in both extracellular potentials and CBF. We also demonstrate chronic tracking of single-unit neural activity and CBF over days after photothrombosis, from which we observe reperfusion and increased firing rates. This multimodal platform enables simultaneous mapping of neural activity and hemodynamic parameters at the microscale for quantitative, longitudinal comparisons with minimal perturbation to the baseline neurophysiology. The ability to spatiotemporally resolve and chronically track CBF and neural electrical activity in the same living brain region has broad applications for studying the interplay between neural and hemodynamic responses in health and in cerebrovascular and neurological pathologies. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography.

    PubMed

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar

    2009-08-25

Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant.
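The agreement statistics reported above (correlation r, and bias as mean +/- SD of paired differences) are standard; a generic sketch, not the authors' code, with hypothetical paired EF inputs:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired EF estimates."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs)
                    * sum((y - my) ** 2 for y in ys))
    return num / den

def bias_and_sd(xs, ys):
    """Mean and sample SD of paired differences (Bland-Altman bias)."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, sd
```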

  17. Development of an exposure measurement database on five lung carcinogens (ExpoSYN) for quantitative retrospective occupational exposure assessment.

    PubMed

    Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2012-01-01

    SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.
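A country-, job-, and time-period-specific quantitative JEM of the kind described can be sketched as grouping measurements into cells and summarizing each cell, for example by geometric mean. The record layout and the geometric-mean summary below are illustrative assumptions, not the SYNERGY methodology.

```python
import math
from collections import defaultdict

def build_jem(measurements):
    """measurements: iterable of (country, job, period, exposure_value).
    Returns {(country, job, period): geometric mean exposure} -- a
    minimal sketch of one cell-summary choice for a quantitative JEM."""
    cells = defaultdict(list)
    for country, job, period, value in measurements:
        cells[(country, job, period)].append(value)
    return {cell: math.exp(sum(math.log(v) for v in vs) / len(vs))
            for cell, vs in cells.items()}
```

The geometric mean is a common summary for occupational exposure data, which tend to be approximately log-normally distributed.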

  18. Optofluidic time-stretch quantitative phase microscopy.

    PubMed

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy, an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses, using the dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves continuous acquisition of both intensity and phase images at a high throughput of more than 10,000 particles or cells per second, overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by these capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  20. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    PubMed

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  1. Management Ratios 1. For Colleges & Universities.

    ERIC Educational Resources Information Center

    Minter, John, Ed.

    Ratios that enable colleges and universities to select other institutions for comparison are presented. The ratios and underlying data also enable colleges to rank order institutions and to calculate means, quartiles, and ranges for these groups. The data are based on FY 1983 U.S. Department of Education Statistics. The ratios summarize the…

  2. The Comparison of Students' Satisfaction between Ubiquitous and Web-Based Learning Environments

    ERIC Educational Resources Information Center

    Virtanen, Mari Aulikki; Kääriäinen, Maria; Liikanen, Eeva; Haavisto, Elina

    2017-01-01

    Higher education is moving towards digitalized learning. The rapid development of technological resources, devices and wireless networks enables more flexible opportunities to study and learn in innovative learning environments. New technologies enable combining of authentic and virtual learning spaces and digital resources as multifunctional…

  3. Urea Biosynthesis Using Liver Slices

    ERIC Educational Resources Information Center

    Teal, A. R.

    1976-01-01

    Presented is a practical scheme to enable introductory biology students to investigate the mechanism by which urea is synthesized in the liver. The tissue-slice technique is discussed, and methods for the quantitative analysis of metabolites are presented. (Author/SL)

  4. Enhanced culvert inspections - best practices guidebook : final report.

    DOT National Transportation Integrated Search

    2017-06-01

    Culvert inspection is a key enabler that allows MnDOT to manage the states highway culvert system. When quantitative detail on culvert condition is required, an inspector will need to use enhanced inspection technologies. Enhanced inspection techn...

  5. Potential use of combining the diffusion equation with the free Schrödinger equation to improve Optical Coherence Tomography image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.

    2006-03-01

    Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolution than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system, obtained by combining the diffusion equation with the free Schrödinger equation. In this formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach performs well in speckle noise removal, enhancement, and segmentation of the various cellular layers of the retina using the OCT system.
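    Combining the diffusion equation with the free Schrödinger equation amounts to diffusion with a complex-valued coefficient: with coefficient exp(iθ), θ = 0 gives the heat equation and θ = π/2 the free Schrödinger equation. A minimal explicit finite-difference sketch of one such filtering step (not the authors' actual implementation; step size and angle are illustrative):

```python
import numpy as np

def complex_diffusion_step(u, dt=0.1, theta=np.pi / 30):
    """One explicit step of linear complex diffusion u_t = c * laplacian(u),
    with c = exp(i*theta). Edge-replicate (zero-flux) boundaries, so the
    total image intensity is conserved."""
    c = np.exp(1j * theta)
    p = np.pad(u, 1, mode="edge")
    lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * u)
    return u + dt * c * lap

# Diffuse a unit impulse for a few steps: the peak spreads out and decays.
img = np.zeros((32, 32), dtype=complex)
img[16, 16] = 1.0
out = img
for _ in range(10):
    out = complex_diffusion_step(out)
```

    In the complex-diffusion literature the imaginary part of the filtered image behaves like a smoothed edge detector, which is one way "extending the analysis to the complex domain" can aid segmentation.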

  6. A real-time monitoring platform of myogenesis regulators using double fluorescent labeling

    PubMed Central

    Sapoznik, Etai; Niu, Guoguang; Zhou, Yu; Prim, Peter M.; Criswell, Tracy L.

    2018-01-01

    Real-time, quantitative measurement of muscle progenitor cell (myoblast) differentiation is an important tool for skeletal muscle research and for identifying drugs that support skeletal muscle regeneration. While most quantitative tools rely on a sacrificial approach, we developed a double fluorescent tagging approach that allows dynamic monitoring of myoblast differentiation through assessment of fusion index and nuclei count. Fluorescent tagging of both the cell cytoplasm and nucleus enables monitoring of cell fusion and the formation of new myotube fibers, similar to immunostaining results. This labeling approach allowed monitoring of the effects of Myf5 overexpression, TNFα, and a Wnt agonist on myoblast differentiation. It also enabled testing of the effects of surface coating on the fusion levels of scaffold-seeded myoblasts. Double fluorescent labeling of myoblasts is a promising technique for visualizing even minor changes in myoblast myogenesis in support of applications such as tissue engineering and drug screening. PMID:29444187

  7. Let's push things forward: disruptive technologies and the mechanics of tissue assembly.

    PubMed

    Varner, Victor D; Nelson, Celeste M

    2013-09-01

    Although many of the molecular mechanisms that regulate tissue assembly in the embryo have been delineated, the physical forces that couple these mechanisms to actual changes in tissue form remain unclear. Qualitative studies suggest that mechanical loads play a regulatory role in development, but clear quantitative evidence has been lacking. This is partly owing to the complex nature of these problems - embryonic tissues typically undergo large deformations and exhibit evolving, highly viscoelastic material properties. Still, despite these challenges, new disruptive technologies are enabling study of the mechanics of tissue assembly in unprecedented detail. Here, we present novel experimental techniques that enable the study of each component of these physical problems: kinematics, forces, and constitutive properties. Specifically, we detail advances in light sheet microscopy, optical coherence tomography, traction force microscopy, fluorescence force spectroscopy, microrheology and micropatterning. Taken together, these technologies are helping elucidate a more quantitative understanding of the mechanics of tissue assembly.

  8. Broadband quantitative NQR for authentication of vitamins and dietary supplements

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Zhang, Fengchao; Bhunia, Swarup; Mandal, Soumyajit

    2017-05-01

    We describe hardware, pulse sequences, and algorithms for nuclear quadrupole resonance (NQR) spectroscopy of medicines and dietary supplements. Medicine and food safety is a pressing problem that has drawn increasing attention. NQR is an ideal technique for authenticating these substances because it is a non-invasive method for chemical identification. We have recently developed a broadband NQR front-end that can excite and detect ¹⁴N NQR signals over a wide frequency range; its operating frequency can be rapidly set in software, while its sensitivity is comparable to conventional narrowband front-ends over the entire range. This front-end improves the accuracy of authentication by enabling multiple-frequency experiments. We have also developed calibration and signal processing techniques to convert measured NQR signal amplitudes into nuclear spin densities, thus enabling its use as a quantitative technique. Experimental results from several samples are used to illustrate the proposed methods.

  9. Numerical Investigation of Vertical Plunging Jet Using a Hybrid Multifluid–VOF Multiphase CFD Solver

    DOE PAGES

    Shonibare, Olabanji Y.; Wardle, Kent E.

    2015-06-28

    A novel hybrid multiphase flow solver has been used to conduct simulations of a vertical plunging liquid jet. This solver combines a multifluid methodology with selective interface sharpening to enable simulation of both the initial jet impingement and the long-time entrained bubble plume phenomena. Models are implemented for variable bubble size capturing and dynamic switching of interface-sharpened regions to capture transitions from the initially fully segregated flow types into the dispersed bubbly flow regime. It was found that the solver was able to capture the salient features of the flow phenomena under study, and areas for quantitative improvement have been explored and identified. In particular, a population balance approach is employed, and detailed calibration of the underlying models with experimental data is required to enable quantitative prediction of bubble size and distribution and to capture the transition between segregated and dispersed flow types with greater fidelity.

  10. Let's push things forward: disruptive technologies and the mechanics of tissue assembly

    PubMed Central

    Varner, Victor D.; Nelson, Celeste M.

    2013-01-01

    Although many of the molecular mechanisms that regulate tissue assembly in the embryo have been delineated, the physical forces that couple these mechanisms to actual changes in tissue form remain unclear. Qualitative studies suggest that mechanical loads play a regulatory role in development, but clear quantitative evidence has been lacking. This is partly owing to the complex nature of these problems – embryonic tissues typically undergo large deformations and exhibit evolving, highly viscoelastic material properties. Still, despite these challenges, new disruptive technologies are enabling study of the mechanics of tissue assembly in unprecedented detail. Here, we present novel experimental techniques that enable the study of each component of these physical problems: kinematics, forces, and constitutive properties. Specifically, we detail advances in light sheet microscopy, optical coherence tomography, traction force microscopy, fluorescence force spectroscopy, microrheology and micropatterning. Taken together, these technologies are helping elucidate a more quantitative understanding of the mechanics of tissue assembly. PMID:23907401

  11. A toolbox to explore the mechanics of living embryonic tissues

    PubMed Central

    Campàs, Otger

    2016-01-01

    The sculpting of embryonic tissues and organs into their functional morphologies involves the spatial and temporal regulation of mechanics at cell and tissue scales. Decades of in vitro work, complemented by some in vivo studies, have shown the relevance of mechanical cues in the control of cell behaviors that are central to developmental processes, but the lack of methodologies enabling precise, quantitative measurements of mechanical cues in vivo have hindered our understanding of the role of mechanics in embryonic development. Several methodologies are starting to enable quantitative studies of mechanics in vivo and in situ, opening new avenues to explore how mechanics contributes to shaping embryonic tissues and how it affects cell behavior within developing embryos. Here we review the present methodologies to study the role of mechanics in living embryonic tissues, considering their strengths and drawbacks as well as the conditions in which they are most suitable. PMID:27061360

  12. A toolbox to explore the mechanics of living embryonic tissues.

    PubMed

    Campàs, Otger

    2016-07-01

    The sculpting of embryonic tissues and organs into their functional morphologies involves the spatial and temporal regulation of mechanics at cell and tissue scales. Decades of in vitro work, complemented by some in vivo studies, have shown the relevance of mechanical cues in the control of cell behaviors that are central to developmental processes, but the lack of methodologies enabling precise, quantitative measurements of mechanical cues in vivo have hindered our understanding of the role of mechanics in embryonic development. Several methodologies are starting to enable quantitative studies of mechanics in vivo and in situ, opening new avenues to explore how mechanics contributes to shaping embryonic tissues and how it affects cell behavior within developing embryos. Here we review the present methodologies to study the role of mechanics in living embryonic tissues, considering their strengths and drawbacks as well as the conditions in which they are most suitable. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure CBF, whereas additional hypercapnic pCASL measurements currently show great promise for quantitatively assessing CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of its exact accuracy and precision compared to the gold standard. ¹⁵O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is also one of the more invasive methods. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O H₂O PET, with comparable precision. These results pave the way for quantitative use of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
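    CVR from paired baseline/hypercapnia CBF measurements is commonly expressed as the percent CBF change per mmHg of CO₂ increase; the paper's exact formulation may differ. A sketch with illustrative gray-matter values:

```python
def cvr_percent(cbf_baseline, cbf_hypercapnia, delta_paco2_mmHg):
    """Cerebrovascular reactivity as percent CBF change per mmHg CO2 increase.
    A common definition; illustrative only, not necessarily the study's exact one."""
    return 100.0 * (cbf_hypercapnia - cbf_baseline) / (cbf_baseline * delta_paco2_mmHg)

# Illustrative gray-matter CBF values (ml/100g/min) and a 10 mmHg CO2 challenge
cvr = cvr_percent(60.0, 78.0, 10.0)   # 30% CBF increase over 10 mmHg -> 3.0 %/mmHg
```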

  14. Calibration of BK Virus Nucleic Acid Amplification Testing to the 1st WHO International Standard for BK Virus

    PubMed Central

    Tan, Susanna K.; Milligan, Stephen; Sahoo, Malaya K.; Taylor, Nathaniel

    2017-01-01

    Significant interassay variability in the quantification of BK virus (BKV) DNA precludes establishing broadly applicable thresholds for the management of BKV infection in transplantation. The 1st WHO International Standard for BKV (primary standard) was introduced in 2016 as a common calibrator for improving the harmonization of BKV nucleic acid amplification testing (NAAT) and enabling comparisons of biological measurements worldwide. Here, we evaluated the Altona RealStar BKV assay (Altona) and calibrated the results to the international unit (IU) using the Exact Diagnostics BKV verification panel, a secondary standard traceable to the primary standard. The primary and secondary standards on Altona had nearly identical linear regression equations (primary standard, Y = 1.05X − 0.28, R² = 0.99; secondary standard, Y = 1.04X − 0.26, R² = 0.99) and conversion factors (primary standard, 1.11 IU/copy; secondary standard, 1.09 IU/copy). A comparison of Altona with a laboratory-developed BKV NAAT assay in IU/ml versus copies/ml using Passing-Bablok regression revealed similar regression lines, no proportional bias, and improvement in the systematic bias (95% confidence interval of intercepts: copies/ml, −0.52 to −1.01; IU/ml, 0.07 to −0.36). Additionally, Bland-Altman analyses revealed a clinically significant reduction of bias when results were reported in IU/ml (IU/ml, −0.10 log₁₀; copies/ml, −0.70 log₁₀). These results indicate that the use of a common calibrator improved the agreement between the two assays. As clinical laboratories worldwide use calibrators traceable to the primary standard to harmonize BKV NAAT results, we anticipate improved interassay comparisons with a potential for establishing broadly applicable quantitative BKV DNA load cutoffs for clinical practice. PMID:28053213
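    The copies-to-IU conversion (the abstract reports ≈1.11 IU/copy for Altona against the primary standard) and the Bland-Altman log₁₀ bias can be sketched as follows; the sample viral loads are illustrative, not the study's data:

```python
import math

CONV_IU_PER_COPY = 1.11   # conversion factor reported for Altona vs. the primary standard

def copies_to_iu(copies_per_ml, factor=CONV_IU_PER_COPY):
    """Convert a copies/ml viral load to IU/ml via the assay's conversion factor."""
    return copies_per_ml * factor

def bland_altman_log_bias(assay_a, assay_b):
    """Mean log10 difference (systematic bias) between paired viral-load measurements."""
    diffs = [math.log10(a) - math.log10(b) for a, b in zip(assay_a, assay_b)]
    return sum(diffs) / len(diffs)

a = [1.0e4, 5.0e4, 2.0e5]           # assay A results, IU/ml (illustrative)
b = [2.0e4, 1.0e5, 4.0e5]           # assay B results, IU/ml (2x higher throughout)
bias = bland_altman_log_bias(a, b)  # log10(0.5), about -0.30 log10
```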

  15. Outcomes assessment in rotator cuff pathology: what are we measuring?

    PubMed

    Makhni, Eric C; Steinhaus, Michael E; Morrow, Zachary S; Jobin, Charles M; Verma, Nikhil N; Cole, Brian J; Bach, Bernard R

    2015-12-01

    Assessments used to measure outcomes associated with rotator cuff pathology and after repair are varied. This lack of standardization leads to difficulty drawing comparisons across studies. We hypothesize that this variability in patient-reported outcome measures and objective metrics used in rotator cuff studies persists even in high-impact, peer-reviewed journals. All studies assessing rotator cuff tear and repair outcomes in 6 orthopedic journals with a high impact factor from January 2010 to December 2014 were reviewed. Cadaveric and animal studies and those without outcomes were excluded. Outcome measures included range of motion (forward elevation, abduction, external rotation, and internal rotation), strength (in the same 4 planes), tendon integrity imaging, patient satisfaction, and functional assessment scores. Of the 156 included studies, 63% documented range of motion measurements, with 18% reporting range of motion in all 4 planes. Only 38% of studies reported quantitative strength measurements. In 65% of studies, tendon integrity was documented with imaging (38% magnetic resonance imaging/magnetic resonance arthrogram, 31% ultrasound, and 8% computed tomography arthrogram). Finally, functional score reporting varied significantly, with the 5 most frequently reported scores each appearing in 16% to 61% of studies, and the 15 least reported outcomes each appearing in ≤6% of studies. Significant variability exists in outcomes reporting after rotator cuff tear and repair, making comparisons between clinical studies difficult. Creating a uniformly accepted, validated outcomes tool that assesses pain, function, patient satisfaction, and anatomic integrity would enable consistent outcomes assessment after operative and nonoperative management and allow comparisons across the literature. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  16. Applicability of Unmanned Aerial Vehicles in Research on Aeolian Processes

    NASA Astrophysics Data System (ADS)

    Algimantas, Česnulevičius; Artūras, Bautrėnas; Linas, Bevainis; Donatas, Ovodas; Kęstutis, Papšys

    2018-02-01

    Surface dynamics and instabilities are characteristic of aeolian formations. Surface comparison is regarded as the most appropriate method for evaluating the intensity of aeolian processes and the amount of transported sand. The data for surface comparison can be collected by topographic survey measurements or using unmanned aerial vehicles. The time cost of fixing and measuring relief microforms with a topographic survey is very high. Fixing aerial photographs from an unmanned aircraft also encounters difficulties, because there are no stable, clear objects or contours that would allow the photographs to be linked, the boundaries of the captured territory to be determined, and the accuracy of surface measurements to be ensured. Creating stationary anchor points is irrational due to intense sand accumulation and deflation in different climate seasons. In September 2015 and April 2016 a combined methodology was applied to evaluate the intensity of aeolian processes in the Curonian Spit. Temporary marks were installed on the surface, their coordinates were fixed using GPS, and then the unmanned aircraft flight was conducted. The fixed mark coordinates ensure the accuracy of measurements from the aerial imagery and the ability to calculate possible corrections. This method was used to track and measure very small (micro-rank) relief forms (5-10 cm in height and 10-20 cm in length). Using this method, morphometric indicators of micro-terraces caused by sand-dune pressure on the gyttja layer were measured in a non-contact way. An additional advantage of the method is the ability to accurately link repeated measurements. The comparison of 3D terrain models showed sand deflation and accumulation areas and quantitative changes in the terrain very clearly.
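    The surface-comparison step, differencing two gridded terrain models of the same area to separate deflation from accumulation, can be sketched as follows; the grid and cell size are illustrative:

```python
import numpy as np

def surface_change_volumes(dem_before, dem_after, cell_area_m2):
    """Volumes of sand accumulation (+) and deflation (-) between two gridded
    surface models of the same area. Illustrative sketch, not the authors' code."""
    dz = dem_after - dem_before                      # elevation change per cell, m
    accumulation = dz[dz > 0].sum() * cell_area_m2   # m^3 gained
    deflation = -dz[dz < 0].sum() * cell_area_m2     # m^3 lost
    return accumulation, deflation

before = np.zeros((10, 10))
after = np.zeros((10, 10))
after[:5, :] += 0.10     # 10 cm of accumulation on half the grid
after[5:, :] -= 0.05     # 5 cm of deflation on the other half
acc, defl = surface_change_volumes(before, after, cell_area_m2=0.25)
# 50 cells * 0.10 m * 0.25 m^2 = 1.25 m^3 gained; 50 * 0.05 * 0.25 = 0.625 m^3 lost
```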

  17. Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers

    NASA Astrophysics Data System (ADS)

    Kowalski, Benjamin Andrew

    Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.

  18. Single Laboratory Comparison of Quantitative Real-Time PCR Assays for the Detection of Human Fecal Pollution - Poster

    EPA Science Inventory

    There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method p...

  19. Exciting New Images | Lunar Reconnaissance Orbiter Camera

    Science.gov Websites

    … slowly and relentlessly reshapes the Moon's topography. Comparative study of the shapes of lunar craters … how can a quantitative comparison be derived? And how can we quantify and compare the topography of a large number of … quantitative characterization of impact crater topography (Mahanti, P. et al., 2014, Icarus v. 241).

  20. A Comparison of Learning Cultures in Different Sizes and Types

    ERIC Educational Resources Information Center

    Brown, Paula D.; Finch, Kim S.; MacGregor, Cynthia

    2012-01-01

    This study compared relevant data and information about leadership and learning cultures in different sizes and types of high schools. Research was conducted using a quantitative design with a qualitative element. Quantitative data were gathered using a researcher-created survey. Independent sample t-tests were conducted to analyze the means of…

  1. Does Pre-Service Preparation Matter? Examining an Old Question in New Ways

    ERIC Educational Resources Information Center

    Ronfeldt, Matthew

    2014-01-01

    Background: Over the past decade, most of the quantitative studies on teacher preparation have focused on comparisons between alternative and traditional routes. There has been relatively little quantitative research on specific features of teacher education that might cause certain pathways into teaching to be more effective than others. The vast…

  2. Detection limits and cost comparisons of human- and gull-associated conventional and quantitative PCR assays in artificial and environmental waters

    EPA Science Inventory

    Modern techniques for tracking fecal pollution in environmental waters require investing in DNA-based methods to determine the presence of specific fecal sources. To help water quality managers decide whether to employ routine polymerase chain reaction (PCR) or quantitative PC...

  3. Comparison of quantitative PCR assays for Escherichia coli targeting ribosomal RNA and single copy genes

    EPA Science Inventory

    Aims: Compare specificity and sensitivity of quantitative PCR (qPCR) assays targeting single and multi-copy gene regions of Escherichia coli. Methods and Results: A previously reported assay targeting the uidA gene (uidA405) was used as the basis for comparing the taxono...

  4. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  5. KEY COMPARISON: CCQM-K61: Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA

    NASA Astrophysics Data System (ADS)

    Woolford, Alison; Holden, Marcia; Salit, Marc; Burns, Malcolm; Ellison, Stephen L. R.

    2009-01-01

    Key comparison CCQM-K61 was performed to demonstrate and document the capability of interested national metrology institutes in the determination of the quantity of specific DNA target in an aqueous solution. The study provides support for the following measurement claim: "Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA". The comparison was an activity of the Bioanalysis Working Group (BAWG) of the Comité Consultatif pour la Quantité de Matière and was coordinated by NIST (Gaithersburg, USA) and LGC (Teddington, UK). The following laboratories (in alphabetical order) participated in this key comparison: DMSC (Thailand); IRMM (European Union); KRISS (Republic of Korea); LGC (UK); NIM (China); NIST (USA); NMIA (Australia); NMIJ (Japan); VNIIM (Russian Federation). Good agreement was observed between the reported results of all nine participants. Uncertainty estimates did not account fully for the dispersion of results, even after allowance for possible inhomogeneity in calibration materials. Preliminary studies suggest that the effects of fluorescence threshold setting might contribute to the excess dispersion, and further study of this topic is suggested. Main text: to reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
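    In threshold-based qPCR, the fluorescence threshold determines the quantification cycle (Cq), which maps to copy number through a log-linear standard curve; the slope of that curve also fixes the implied amplification efficiency, which is why threshold setting can shift reported quantities. A generic sketch with illustrative curve parameters (not CCQM-K61 values):

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """Copy number from a quantification cycle via a log-linear standard curve:
    Cq = slope * log10(copies) + intercept. A slope of about -3.32 corresponds
    to ~100% amplification efficiency. Curve parameters are illustrative."""
    return 10 ** ((cq - intercept) / slope)

def efficiency(slope=-3.32):
    """Amplification efficiency implied by the standard-curve slope."""
    return 10 ** (-1.0 / slope) - 1.0

n = copies_from_cq(31.36)     # (31.36 - 38) / -3.32 = 2.0 -> 100 copies
eff = efficiency(-3.3219)     # ~1.0, i.e. ~100% efficiency
```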

  6. Quantitative comparison of tumor delivery for multiple targeted nanoparticles simultaneously by multiplex ICP-MS.

    PubMed

    Elias, Andrew; Crayton, Samuel H; Warden-Rothman, Robert; Tsourkas, Andrew

    2014-07-28

    Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often make it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of only the single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples.
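    The multiplex readout reduces to per-lanthanide arithmetic: ICP-MS counts for each dopant are converted to mass via a calibration standard and normalized to the injected dose of that formulation. A hypothetical sketch (element choices, calibration values, and doses are all illustrative, not from the paper):

```python
def percent_injected_dose(sample_counts, calib_counts_per_ng, injected_ng):
    """Percent of injected dose recovered in a tissue sample for each
    lanthanide-tagged nanoparticle formulation, from ICP-MS counts."""
    return {
        ln: 100.0 * (sample_counts[ln] / calib_counts_per_ng[ln]) / injected_ng[ln]
        for ln in sample_counts
    }

counts = {"Eu": 5.0e5, "Ho": 2.0e5}   # tumor-sample ICP-MS counts per dopant
calib = {"Eu": 1.0e4, "Ho": 1.0e4}    # counts per ng, from calibration standards
dose = {"Eu": 1000.0, "Ho": 1000.0}   # ng of each dopant injected
uptake = percent_injected_dose(counts, calib, dose)
# Eu-tagged formulation: 50 ng recovered of 1000 ng injected -> 5.0 %ID
# Ho-tagged formulation: 20 ng recovered of 1000 ng injected -> 2.0 %ID
```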

  7. Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.

    PubMed

    Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J

    2018-04-03

    The low limits of detection afforded by second harmonic generation (SHG) microscopy, coupled with image analysis algorithms, enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here, a study of accelerated stability testing of ritonavir, a human immunodeficiency virus (HIV) protease inhibitor, has been conducted. Under accelerated stability testing conditions of 50 °C/75% RH and 40 °C/75% RH, ritonavir crystallization kinetics in amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled with image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, about 2 orders of magnitude lower than other methods currently available for detecting crystallinity in ASDs. The four-decade dynamic range of SHG microscopy enabled quantitative modeling with an established (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
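    The JMAK (Avrami) model named in the abstract expresses the crystallized fraction as X(t) = 1 − exp(−(kt)ⁿ); in practice the rate constant and exponent would be fit to SHG-derived crystallinity data. A sketch with illustrative parameters (not fitted to the paper's data):

```python
import math

def jmak_fraction(t, k, n):
    """JMAK (Avrami) crystallized fraction: X(t) = 1 - exp(-(k*t)**n).
    Starts at 0, rises sigmoidally, and saturates at 1."""
    return 1.0 - math.exp(-((k * t) ** n))

k, n = 0.01, 3.0                       # rate constant (1/h) and Avrami exponent, illustrative
x_early = jmak_fraction(10.0, k, n)    # (0.1)^3 = 1e-3 in the exponent -> ~0.001
x_late = jmak_fraction(300.0, k, n)    # (3)^3 = 27 in the exponent -> ~1.0
```

    A four-decade detection range matters here because the early-time tail (X ≪ 1) is where nucleation and growth rates are most cleanly separable in the fit.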

  8. Automated reagent-dispensing system for microfluidic cell biology assays.

    PubMed

    Ly, Jimmy; Masterman-Smith, Michael; Ramakrishnan, Ravichandran; Sun, Jing; Kokubun, Brent; van Dam, R Michael

    2013-12-01

    Microscale systems that enable measurements of oncological phenomena at the single-cell level have a great capacity to improve therapeutic strategies and diagnostics. Such measurements can reveal unprecedented insights into cellular heterogeneity and its implications into the progression and treatment of complicated cellular disease processes such as those found in cancer. We describe a novel fluid-delivery platform to interface with low-cost microfluidic chips containing arrays of microchambers. Using multiple pairs of needles to aspirate and dispense reagents, the platform enables automated coating of chambers, loading of cells, and treatment with growth media or other agents (e.g., drugs, fixatives, membrane permeabilizers, washes, stains, etc.). The chips can be quantitatively assayed using standard fluorescence-based immunocytochemistry, microscopy, and image analysis tools, to determine, for example, drug response based on differences in protein expression and/or activation of cellular targets on an individual-cell level. In general, automation of fluid and cell handling increases repeatability, eliminates human error, and enables increased throughput, especially for sophisticated, multistep assays such as multiparameter quantitative immunocytochemistry. We report the design of the automated platform and compare several aspects of its performance to manually-loaded microfluidic chips.

  9. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control of an automated source-finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing with human reasoning, creativity, and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics to enable collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use with 3-D astronomical data.

  10. LCSH and PRECIS in Music: A Comparison.

    ERIC Educational Resources Information Center

    Gabbard, Paula Beversdorf

    1985-01-01

    By studying examples of their applications by two major English language bibliographic agencies, this article compares strengths and weaknesses of PRECIS and Library of Congress Subject Headings for books about music. Highlights include quantitative and qualitative analysis, comparison of number of subject statements, and terminology problems in…

  11. Application of the shifted excitation Raman difference spectroscopy (SERDS) to the analysis of trace amounts of methanol in red wines

    NASA Astrophysics Data System (ADS)

    Volodin, Boris; Dolgy, Sergei; Ban, Vladimir S.; Gracin, Davor; Juraić, Krunoslav; Gracin, Leo

    2014-03-01

    Shifted Excitation Raman Difference Spectroscopy (SERDS) has proven an effective method for performing Raman analysis of fluorescent samples. The technique achieves excellent signal-to-noise performance with shorter excitation wavelengths, thus taking full advantage of the superior signal strength afforded by shorter excitation wavelengths and of the superior performance, combined with lower cost, delivered by silicon CCDs. The technique is enabled by the use of two closely spaced fixed-wavelength laser diode sources stabilized with volume Bragg gratings (VBGs). A side-by-side comparison reveals that SERDS delivers a superior signal-to-noise ratio and better detection limits in most situations, even when a longer excitation wavelength is employed to eliminate the fluorescence. We have applied SERDS to the quantitative analysis of trace amounts of methanol in red wines, an important task in quality-control operations within the wine industry that is currently difficult to perform in the field. So far, conventional Raman spectroscopy analysis of red wines has been impractical due to the high degree of fluorescence.
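    The SERDS principle can be illustrated with toy data (hypothetical numbers, not the instrument's actual processing): the broad fluorescence background is identical in both spectra, while the Raman peaks follow the shifted excitation, so a point-wise subtraction cancels the background and leaves a derivative-like Raman signature.

```python
# Minimal sketch of the SERDS idea with hypothetical data: two spectra taken
# with slightly shifted excitation wavelengths share the same fluorescence
# background, but the Raman peak moves with the excitation.

def serds_difference(spectrum_1, spectrum_2):
    """Point-wise difference of two equally sampled spectra."""
    return [a - b for a, b in zip(spectrum_1, spectrum_2)]

# Toy example: flat fluorescence background of 100 counts, one Raman peak
# that appears at channel 5 for the first laser and channel 6 for the second.
background = [100.0] * 10
s1 = [b + (50.0 if i == 5 else 0.0) for i, b in enumerate(background)]
s2 = [b + (50.0 if i == 6 else 0.0) for i, b in enumerate(background)]

diff = serds_difference(s1, s2)
# The background cancels exactly; only the shifted Raman peak survives
# as a +50/-50 doublet around channels 5 and 6.
```

In practice the difference spectrum is then reconstructed into a conventional Raman spectrum, but the background rejection shown here is the core of the method.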

  12. Value Production in a Collaborative Environment. Sociophysical Studies of Wikipedia

    NASA Astrophysics Data System (ADS)

    Yasseri, Taha; Kertész, János

    2013-05-01

    We review some recent endeavors and add some new results to characterize and understand underlying mechanisms in Wikipedia (WP), the paradigmatic example of collaborative value production. We analyzed the statistics of editorial activity in different languages and observed typical circadian and weekly patterns, which enabled us to estimate the geographical origins of contributions to WPs in languages spoken in several time zones. Using a recently introduced measure we showed that the editorial activities have intrinsic dependencies in the burstiness of events. A comparison of the English and Simple English WPs revealed important aspects of language complexity and showed how peer cooperation solved the task of enhancing readability. One of our focus issues was characterizing the conflicts or edit wars in WPs, which helped us to automatically filter out controversial pages. When studying the temporal evolution of the controversiality of such pages we identified typical patterns and classified conflicts accordingly. Our quantitative analysis provides the basis of modeling conflicts and their resolution in collaborative environments and contributes to the understanding of this issue, which becomes increasingly important with the development of information communication technology.
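    As a toy sketch of how a dominant timezone offset might be read off hourly editing activity (an illustrative method, not the estimator used in the study): circularly shift a 24-bin UTC activity histogram until it best matches a reference diurnal template.

```python
# Hypothetical sketch: estimate the dominant timezone offset of contributors
# by circularly shifting a 24-bin hourly edit-count histogram (UTC) to best
# match a reference diurnal activity template.

def best_offset(activity, template):
    """Return the circular shift (in hours) maximizing the overlap between
    the observed activity histogram and the diurnal template."""
    def score(shift):
        return sum(activity[(h + shift) % 24] * template[h] for h in range(24))
    return max(range(24), key=score)

# Hypothetical template: high activity between 08:00 and 22:00 local time.
template = [1.0 if 8 <= h <= 22 else 0.1 for h in range(24)]
# Observed activity: the same pattern rotated by 5 hours (contributors five
# time zones away from UTC).
activity = [template[(h - 5) % 24] for h in range(24)]

offset = best_offset(activity, template)  # recovers the 5-hour rotation
```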

  13. Mapping eQTL Networks with Mixed Graphical Markov Models

    PubMed Central

    Tur, Inma; Roverato, Alberto; Castelo, Robert

    2014-01-01

    Expression quantitative trait loci (eQTL) mapping constitutes a challenging problem due to, among other reasons, the high-dimensional multivariate nature of gene-expression traits. Next to the expression heterogeneity produced by confounding factors and other sources of unwanted variation, indirect effects spread throughout genes as a result of genetic, molecular, and environmental perturbations. From a multivariate perspective one would like to adjust for the effect of all of these factors to end up with a network of direct associations connecting the path from genotype to phenotype. In this article we approach this challenge with mixed graphical Markov models, higher-order conditional independences, and q-order correlation graphs. These models show that additive genetic effects propagate through the network as a function of gene–gene correlations. Our estimation of the eQTL network underlying a well-studied yeast data set leads to a sparse structure with more direct genetic and regulatory associations that enable a straightforward comparison of the genetic control of gene expression across chromosomes. Interestingly, it also reveals that eQTLs explain most of the expression variability of network hub genes. PMID:25271303
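    The article's q-order correlation graphs remove indirect associations by conditioning on other variables. The idea can be illustrated with a first-order partial correlation (a standard textbook formula, shown here as a generic sketch rather than the authors' implementation):

```python
import math

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y given z: the association
    that remains after removing the linear effect of z from both."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Toy mediation: z drives both x and y, with mutually orthogonal noise
# patterns, so x and y are strongly correlated only through z.
z = [0.0, 1.0, 2.0, 3.0]
x = [zi + 0.5 * ai for zi, ai in zip(z, [1, -1, -1, 1])]
y = [zi + 0.1 * bi for zi, bi in zip(z, [-1, 3, -3, 1])]
# pearson(x, y) is large, but partial_corr(x, y, z) vanishes: conditioning
# on z removes the indirect association, which is how edges are pruned in
# correlation-graph approaches.
```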

  14. A Method for Identification and Analysis of Non-Overlapping Myeloid Immunophenotypes in Humans

    PubMed Central

    Gustafson, Michael P.; Lin, Yi; Maas, Mary L.; Van Keulen, Virginia P.; Johnston, Patrick B.; Peikert, Tobias; Gastineau, Dennis A.; Dietz, Allan B.

    2015-01-01

    The development of flow cytometric biomarkers in human studies and clinical trials has been slowed by inconsistent sample processing, use of cell surface markers, and reporting of immunophenotypes. Additionally, the function(s) of distinct cell types as biomarkers cannot be accurately defined without the proper identification of homogeneous populations. As such, we developed a method for the identification and analysis of human leukocyte populations by the use of eight 10-color flow cytometric protocols in combination with novel software analyses. This method utilizes un-manipulated biological sample preparation that allows for the direct quantitation of leukocytes and non-overlapping immunophenotypes. We specifically designed myeloid protocols that enable us to define distinct phenotypes that include mature monocytes, granulocytes, circulating dendritic cells, immature myeloid cells, and myeloid derived suppressor cells (MDSCs). We also identified CD123 as an additional distinguishing marker for the phenotypic characterization of immature LIN-CD33+HLA-DR- MDSCs. Our approach permits the comprehensive analysis of all peripheral blood leukocytes and yields data that are highly amenable to standardization across inter-laboratory comparisons for human studies. PMID:25799053

  15. In vivo imaging of human oral hard and soft tissues by polarization-sensitive optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Walther, Julia; Golde, Jonas; Kirsten, Lars; Tetschke, Florian; Hempel, Franz; Rosenauer, Tobias; Hannig, Christian; Koch, Edmund

    2017-12-01

    Since optical coherence tomography (OCT) provides three-dimensional high-resolution images of biological tissue, the benefit of polarization contrast in the field of dentistry is highlighted in this study. Polarization-sensitive OCT (PS OCT) with phase-sensitive recording is used for imaging dental and mucosal tissues in the human oral cavity in vivo. An enhanced polarization contrast of oral structures is reached by analyzing the signals of the co- and crosspolarized channels of the swept source PS OCT system quantitatively with respect to reflectivity, retardation, optic axis orientation, and depolarization. The calculation of these polarization parameters enables high tissue-specific contrast imaging for the detailed physical interpretation of human oral hard and soft tissues. As a proof of principle, imaging of composite restorations and mineralization defects at premolars, as well as of gingival, lingual, and labial oral mucosa, was performed in vivo within the anterior oral cavity. The contrast-enhanced results achieved for the investigated human oral tissues by means of polarization-sensitive imaging are evaluated by comparison with conventional intensity-based OCT.
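    As a rough sketch of how two of the polarization parameters can be derived per pixel from the amplitudes of the co- and cross-polarized channels (commonly used PS OCT relations, assumed here rather than taken from this system's processing; optic axis orientation additionally requires the phase difference between the channels):

```python
import math

def ps_oct_params(a_co, a_cross):
    """Per-pixel reflectivity and retardation from the detected amplitudes
    of the co- and cross-polarized channels (common PS-OCT relations,
    assumed here as an illustration)."""
    reflectivity = a_co ** 2 + a_cross ** 2          # total backscattered power
    retardation = math.atan2(a_cross, a_co)          # in [0, pi/2] for a >= 0
    return reflectivity, retardation

# A birefringent layer that couples half of the light into the cross channel
# shows a retardation of 45 degrees (pi/4).
r, delta = ps_oct_params(1.0, 1.0)
```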

  16. Real-time probabilistic covariance tracking with efficient model update.

    PubMed

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile for a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but within a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges, including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
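    The descriptor and its Riemannian similarity measure can be sketched for the 2x2 case with the standard affine-invariant metric based on generalized eigenvalues (a generic formulation; the tracker's exact implementation may differ):

```python
import math

def cov_descriptor(features):
    """Covariance descriptor of a region: `features` is a list of equal-length
    feature vectors (e.g. [x, y, intensity] per pixel). The sample covariance
    matrix fuses the features and captures their correlations."""
    n, d = len(features), len(features[0])
    mean = [sum(f[k] for f in features) / n for k in range(d)]
    return [[sum((f[i] - mean[i]) * (f[j] - mean[j]) for f in features) / (n - 1)
             for j in range(d)] for i in range(d)]

def riemannian_distance_2x2(c1, c2):
    """Affine-invariant Riemannian distance between 2x2 SPD matrices:
    sqrt(sum_i ln^2(lambda_i)), where lambda_i are the generalized
    eigenvalues of det(c1 - lambda * c2) = 0 (a quadratic for 2x2)."""
    a1, b1, d1 = c1[0][0], c1[0][1], c1[1][1]
    a2, b2, d2 = c2[0][0], c2[0][1], c2[1][1]
    qa = a2 * d2 - b2 * b2
    qb = -(a1 * d2 + a2 * d1 - 2.0 * b1 * b2)
    qc = a1 * d1 - b1 * b1
    disc = math.sqrt(qb * qb - 4.0 * qa * qc)
    lam1 = (-qb + disc) / (2.0 * qa)
    lam2 = (-qb - disc) / (2.0 * qa)
    return math.sqrt(math.log(lam1) ** 2 + math.log(lam2) ** 2)

# Toy 2-D features (four corner pixels): uncorrelated coordinates.
cov = cov_descriptor([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])

# Distance between identity and diag(e^2, 1): eigenvalues e^2 and 1,
# so the distance is sqrt(2^2 + 0^2) = 2.
dist = riemannian_distance_2x2([[math.exp(2.0), 0.0], [0.0, 1.0]],
                               [[1.0, 0.0], [0.0, 1.0]])
```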

  17. An entirely automated method to score DSS-induced colitis in mice by digital image analysis of pathology slides

    PubMed Central

    Kozlowski, Cleopatra; Jeet, Surinder; Beyer, Joseph; Guerrero, Steve; Lesch, Justin; Wang, Xiaoting; DeVoss, Jason; Diehl, Lauri

    2013-01-01

    The DSS (dextran sulfate sodium) model of colitis is a mouse model of inflammatory bowel disease. Microscopic symptoms include loss of crypt cells from the gut lining and infiltration of inflammatory cells into the colon. An experienced pathologist requires several hours per study to score histological changes in selected regions of the mouse gut. In order to increase the efficiency of scoring, Definiens Developer software was used to devise an entirely automated method to quantify histological changes in the whole H&E slide. When the algorithm was applied to slides from historical drug-discovery studies, automated scores classified 88% of drug candidates in the same way as pathologists’ scores. In addition, another automated image analysis method was developed to quantify colon-infiltrating macrophages, neutrophils, B cells and T cells in immunohistochemical stains of serial sections of the H&E slides. The timing of neutrophil and macrophage infiltration had the highest correlation to pathological changes, whereas T and B cell infiltration occurred later. Thus, automated image analysis enables quantitative comparisons between tissue morphology changes and cell-infiltration dynamics. PMID:23580198

  18. Relevance and limitations of crowding, fractal, and polymer models to describe nuclear architecture.

    PubMed

    Huet, Sébastien; Lavelle, Christophe; Ranchon, Hubert; Carrivain, Pascal; Victor, Jean-Marc; Bancaud, Aurélien

    2014-01-01

    Chromosome architecture plays an essential role in all nuclear functions, and its physical description has attracted considerable interest over the last few years among the biophysics community. This research at the frontiers of physics and biology has been stimulated by the demand for quantitative analysis of molecular biology experiments, which provide comprehensive data on chromosome folding, and of live-cell imaging experiments that enable researchers to visualize selected chromosome loci in living or fixed cells. In this review, our goal is to survey several non-mutually exclusive models that have emerged to describe the folding of DNA in the nucleus, the dynamics of proteins in the nucleoplasm, or the movements of chromosome loci. We focus on three classes of models, namely molecular crowding, fractal, and polymer models, draw comparisons, and discuss their merits and limitations in the context of chromosome structure and dynamics, or nuclear protein navigation in the nucleoplasm. Finally, we identify future challenges on the roadmap to a unified model of the nuclear environment. © 2014 Elsevier Inc. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josefsson, Gabriella; Gamstedt, E. Kristofer; Ahvenainen, Patrik

    The mechanical performance of materials reinforced by cellulose nanofibrils is highly affected by the orientation of these fibrils. This paper investigates the nanofibril orientation distribution of films of partly oriented cellulose nanofibrils. Stripes of hydrogel films were subjected to different amounts of strain and, after drying, examined with X-ray diffraction to obtain the orientation of the nanofibrils in the films caused by the stretching. The cellulose nanofibrils initially had a random in-plane orientation in the hydrogel films, and the strain was applied to the films before the nanofibrils bond tightly together, which occurs during drying. The stretching resulted in a reorientation of the nanofibrils in the films, with monotonically increasing orientation towards the load direction with increasing strain. Estimation of nanofibril reorientation by X-ray diffraction enables quantitative comparison of the stretch-induced orientation ability of different cellulose nanofibril systems. The reorientation of nanofibrils as a consequence of an applied strain is also predicted by a geometrical model of deformation of nanofibril hydrogels. Conversely, in high-strain cold-drawing of wet cellulose nanofibril materials, the enhanced orientation is promoted by slipping of the effectively stiff fibrils.
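    A common way to reduce an azimuthal X-ray diffraction intensity profile to a scalar orientation measure is the Hermans orientation factor; the sketch below is a generic illustration (with the usual sin-weighted fiber average), not necessarily the exact estimator used in the paper:

```python
import math

def hermans_orientation(phi, intensity):
    """Hermans orientation factor f = (3<cos^2 phi> - 1)/2 from an azimuthal
    XRD intensity profile I(phi), with phi in radians measured from the
    reference (load) axis and the conventional sin(phi) weighting of the
    fiber-symmetry average. f = 1: perfect alignment; f = 0: random."""
    num = sum(i * math.cos(p) ** 2 * math.sin(p) for p, i in zip(phi, intensity))
    den = sum(i * math.sin(p) for p, i in zip(phi, intensity))
    return (3.0 * num / den - 1.0) / 2.0

# Discretized azimuth over (0, pi/2) at bin midpoints.
n = 900
phi = [(k + 0.5) * (math.pi / 2) / n for k in range(n)]

# Isotropic ring (constant intensity) gives f ~ 0; intensity concentrated
# at phi ~ 0 (fibrils along the load axis) gives f ~ 1.
f_iso = hermans_orientation(phi, [1.0] * n)
f_aligned = hermans_orientation(phi, [1.0 if k == 0 else 0.0 for k in range(n)])
```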

  20. Refocusing distance of a standard plenoptic camera.

    PubMed

    Hahne, Christopher; Aggoun, Amar; Velisavljevic, Vladan; Fiebig, Susanne; Pesch, Matthias

    2016-09-19

    Recent developments in computational photography have enabled variation of the optical focus of a plenoptic camera after image exposure, also known as refocusing. Existing ray models in the field simplify the camera's complexity for the purpose of image and depth map enhancement, but fail to satisfyingly predict the distance to which a photograph is refocused. By treating a pair of light rays as a system of linear functions, it will be shown in this paper that its solution yields an intersection indicating the distance to a refocused object plane. Experimental work is conducted with different lenses and focus settings while comparing distance estimates with a stack of refocused photographs for which a blur metric has been devised. Quantitative assessments over a 24 m distance range suggest that predictions deviate by less than 0.35% from an optical design software reference. The proposed refocusing estimator assists in predicting object distances as early as the prototyping stage of plenoptic cameras and will be an essential feature in applications demanding high precision in synthetic focus or where depth map recovery is done by analyzing a stack of refocused photographs.
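    The core idea, treating a pair of light rays as a system of linear functions whose solution is their intersection, can be sketched with a hypothetical parameterization (lateral position as a linear function of depth; the paper's actual ray model is more detailed):

```python
def refocus_distance(x1, m1, x2, m2):
    """Intersection of two rays x(z) = x_i + m_i * z (hypothetical
    parameterization: lateral position x as a linear function of depth z).
    Solving x1 + m1*z = x2 + m2*z gives the depth of the object plane
    where the pair of rays converges."""
    if m1 == m2:
        raise ValueError("parallel rays never intersect")
    z = (x2 - x1) / (m1 - m2)
    return z, x1 + m1 * z

# Two rays leaving the camera at different lateral positions converge at
# depth z = 1.0 in this toy example.
z, x = refocus_distance(0.0, 1.0, 2.0, -1.0)
```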

  1. Transcript and protein environmental biomarkers in fish--a review.

    PubMed

    Tom, Moshe; Auslander, Meirav

    2005-04-01

    The levels of contaminant-affected gene products (transcripts and proteins) are increasingly utilized as environmental biomarkers, and their appropriate implementation as diagnostic tools is discussed. The required characteristics of a gene product biomarker are accurate evaluation using properly normalized absolute units, aiming at long-term comparability of biomarker levels over a wide geographical range and among many laboratories. Quantitative RT-PCR and competitive ELISA are suggested as preferred evaluation methods for transcript and protein, respectively. Constitutively expressed RNAs or proteins which are part of the examined homogenate are suggested as normalizing agents, compensating for variable processing efficiency. Essential characterization of expression patterns is suggested, providing reference values to be compared to the monitored levels. This comparison would enable estimation of the intensity of biological effects of contaminants. Contaminant-independent reference expression patterns should include natural fluctuations of the biomarker level. Contaminant-dependent patterns should include dose response to model contaminants chronically administered in two environmentally realistic routes, reaching extreme sub-lethal affected levels. Recent studies using fish as environmental sentinel species, applying gene products as environmental biomarkers, and implementing at least part of the depicted methodologies are reviewed.
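    Normalization against a constitutively expressed reference can be illustrated with the standard delta-Ct relation for quantitative RT-PCR (assuming ideal doubling per amplification cycle; the review argues for properly normalized absolute units, so this is purely illustrative):

```python
def normalized_expression(ct_target, ct_reference):
    """Delta-Ct normalization of qRT-PCR data against a constitutively
    expressed reference gene, assuming an ideal amplification efficiency
    of 2 per cycle: expression = 2**-(Ct_target - Ct_reference)."""
    return 2.0 ** -(ct_target - ct_reference)

# A biomarker transcript crossing the detection threshold 4 cycles after
# the reference gene is 2**4 = 16-fold less abundant.
level = normalized_expression(24.0, 20.0)
```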

  2. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

    PubMed Central

    Wu, Tai-luan; Tseng, Ling-li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
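    The sorting, comparison, elimination and aggregation steps can be sketched as simple set operations over normalized record identifiers (an illustrative reimplementation, not the authors' program; the system names and identifiers below are placeholders):

```python
def coverage_stats(systems):
    """systems: mapping of system name -> set of record identifiers
    (e.g. normalized titles or DOIs). Returns per-system completeness
    (share of the pooled union each system covers) and pairwise
    Jaccard overlap between systems."""
    union = set().union(*systems.values())
    completeness = {name: len(recs) / len(union)
                    for name, recs in systems.items()}
    overlap = {}
    names = sorted(systems)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            overlap[(a, b)] = (len(systems[a] & systems[b])
                               / len(systems[a] | systems[b]))
    return completeness, overlap

# Hypothetical record sets for two systems sharing two publications.
comp, ov = coverage_stats({"arXiv": {"p1", "p2", "p3"},
                           "ADS": {"p2", "p3", "p4"}})
```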

  3. Segmentation of the Aortic Valve Apparatus in 3D Echocardiographic Images: Deformable Modeling of a Branching Medial Structure

    PubMed Central

    Pouch, Alison M.; Tian, Sijie; Takabe, Manabu; Wang, Hongzhi; Yuan, Jiefu; Cheung, Albert T.; Jackson, Benjamin M.; Gorman, Joseph H.; Gorman, Robert C.; Yushkevich, Paul A.

    2015-01-01

    3D echocardiographic (3DE) imaging is a useful tool for assessing the complex geometry of the aortic valve apparatus. Segmentation of this structure in 3DE images is a challenging task that benefits from shape-guided deformable modeling methods, which enable inter-subject statistical shape comparison. Prior work demonstrates the efficacy of using continuous medial representation (cm-rep) as a shape descriptor for valve leaflets. However, its application to the entire aortic valve apparatus is limited since the structure has a branching medial geometry that cannot be explicitly parameterized in the original cm-rep framework. In this work, we show that the aortic valve apparatus can be accurately segmented using a new branching medial modeling paradigm. The segmentation method achieves a mean boundary displacement of 0.6 ± 0.1 mm (approximately one voxel) relative to manual segmentation on 11 3DE images of normal open aortic valves. This study demonstrates a promising approach for quantitative 3DE analysis of aortic valve morphology. PMID:26247062

  4. LivePhantom: Retrieving Virtual World Light Data to Real Environments.

    PubMed

    Kolivand, Hoshang; Billinghurst, Mark; Sunar, Mohd Shahrizal

    2016-01-01

    To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real-time depth detection to exert virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map of the physical scene, which was mixed into a single real-time transparent tacit surface. Once this is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene, enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is shown, and the findings are assessed drawing upon qualitative and quantitative methods, making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.

  5. Tribological Properties of AlSi12-Al₂O₃ Interpenetrating Composite Layers in Comparison with Unreinforced Matrix Alloy.

    PubMed

    Dolata, Anna Janina

    2017-09-06

    Alumina-Aluminum composites with interpenetrating network structures are a new class of advanced materials with potentially better properties than composites reinforced by particles or fibers. Local casting reinforcement was proposed to address the machinability problems of this type of material and the shaping of finished products. Composite castings in the form of locally reinforced shafts were fabricated by centrifugal infiltration. The main objective of the research presented in this work was to compare the tribological properties (friction coefficient, wear resistance) of AlSi12/Al₂O₃ interpenetrating composite layers with unreinforced AlSi12 matrix areas. Profilometric tests enabled both quantitative and qualitative analyses of the wear trace that formed on the investigated surfaces. It has been shown that the interpenetrating composite layers are characterized by lower and more stable coefficients of friction (μ), as well as higher wear resistance, than the unreinforced matrix areas. At the present stage, the study confirmed that the tribological properties of the composite layers depend on the spatial structure of the ceramic reinforcement, primarily the volume and size of the alumina foam cells.

  7. Chemical Bonding and Structural Information of Black Carbon Reference Materials and Individual Carbonaceous Atmospheric Aerosols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, Rebecca J.; Tivanski, Alexei V.; Marten, Bryan D.

    2007-04-25

    The carbon-to-oxygen ratios and graphitic nature of a range of black carbon standard reference materials (BC SRMs), high molecular mass humic-like substances (HULIS) and atmospheric particles are examined using scanning transmission X-ray microscopy (STXM) coupled with near edge X-ray absorption fine structure (NEXAFS) spectroscopy. Using STXM/NEXAFS, individual particles with diameter >100 nm are studied; thus the diversity of atmospheric particles collected during a variety of field missions is assessed. Applying a semi-quantitative peak fitting method to the NEXAFS spectra enables a comparison of BC SRMs and HULIS to particles originating from anthropogenic combustion and biomass burns, thus allowing determination of the suitability of these materials for representing atmospheric particles. Anthropogenic combustion and biomass burn particles can be distinguished from one another using both chemical bonding and structural ordering information. While anthropogenic combustion particles are characterized by a high proportion of aromatic-C and the presence of benzoquinone and are highly structurally ordered, biomass burn particles exhibit lower structural ordering, a smaller proportion of aromatic-C, and a much higher proportion of oxygenated functional groups.

  8. Phase imaging microscopy for the diagnostics of plasma-cell interaction

    NASA Astrophysics Data System (ADS)

    Ohene, Yolanda; Marinov, Ilya; de Laulanié, Lucie; Dupuy, Corinne; Wattelier, Benoit; Starikovskaia, Svetlana

    2015-06-01

    Phase images of biological specimens were obtained by the method of Quadriwave Lateral Shearing Interferometry (QWLSI). The QWLSI technique produces high-resolution phase images of cells exposed to a plasma treatment and enables quantitative analysis of the changes in the surface area of the cells over time. Morphological changes in HTori normal thyroid cells were demonstrated using this method. Cell behaviour was compared between control cells, cells treated by the plasma of a nanosecond dielectric barrier discharge (including cells pre-treated with catalase), and cells treated with an equivalent amount of H2O2. Major changes in cell membrane morphology were observed as early as 5 min after the plasma treatment. A primary role of reactive oxygen species (ROS) in this degradation is suggested. Deformation and condensation of the cell nucleus were observed 2-3 h after the treatment and are presumably related to apoptosis induction. Coupling phase QWLSI with immunofluorescence imaging would give deeper insight into the mechanisms of plasma-induced cell death.

  9. Structural dynamics of surfaces by ultrafast electron crystallography: experimental and multiple scattering theory.

    PubMed

    Schäfer, Sascha; Liang, Wenxi; Zewail, Ahmed H

    2011-12-07

    Recent studies in ultrafast electron crystallography (UEC) using a reflection diffraction geometry have enabled the investigation of a wide range of phenomena on the femtosecond and picosecond time scales. In all these studies, the analysis of the diffraction patterns and their temporal change after excitation was performed within the kinematical scattering theory. In this contribution, we address the question of to what extent dynamical scattering effects have to be included in order to obtain quantitative information about structural dynamics. We discuss different scattering regimes and provide diffraction maps that describe all essential features of scattering and observables. The effects are quantified by dynamical scattering simulations and examined by direct comparison to the results of ultrafast electron diffraction experiments on an in situ prepared Ni(100) surface, for which structural dynamics can be well described by a two-temperature model. We also report calculations for graphite surfaces. The theoretical framework provided here allows for further UEC studies of surfaces, especially at larger penetration depths and for those of heavy-atom materials. © 2011 American Institute of Physics.

  10. Influence of the conservative rotor loads on the near wake of a wind turbine

    NASA Astrophysics Data System (ADS)

    Herráez, I.; Micallef, D.; van Kuik, G. A. M.

    2017-05-01

    The presence of conservative forces on rotor blades is neglected in the blade element theory and in all the numerical methods derived from it (e.g. the blade element momentum theory and the actuator line technique). This might seem a reasonable simplification of the real flow over rotor blades, since conservative loads, by definition, do not contribute to the power conversion. However, conservative loads originating from the chordwise bound vorticity might affect the tip vortex trajectory, as we discussed in a previous work. In that work we also hypothesized that this effect, in turn, could influence the wake induction and correspondingly the rotor performance. In the current work we extend a standard actuator line model to account for the conservative loads at the blade tip. This makes it possible to isolate the influence of conservative forces from other effects. The comparison of numerical results with and without conservative loads qualitatively confirms their relevance for the near wake and the rotor performance. However, an accurate quantitative assessment of the effect remains out of reach due to the inherent uncertainty of the numerical model.

  11. Comparison of Quantitative Antifungal Testing Methods for Textile Fabrics.

    PubMed

    Imoto, Yasuo; Seino, Satoshi; Nakagawa, Takashi; Yamamoto, Takao A

    2017-01-01

    Quantitative antifungal testing methods for textile fabrics under growth-supportive conditions were studied. Fungal growth activities on unfinished textile fabrics and textile fabrics modified with Ag nanoparticles were investigated using the colony counting method and the luminescence method. Morphological changes of the fungi during incubation were investigated by microscopic observation. Comparison of the results indicated that the fungal growth activity values obtained with the colony counting method depended on the morphological state of the fungi on the textile fabrics, whereas those obtained with the luminescence method did not. Our findings indicate that the unique characteristics of each testing method must be taken into account for the proper evaluation of antifungal activity.
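    A quantitative antifungal activity value can be expressed as a log10 growth ratio between the unfinished control and the treated fabric, in the spirit of standardized textile tests (a hedged sketch; the exact formula used in the study may differ):

```python
import math

def antifungal_activity(control_count, treated_count):
    """Antifungal activity as the log10 reduction of fungal growth on the
    treated fabric relative to the unfinished control. Counts may come
    from colony counting (CFU) or from luminescence (ATP) measurements;
    larger values indicate a stronger antifungal effect."""
    return math.log10(control_count) - math.log10(treated_count)

# A drop from 1e6 to 1e3 viable units corresponds to a 3-log reduction.
activity = antifungal_activity(1e6, 1e3)
```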

  12. Actinide bioimaging in tissues: Comparison of emulsion and solid track autoradiography techniques with the iQID camera

    PubMed Central

    Miller, Brian W.; Van der Meeren, Anne; Tazrart, Anissa; Angulo, Jaime F.; Griffiths, Nina M.

    2017-01-01

    This work presents a comparison of three autoradiography techniques for imaging biological samples contaminated with actinides: emulsion-based and plastic-based autoradiography, and a quantitative digital technique, the iQID camera, based on the numerical analysis of light from a scintillator screen. In radiation toxicology it has been important to develop means of imaging actinide distribution in tissues, as these radionuclides may be heterogeneously distributed within and between tissues after internal contamination. Actinide distribution determines which cells are exposed to alpha radiation and is thus potentially critical for assessing absorbed dose. The comparison was carried out by generating autoradiographs of the same biological samples contaminated with actinides with the three autoradiography techniques. These samples were cell preparations or tissue sections collected from animals contaminated with different physico-chemical forms of actinides. The autoradiograph characteristics and the performances of the techniques were evaluated and discussed mainly in terms of acquisition process, activity distribution patterns, spatial resolution, and feasibility of activity quantification. The obtained autoradiographs presented similar actinide distributions at low magnification. Of the three techniques, emulsion autoradiography is the only one to provide a highly resolved image of the actinide distribution inherently superimposed on the biological sample, and it is hence best interpreted at higher magnifications. However, this technique is destructive for the biological sample. Both emulsion- and plastic-based autoradiography record alpha tracks and thus enabled differentiation between ionized forms of actinides and oxide particles. This feature can help in the evaluation of decorporation therapy efficacy. The most recent technique, the iQID camera, offers several additional features: real-time imaging, separate imaging of alpha particles and gamma rays, and alpha activity quantification. The comparison of these three autoradiography techniques showed that they are complementary, and the choice of technique depends on the purpose of the imaging experiment. PMID:29023595

  13. Ratiometric pH Imaging with a CoII2 MRI Probe via CEST Effects of Opposing pH Dependences (Postprint)

    DTIC Science & Technology

    2017-10-13

    We report a CoII2-based magnetic resonance (MR) probe that enables the ratiometric quantitation and imaging of...ratios of CEST peak intensities at 104 and 64 ppm are correlated with solution pH in the physiological range 6.5−7.6 to construct a linear calibration... Keywords: magnetic resonance (MR); ratiometric quantitation; chemical exchange saturation transfer (CEST); carboxamide; hydroxyl-substituted bisphosphonate

  14. Applications of pathology-assisted image analysis of immunohistochemistry-based biomarkers in oncology.

    PubMed

    Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D

    2014-01-01

    Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.

  15. Comparison of different approaches to quantitative adenovirus detection in stool specimens of hematopoietic stem cell transplant recipients.

    PubMed

    Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T

    2016-12-01

    Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. We compared a manual DNA extraction kit combined with a validated in-house PCR assay against automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations ranging from 10^2 to 10^11 copies/g, as well as 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients, were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested, and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10^2-10^3 adenovirus copies/g stool, with the automated extraction offering somewhat higher sensitivity. The dynamic range of accurate quantitative analysis by both systems investigated was between 10^3 and 10^8 virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.
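The system-to-system comparison above reduces to absolute log10 differences between paired viral-load measurements. A minimal sketch of that computation; the paired copies/g values are hypothetical illustrations, not data from the study:

```python
import math
from statistics import median

def log_differences(loads_a, loads_b):
    """Absolute log10 differences between paired viral-load measurements (copies/g)."""
    return [abs(math.log10(a) - math.log10(b)) for a, b in zip(loads_a, loads_b)]

# Hypothetical paired results from two extraction/PCR systems (copies/g)
system_1 = [1.0e4, 3.2e5, 7.9e6]
system_2 = [2.0e4, 1.6e5, 7.9e6]

diffs = log_differences(system_1, system_2)
print(round(median(diffs), 2))  # median log10 difference across the panel
```

A two-fold discrepancy in copies/g corresponds to a log10 difference of about 0.3, which puts the reported median of 0.5 logs in perspective.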

  16. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating environmental quality (using MAC and MAL values) does not permit a complete and objective assessment of working conditions in the construction industry, owing to multiple confounding elements. A solution to this complex problem, which requires analysis of various correlated elements of the system "human--work conditions--environment", may be aided by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the situation as a whole and to reveal the points at risk.

  17. Reappraising Accretion to Vesta and the Angrite Parent Body Through Mineral-Scale Platinum Group Element and Os-Isotope Analyses

    NASA Astrophysics Data System (ADS)

    Riches, A. J. V.; Burton, K. W.; Nowell, G. M.; Dale, C. W.; Ottley, C. J.

    2016-08-01

    New methods presented here enable quantitative determination of mineral-scale PGE-abundances and Os-isotope compositions in meteorite materials thereby providing valuable new insight into planetary evolution.

  18. Resistance of a Wire as a Function of Temperature.

    ERIC Educational Resources Information Center

    Henry, David

    1995-01-01

    Presents a simple experiment that enables students to get a quantitative measure of the relationship between the resistance of a wire and the temperature of the wire allowing the calculation of the temperature coefficient of resistance. (JRH)

  19. Single-pair fluorescence resonance energy transfer analysis of mRNA transcripts for highly sensitive gene expression profiling in near real time.

    PubMed

    Peng, Zhiyong; Young, Brandon; Baird, Alison E; Soper, Steven A

    2013-08-20

    Expression analysis of mRNAs transcribed from certain genes can be used as important sources of biomarkers for in vitro diagnostics. While the use of reverse transcription quantitative PCR (RT-qPCR) can provide excellent analytical sensitivity for monitoring transcript numbers, more sensitive approaches for expression analysis that can report results in near real-time are needed for many critical applications. We report a novel assay that can provide exquisite limits-of-quantitation and consists of reverse transcription (RT) followed by a ligase detection reaction (LDR) with single-pair fluorescence resonance energy transfer (spFRET) to provide digital readout through molecular counting. For this assay, no PCR was employed, which enabled short assay turnaround times. To facilitate implementation of the assay, a cyclic olefin copolymer (COC) microchip, which was fabricated using hot embossing, was employed to carry out the LDR in a continuous flow format with online single-molecule detection following the LDR. As demonstrators of the assay's utility, MMP-7 mRNA was expression profiled from several colorectal cancer cell lines. It was found that the RT-LDR/spFRET assay produced highly linear calibration plots even in the low copy number regime. Comparison to RT-qPCR indicated a better linearity over the low copy number range investigated (10-10,000 copies) with an R² = 0.9995 for RT-LDR/spFRET and R² = 0.98 for RT-qPCR. In addition, differentiating between copy numbers of 10 and 50 could be performed with higher confidence using RT-LDR/spFRET. To demonstrate the short assay turnaround times obtainable using the RT-LDR/spFRET assay, a two-thermal-cycle LDR was carried out on amphiphysin gene transcripts that can serve as important diagnostic markers for ischemic stroke.
The ability to supply diagnostic information on possible stroke events in short turnaround times using RT-LDR/spFRET will enable clinicians to treat patients effectively with appropriate time-sensitive therapeutics.
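The linearity comparison above rests on the coefficient of determination of a calibration fit over log-transformed copy numbers. A minimal sketch of that R² computation; the calibration points are invented for illustration, not the study's measurements:

```python
import math

def r_squared(x, y):
    """Coefficient of determination for a simple least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical calibration: log10(input copies) vs log10(counted molecules)
copies = [10, 100, 1000, 10000]
counts = [9, 102, 980, 10100]
x = [math.log10(c) for c in copies]
y = [math.log10(c) for c in counts]
print(round(r_squared(x, y), 4))  # close to 1 for a highly linear calibration
```

An R² near 0.9995, as reported for RT-LDR/spFRET, means the residual scatter around the calibration line is a tiny fraction of the total variance, which is what allows 10 and 50 copies to be distinguished with confidence.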

  20. Design and assessment of a novel SPECT system for desktop open-gantry imaging of small animals: A simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeraatkar, Navid; Farahani, Mohammad Hossein; Rahmim, Arman

    Purpose: Given increasing efforts in biomedical research utilizing molecular imaging methods, development of dedicated high-performance small-animal SPECT systems has been growing rapidly in the last decade. In the present work, we propose and assess an alternative concept for SPECT imaging enabling desktop open-gantry imaging of small animals. Methods: The system, PERSPECT, consists of an imaging desk, with a set of tilted detector and pinhole collimator placed beneath it. The object to be imaged is simply placed on the desk. Monte Carlo (MC) and analytical simulations were utilized to accurately model and evaluate the proposed concept and design. Furthermore, a dedicated image reconstruction algorithm, finite-aperture-based circular projections (FABCP), was developed and validated for the system, enabling more accurate modeling of the system and higher quality reconstructed images. Image quality was quantified as a function of different tilt angles in the acquisition and number of iterations in the reconstruction algorithm. Furthermore, more complex phantoms including Derenzo, Defrise, and mouse whole body were simulated and studied. Results: The sensitivity of the PERSPECT was 207 cps/MBq. It was quantitatively demonstrated that for a tilt angle of 30°, comparable image qualities were obtained in terms of normalized squared error, contrast, uniformity, noise, and spatial resolution measurements, the latter at ∼0.6 mm. Furthermore, quantitative analyses demonstrated that 3 iterations of FABCP image reconstruction (16 subsets/iteration) led to optimally reconstructed images. Conclusions: The PERSPECT, using a novel imaging protocol, can achieve comparable image quality performance in comparison with a conventional pinhole SPECT with the same configuration.
The dedicated FABCP algorithm, which was developed for reconstruction of data from the PERSPECT system, can produce high quality images for small-animal imaging via accurate modeling of the system as incorporated in the forward- and back-projection steps. Meanwhile, the developed MC model and the analytical simulator of the system can be applied for further studies on development and evaluation of the system.

  1. Evaluation of a large scale implementation of disease management programmes in various Dutch regions: a study protocol

    PubMed Central

    2011-01-01

    Background Disease management programmes (DMPs) have been developed to improve effectiveness and economic efficiency within chronic care delivery by combining patient-related, professional-directed, and organisational interventions. The benefits of DMPs within different settings, patient groups, and versions remain unclear. In this article we propose a protocol to evaluate a range of current DMPs by capturing them in a single conceptual framework, employing comparable structure, process, and outcome measures, and combining qualitative and quantitative research methods. Methods To assess DMP effectiveness a practical clinical trial will be conducted. Twenty-two disease management experiments will be studied in various Dutch regions consisting of a variety of collaborations between organisations and/or professionals. Patient cohorts include those with cardiovascular diseases, chronic obstructive pulmonary disease, diabetes, stroke, depression, psychotic diseases, and eating disorders. Our methodological approach combines qualitative and quantitative research methods to enable a comprehensive evaluation of complex programmes. Process indicators will be collected from health care providers' data registries and measured via physician and staff questionnaires. Patient questionnaires include health care experiences, health care utilisation, and quality of life. Qualitative data will be gathered by means of interviews and document analysis for an in-depth description of project interventions and the contexts in which DMPs are embedded, and an ethnographic process evaluation in five DMPs. Such a design will provide insight into ongoing DMPs and demonstrate which elements of the intervention are potentially (cost)-effective for which patient populations. It will also enable sound comparison of the results of the different programmes.
Discussion The study will lead to a better understanding of (1) the mechanisms of disease management, (2) the feasibility, and cost-effectiveness of a disease management approach to improving health care, and (3) the factors that determine success and failure of DMPs. Our study results will be relevant to decision makers and managers who confront the challenge of implementing and integrating DMPs into the health care system. Moreover, it will contribute to the search for methods to evaluate complex healthcare interventions. PMID:21219620

  2. Single-Pair Fret Analysis of mRNA Transcripts for Highly Sensitive Gene Expression Profiling in Near Real Time

    PubMed Central

    Peng, Zhiyong; Young, Brandon; Baird, Alison E.; Soper, Steven A.

    2013-01-01

    Expression analysis of mRNAs transcribed from certain genes can be used as important sources of biomarkers for in vitro diagnostics. While the use of reverse transcription quantitative PCR (RT-qPCR) can provide excellent analytical sensitivity for monitoring transcript numbers, more sensitive approaches for expression analysis that can report results in near real-time are needed for many critical applications. We report a novel assay that can provide exquisite limits-of-quantitation and consists of reverse transcription (RT) followed by a ligase detection reaction (LDR) with single-pair fluorescence resonance energy transfer (spFRET) to provide digital readout through molecular counting. For this assay, no PCR was employed, which enabled short assay turnaround times. To facilitate implementation of the assay, a cyclic olefin copolymer (COC) microchip, which was fabricated using hot embossing, was employed to carry out the LDR in a continuous flow format with on-line single-molecule detection following the LDR. As demonstrators of the assay's utility, MMP-7 mRNA was expression profiled from several colorectal cancer cell lines. It was found that the RT-LDR/spFRET assay produced highly linear calibration plots even in the low copy number regime. Comparison to RT-qPCR indicated a better linearity over the low copy number range investigated (10-10,000 copies) with an R² = 0.9995 for RT-LDR/spFRET and R² = 0.98 for RT-qPCR. In addition, differentiating between copy numbers of 10 and 50 could be performed with higher confidence using RT-LDR/spFRET. To demonstrate the short assay turnaround times obtainable using the RT-LDR/spFRET assay, a two-thermal-cycle LDR was carried out on amphiphysin gene transcripts that can serve as important diagnostic markers for ischemic stroke.
The ability to supply diagnostic information on possible stroke events in short turnaround times using RT-LDR/spFRET will enable clinicians to treat patients effectively with appropriate time-sensitive therapeutics. PMID:23869556

  3. Evaluation of a large scale implementation of disease management programmes in various Dutch regions: a study protocol.

    PubMed

    Lemmens, Karin M M; Rutten-Van Mölken, Maureen P M H; Cramm, Jane M; Huijsman, Robbert; Bal, Roland A; Nieboer, Anna P

    2011-01-10

    Disease management programmes (DMPs) have been developed to improve effectiveness and economic efficiency within chronic care delivery by combining patient-related, professional-directed, and organisational interventions. The benefits of DMPs within different settings, patient groups, and versions remain unclear. In this article we propose a protocol to evaluate a range of current DMPs by capturing them in a single conceptual framework, employing comparable structure, process, and outcome measures, and combining qualitative and quantitative research methods. To assess DMP effectiveness a practical clinical trial will be conducted. Twenty-two disease management experiments will be studied in various Dutch regions consisting of a variety of collaborations between organisations and/or professionals. Patient cohorts include those with cardiovascular diseases, chronic obstructive pulmonary disease, diabetes, stroke, depression, psychotic diseases, and eating disorders. Our methodological approach combines qualitative and quantitative research methods to enable a comprehensive evaluation of complex programmes. Process indicators will be collected from health care providers' data registries and measured via physician and staff questionnaires. Patient questionnaires include health care experiences, health care utilisation, and quality of life. Qualitative data will be gathered by means of interviews and document analysis for an in-depth description of project interventions and the contexts in which DMPs are embedded, and an ethnographic process evaluation in five DMPs. Such a design will provide insight into ongoing DMPs and demonstrate which elements of the intervention are potentially (cost)-effective for which patient populations. It will also enable sound comparison of the results of the different programmes.
The study will lead to a better understanding of (1) the mechanisms of disease management, (2) the feasibility, and cost-effectiveness of a disease management approach to improving health care, and (3) the factors that determine success and failure of DMPs. Our study results will be relevant to decision makers and managers who confront the challenge of implementing and integrating DMPs into the health care system. Moreover, it will contribute to the search for methods to evaluate complex healthcare interventions.

  4. A reference genetic linkage map of apomictic Hieracium species based on expressed markers derived from developing ovule transcripts.

    PubMed

    Shirasawa, Kenta; Hand, Melanie L; Henderson, Steven T; Okada, Takashi; Johnson, Susan D; Taylor, Jennifer M; Spriggs, Andrew; Siddons, Hayley; Hirakawa, Hideki; Isobe, Sachiko; Tabata, Satoshi; Koltunow, Anna M G

    2015-03-01

    Apomixis in plants generates clonal progeny with a maternal genotype through asexual seed formation. Hieracium subgenus Pilosella (Asteraceae) contains polyploid, highly heterozygous apomictic and sexual species. Within apomictic Hieracium, dominant genetic loci independently regulate the qualitative developmental components of apomixis. In H. praealtum, LOSS OF APOMEIOSIS (LOA) enables formation of embryo sacs without meiosis and LOSS OF PARTHENOGENESIS (LOP) enables fertilization-independent seed formation. A locus required for fertilization-independent endosperm formation (AutE) has been identified in H. piloselloides. Additional quantitative loci appear to influence the penetrance of the qualitative loci, although the controlling genes remain unknown. This study aimed to develop the first genetic linkage maps for sexual and apomictic Hieracium species using simple sequence repeat (SSR) markers derived from expressed transcripts within the developing ovaries. RNA from microdissected Hieracium ovule cell types and ovaries was sequenced and SSRs were identified. Two different F1 mapping populations were created to overcome difficulties associated with genome complexity and asexual reproduction. SSR markers were analysed within each mapping population to generate draft linkage maps for apomictic and sexual Hieracium species. A collection of 14 684 Hieracium expressed SSR markers were developed and linkage maps were constructed for Hieracium species using a subset of the SSR markers. Both the LOA and LOP loci were successfully assigned to linkage groups; however, AutE could not be mapped using the current populations. Comparisons with lettuce (Lactuca sativa) revealed partial macrosynteny between the two Asteraceae species. A collection of SSR markers and draft linkage maps were developed for two apomictic and one sexual Hieracium species. 
These maps will support cloning of controlling genes at LOA and LOP loci in Hieracium and should also assist with identification of quantitative loci that affect the expressivity of apomixis. Future work will focus on mapping AutE using alternative populations. © The Author 2014. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Development and single-laboratory validation of a UHPLC-MS/MS method for quantitation of microcystins and nodularin in natural water, cyanobacteria, shellfish and algal supplement tablet powders.

    PubMed

    Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda

    2018-02-01

    A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. Microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid, sub-5-min chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as water and cyanobacterial bloom samples. Validation studies were undertaken on each matrix-analyte combination against the full set of method performance characteristics, following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity in terms of limits of detection, quantitation and reporting was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt.
The new method, now accredited to the ISO 17025 standard, is simple, quick, applicable to multiple matrices and highly suitable for use as a routine, high-throughput, fast-turnaround regulatory monitoring tool. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Devising tissue ingrowth metrics: a contribution to the computational characterization of engineered soft tissue healing.

    PubMed

    Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin

    2018-03-14

    The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently calls into question the value of histopathologic methods in the evaluation of biological changes. To date, the available evaluation tools are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as expressed in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy of the collagen (a maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats, with the assumption that modulation of the ingrown tissue would be quantitatively detectable in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach.
When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
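The headline metric above follows directly from the stated equation TIR% = CIR% + TCC%, with each term a stain-specific pixel fraction of the region of interest. A minimal sketch; the pixel counts are hypothetical, and real use would derive them from the two stained serial sections:

```python
def tissue_ingrowth_rate(cell_pixels, collagen_pixels, roi_pixels):
    """Tissue ingrowth rate (TIR%) as the sum of the cell ingrowth rate (CIR%)
    and the total collagen content (TCC%), each expressed as a percentage of
    the region-of-interest area."""
    cir = 100.0 * cell_pixels / roi_pixels
    tcc = 100.0 * collagen_pixels / roi_pixels
    return cir + tcc, cir, tcc

# Hypothetical counts from two serial sections of the same ROI:
# cell-specific stain (Feulgen & Rossenbeck) and collagen-specific stain
# (Picrosirius Red), registered to a common 100,000-pixel region.
tir, cir, tcc = tissue_ingrowth_rate(cell_pixels=12_000,
                                     collagen_pixels=30_000,
                                     roi_pixels=100_000)
print(tir, cir, tcc)  # 42.0 12.0 30.0
```

Keeping CIR and TCC separate, as the authors do, is what lets a delayed-healing group be distinguished by low cellularity versus low collagen deposition rather than by the combined score alone.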

  7. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  8. Lipid Vesicle Shape Analysis from Populations Using Light Video Microscopy and Computer Vision

    PubMed Central

    Zupanc, Jernej; Drašler, Barbara; Boljte, Sabina; Kralj-Iglič, Veronika; Iglič, Aleš; Erdogmus, Deniz; Drobne, Damjana

    2014-01-01

    We present a method for giant lipid vesicle shape analysis that combines manually guided large-scale video microscopy and computer vision algorithms to enable analyzing vesicle populations. The method retains the benefits of light microscopy and enables non-destructive analysis of vesicles from suspensions containing up to several thousands of lipid vesicles (1–50 µm in diameter). For each sample, image analysis was employed to extract data on vesicle quantity and size distributions of their projected diameters and isoperimetric quotients (measure of contour roundness). This process enables a comparison of samples from the same population over time, or the comparison of a treated population to a control. Although vesicles in suspensions are heterogeneous in sizes and shapes and have distinctively non-homogeneous distribution throughout the suspension, this method allows for the capture and analysis of repeatable vesicle samples that are representative of the population inspected. PMID:25426933
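The isoperimetric quotient used above as a roundness measure is conventionally 4πA/P² for a projected contour of area A and perimeter P: exactly 1 for a circle and smaller for elongated shapes. A minimal sketch of the descriptor:

```python
import math

def isoperimetric_quotient(area, perimeter):
    """4*pi*A / P**2: equals 1 for a perfect circle, < 1 for any other shape."""
    return 4.0 * math.pi * area / perimeter ** 2

# A circle of radius r is maximally round
r = 5.0
print(isoperimetric_quotient(math.pi * r**2, 2 * math.pi * r))  # ≈ 1.0

# A square of side s is less round: IQ = pi/4
s = 4.0
print(round(isoperimetric_quotient(s * s, 4 * s), 3))  # 0.785
```

Because the quotient is dimensionless, it can be compared across vesicles of very different sizes, which suits the heterogeneous 1–50 µm populations the method targets.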

  9. Inter-laboratory comparison of multi-locus variable-number tandem repeat analysis (MLVA) for verocytotoxin-producing Escherichia coli O157 to facilitate data sharing.

    PubMed

    Holmes, A; Perry, N; Willshaw, G; Hanson, M; Allison, L

    2015-01-01

    Multi-locus variable number tandem repeat analysis (MLVA) is used in clinical and reference laboratories for subtyping verocytotoxin-producing Escherichia coli O157 (VTEC O157). However, as yet there is no common allelic or profile nomenclature to enable laboratories to easily compare data. In this study, we carried out an inter-laboratory comparison of an eight-locus MLVA scheme using a set of 67 isolates of VTEC O157. All but two isolates had identical profiles in the two laboratories; repeat units were homogeneous in size, but some were incomplete. A subset of the isolates (n = 17) was sequenced to determine the actual copy number of representative alleles, thereby enabling alleles to be named according to international consensus guidelines. This work has enabled us to realize the potential of MLVA as a portable, highly discriminatory and convenient subtyping method.
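Naming alleles by actual repeat copy number, as in the consensus guidelines mentioned above, is conventionally derived from the amplified fragment size, the size of the flanking sequence, and the repeat-unit length. A minimal sketch; the locus parameters below are hypothetical, not those of the VTEC O157 scheme used in the study:

```python
def repeat_copy_number(fragment_size, flank_size, repeat_length):
    """Estimated tandem-repeat copies in an amplified MLVA fragment (sizes in bp).

    A non-integer result indicates an incomplete (partial) repeat unit,
    as observed for some loci in the inter-laboratory comparison.
    """
    return (fragment_size - flank_size) / repeat_length

# Hypothetical locus: 80 bp of flanking sequence, 6 bp repeat unit
print(repeat_copy_number(fragment_size=152, flank_size=80, repeat_length=6))  # 12.0
print(repeat_copy_number(fragment_size=155, flank_size=80, repeat_length=6))  # 12.5 -> partial repeat
```

Naming by copy number rather than raw fragment size is what makes profiles portable between laboratories with differently calibrated sizing instruments.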

  10. Collaborative Project: Development of an Isotope-Enabled CESM for Testing Abrupt Climate Changes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhengyu

    One of the most important validations for a state-of-the-art Earth System Model (ESM) with respect to climate change is the simulation of the climate evolution and abrupt climate change events of the last 21,000 years of Earth's history. However, one great challenge for model validation is that ESMs usually do not directly simulate geochemical variables that can be compared directly with past proxy records. In this project, we have met this challenge by developing the capability to simulate major isotopes in a state-of-the-art ESM, the Community Earth System Model (CESM), enabling direct model-data comparison against proxy climate records. Our isotope-enabled ESM incorporates the capability of simulating key isotopes and geotracers, notably δ18O, δD, δ14C, δ13C, Nd and Pa/Th. The isotope-enabled ESM has been used to perform simulations for the last 21,000 years. Direct comparison of these simulations with proxy records has shed light on the mechanisms of important climate change events.

  11. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography

    PubMed Central

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Åke; Winter, Reidar

    2009-01-01

    Background Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Methods Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. Results There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 ± 3.7% and -0.2 ± 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Conclusion Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant. PMID:19706183
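The agreement statistics reported above, a Pearson correlation plus a bias expressed as mean ± SD of the paired differences, can be computed as follows; the paired LVEF values are invented for illustration, not the study's data:

```python
from statistics import mean, pstdev

def agreement(visual, reference):
    """Pearson correlation and bias (mean of visual - reference) with its SD."""
    n = len(visual)
    mv, mr = mean(visual), mean(reference)
    cov = sum((v - mv) * (r - mr) for v, r in zip(visual, reference)) / n
    r_coef = cov / (pstdev(visual) * pstdev(reference))
    diffs = [v - r for v, r in zip(visual, reference)]
    return r_coef, mean(diffs), pstdev(diffs)

# Hypothetical paired LVEF estimates (%): eyeballing vs quantitative 3D
visual = [55, 40, 62, 35, 50]
ref    = [56, 42, 60, 36, 51]
r_coef, bias, sd = agreement(visual, ref)
print(round(r_coef, 2), round(bias, 1), round(sd, 1))
```

Reporting both numbers matters: a method can correlate highly with the reference while still carrying a systematic offset, which is exactly what the bias ± SD term is there to expose.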

  12. A Simple and Computationally Efficient Approach to Multifactor Dimensionality Reduction Analysis of Gene-Gene Interactions for Quantitative Traits

    PubMed Central

    Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane

    2013-01-01

    We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
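
    QMDR's scoring step can be sketched as follows: samples are pooled by multi-SNP genotype combination, each cell is labeled high- or low-risk against the grand trait mean, and a t-statistic scores the resulting partition. This toy plain-Python implementation illustrates the idea only; it is not the authors' code, and the data layout is hypothetical:

```python
import math

def welch_t(xs, ys):
    """Welch's t-statistic between two groups of quantitative trait values,
    used (as in QMDR) to score a candidate genotype partition."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

def qmdr_score(genotype_pairs, trait):
    """Toy QMDR scoring: pool samples by two-SNP genotype combination,
    label a cell high-risk if its mean exceeds the grand mean, then
    t-test high-risk vs low-risk trait values."""
    grand = sum(trait) / len(trait)
    cells = {}
    for g, t in zip(genotype_pairs, trait):
        cells.setdefault(g, []).append(t)
    high, low = [], []
    for values in cells.values():
        (high if sum(values) / len(values) > grand else low).extend(values)
    return welch_t(high, low)
```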

  13. End-to-end deep neural network for optical inversion in quantitative photoacoustic imaging.

    PubMed

    Cai, Chuangjian; Deng, Kexin; Ma, Cheng; Luo, Jianwen

    2018-06-15

    An end-to-end deep neural network, ResU-net, is developed for quantitative photoacoustic imaging. A residual learning framework is used to facilitate optimization and to gain better accuracy from considerably increased network depth. The contracting and expanding paths enable ResU-net to extract comprehensive context information from multispectral initial pressure images and, subsequently, to infer a quantitative image of chromophore concentration or oxygen saturation (sO2). According to our numerical experiments, the estimations of sO2 and indocyanine green concentration are accurate and robust against variations in both optical property and object geometry. An extremely short reconstruction time of 22 ms is achieved.
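
    The optical inversion that ResU-net learns end-to-end is, in its idealized linear form, spectral unmixing of chromophore absorption; a minimal two-wavelength sketch of that inversion (the extinction coefficients and absorption values below are arbitrary illustrative numbers, not real hemoglobin spectra):

```python
def unmix_so2(mu_a, eps_hbo2, eps_hb):
    """Solve mu_a(lambda_i) = eps_HbO2_i*C_HbO2 + eps_Hb_i*C_Hb at two
    wavelengths (a 2x2 linear system), then sO2 = C_HbO2/(C_HbO2 + C_Hb).
    This is the idealized inversion that quantitative PAI approximates."""
    (a1, a2), (e1, e2), (f1, f2) = mu_a, eps_hbo2, eps_hb
    det = e1 * f2 - e2 * f1
    c_hbo2 = (a1 * f2 - a2 * f1) / det
    c_hb = (e1 * a2 - e2 * a1) / det
    return c_hbo2 / (c_hbo2 + c_hb)

# Forward-simulated absorption for C_HbO2 = 0.8, C_Hb = 0.2 recovers sO2 = 0.8
print(unmix_so2((1.8, 1.4), (2.0, 1.0), (1.0, 3.0)))  # 0.8
```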

  14. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    PubMed

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright field microscope upgraded with a flipping imaging module, which enables the simultaneous creation of two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by the measurement of a microlens array and human osteoblastic cells in culture, indicating its potential for dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
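
    For context, the transport of intensity equation that links the two unequally defocused images to the phase reads, in its standard paraxial form:

```latex
\nabla_{\perp} \cdot \left( I(\mathbf{r}) \, \nabla_{\perp} \phi(\mathbf{r}) \right)
  = -k \, \frac{\partial I(\mathbf{r})}{\partial z},
\qquad k = \frac{2\pi}{\lambda}
```

    Here the axial derivative ∂I/∂z is approximated by the finite difference of the two simultaneously captured images divided by their defocus-distance difference, which is what makes single-shot dynamic phase imaging possible with this module.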

  15. Quantitative and Comparative Profiling of Protease Substrates through a Genetically Encoded Multifunctional Photocrosslinker.

    PubMed

    He, Dan; Xie, Xiao; Yang, Fan; Zhang, Heng; Su, Haomiao; Ge, Yun; Song, Haiping; Chen, Peng R

    2017-11-13

    A genetically encoded, multifunctional photocrosslinker was developed for quantitative and comparative proteomics. By bearing a bioorthogonal handle and a releasable linker in addition to its photoaffinity warhead, this probe enables the enrichment of transient and low-abundance prey proteins after intracellular photocrosslinking and prey-bait separation, which can be subject to stable isotope dimethyl labeling and mass spectrometry analysis. This quantitative strategy (termed isoCAPP) allowed a comparative proteomic approach to be adopted to identify the proteolytic substrates of an E. coli protease-chaperone dual machinery DegP. Two newly identified substrates were subsequently confirmed by proteolysis experiments. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  17. Comparison of Quantitative and Qualitative Research Traditions: Epistemological, Theoretical, and Methodological Differences

    ERIC Educational Resources Information Center

    Yilmaz, Kaya

    2013-01-01

    There has been much discussion about quantitative and qualitative approaches to research in different disciplines. In the behavioural and social sciences, these two paradigms are compared to reveal their relative strengths and weaknesses. But the debate about both traditions has commonly taken place in academic books. It is hard to find an article…

  18. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  19. COMPARISON OF POPULATIONS OF MOULD SPECIES IN HOMES IN THE UK AND US USING MOLD-SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    The goal of this research was to compare the populations of 81 mold species in homes in USA and UK using mould specific quantitative polymerase chain reaction (MSQPCR) technology. Dust samples were obtained from randomly selected homes in Great Britain (n=11). The mould populat...

  20. Identities and Transformational Experiences for Quantitative Problem Solving: Gender Comparisons of First-Year University Science Students

    ERIC Educational Resources Information Center

    Hudson, Peter; Matthews, Kelly

    2012-01-01

    Women are underrepresented in science, technology, engineering and mathematics (STEM) areas in university settings; however this may be the result of attitude rather than aptitude. There is widespread agreement that quantitative problem-solving is essential for graduate competence and preparedness in science and other STEM subjects. The research…

  1. COMPARISON OF ENTEROCOCCUS MEASUREMENTS IN FRESHWATER AT TWO RECREATIONAL BEACHES BY QUANTITATIVE POLYMERASE CHAIN REACTION AND MEMBRANE FILTER CULTURE ANALYSIS

    EPA Science Inventory

    Cell densities of the fecal pollution indicator genus, Enterococcus, were determined by a rapid (2-3 hr) quantitative PCR (QPCR) analysis based method in 100 ml water samples collected from recreational beaches on Lake Michigan and Lake Erie during the summer of 2003. Enumeration...

  2. Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review

    ERIC Educational Resources Information Center

    Booker, Rhae-Ann Richardson

    2010-01-01

    This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…

  3. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  4. Analyses of Disruption of Cerebral White Matter Integrity in Schizophrenia with MR Diffusion Tensor Fiber Tracking Method

    NASA Astrophysics Data System (ADS)

    Yamamoto, Utako; Kobayashi, Tetsuo; Kito, Shinsuke; Koga, Yoshihiko

    We have analyzed cerebral white matter using magnetic resonance diffusion tensor imaging (MR-DTI) to measure the diffusion anisotropy of water molecules. The goal of this study is the quantitative evaluation of schizophrenia. Diffusion tensor images are acquired for patients with schizophrenia and healthy comparison subjects, group-matched for age, sex, and handedness. Fiber tracking is performed on the superior longitudinal fasciculus for the comparison between the patient and comparison groups. We have analyzed and compared the cross-sectional area on the starting coronal plane and the mean and standard deviation of the fractional anisotropy and the apparent diffusion coefficient along fibers in the right and left hemispheres. In the right hemisphere, the cross-sectional areas in the patient group are significantly smaller than those in the comparison group. Furthermore, in the comparison group, the cross-sectional areas in the right hemisphere are significantly larger than those in the left hemisphere, whereas there is no significant difference in the patient group. These results suggest that the disruption of white matter integrity in schizophrenic patients may be evaluated quantitatively by comparing the cross-sectional areas of the superior longitudinal fasciculus in the right and left hemispheres.
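
    The fractional anisotropy (FA) and apparent diffusion coefficient (mean diffusivity) compared along the tracked fibers are standard scalar invariants of the diffusion tensor; a minimal sketch of their computation from the tensor eigenvalues (not the authors' code):

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three diffusion-tensor eigenvalues (standard definition);
    0 for isotropic diffusion, 1 for diffusion along a single axis."""
    md = (l1 + l2 + l3) / 3.0  # mean diffusivity (the ADC reported along fibers)
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

print(fractional_anisotropy(1.0, 1.0, 1.0))  # 0.0 (isotropic)
print(fractional_anisotropy(1.0, 0.0, 0.0))  # 1.0 (fully anisotropic)
```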

  5. A Backscatter-Lidar Forward-Operator

    NASA Astrophysics Data System (ADS)

    Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland

    2015-04-01

    We have developed a forward operator which is capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations based on the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component to assimilate backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies on several scattering parameters such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e., applying the backscatter-lidar forward operator to model output.
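
    The core of such a forward operator can be sketched under the simplifying single-scattering assumption that the attenuated backscatter is β(r)·exp(−2∫₀ʳ α dr′); the grid spacing and profiles below are illustrative, and a real operator handles instrument and aerosol detail far beyond this:

```python
import math

def attenuated_backscatter(beta, alpha, dz):
    """Virtual backscatter-lidar profile from model profiles of
    backscatter beta [1/(m sr)] and extinction alpha [1/m] on a uniform
    range grid of spacing dz [m], via a simple cumulative optical depth."""
    profile, tau = [], 0.0
    for b, a in zip(beta, alpha):
        tau += a * dz          # crude rectangle-rule optical depth
        profile.append(b * math.exp(-2.0 * tau))
    return profile

# Two 500 m bins with constant extinction 1e-3 1/m attenuate the signal
# by exp(-1) and exp(-2) respectively.
print(attenuated_backscatter([1.0, 1.0], [1e-3, 1e-3], 500.0))
```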

  6. Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model

    PubMed Central

    Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.

    2012-01-01

    Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315

  7. SHORT COMMUNICATION: Comparison between two mobile absolute gravimeters: optical versus atomic interferometers

    NASA Astrophysics Data System (ADS)

    Merlet, S.; Bodart, Q.; Malossi, N.; Landragin, A.; Pereira Dos Santos, F.; Gitlein, O.; Timmen, L.

    2010-08-01

    We report a comparison between two absolute gravimeters: the LNE-SYRTE cold atom gravimeter and FG5#220 of Leibniz Universität of Hannover. They rely on different principles of operation: atomic and optical interferometry. Both are movable which enabled them to participate in the last International Comparison of Absolute Gravimeters (ICAG'09) at BIPM. Immediately after, their bilateral comparison took place in the LNE watt balance laboratory and showed an agreement of (4.3 ± 6.4) µGal.

  8. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    PubMed

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

    Paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of the paper capillary and the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. Subsequently, a hybrid-fluid-mode paper capillary device was proposed, which enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus well supports high-throughput, quantitative, and repeatable assays operated entirely by hand. With all these merits, multiplex analyses of ions, proteins, and microbes have all been realized on this platform, paving the way for higher-level analysis on μPADs.

  9. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    PubMed

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  10. Using Electronic Messaging to Improve the Quality of Instruction.

    ERIC Educational Resources Information Center

    Zack, Michael H.

    1995-01-01

    Qualitative and quantitative data from business students using electronic mail and computer conferencing showed these methods enabled the instructor to be more accessible and responsive; greater class cohesion developed, and perceived quality of the course and instructor effectiveness increased. (SK)

  11. Advanced Technologies for Structural and Functional Optical Coherence Tomography

    DTIC Science & Technology

    2015-01-07

    OCT speckle noise can significantly affect polarimetry measurement and must be reduced for birefringence… This technique enables more accurate polarimetry measurement and quantitative assessment of tissue birefringence (Figure 7 of the report).

  12. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer… research, we designed a pilot study utilizing large-scale parallel Grid computing, harnessing nationwide infrastructure for medical image analysis.

  13. Using Technology to Balance Algebraic Explorations

    ERIC Educational Resources Information Center

    Kurz, Terri L.

    2013-01-01

    In 2000, the National Council of Teachers of Mathematics recommended in its Algebra Standards that "instructional programs from prekindergarten through grade 12 should enable all students to use mathematical models to represent and understand quantitative relationships." In this article, the authors suggest the "Balance"…

  14. SPECHT - single-stage phosphopeptide enrichment and stable-isotope chemical tagging: quantitative phosphoproteomics of insulin action in muscle.

    PubMed

    Kettenbach, Arminja N; Sano, Hiroyuki; Keller, Susanna R; Lienhard, Gustav E; Gerber, Scott A

    2015-01-30

    The study of cellular signaling remains a significant challenge for translational and clinical research. In particular, robust and accurate methods for quantitative phosphoproteomics in tissues and tumors represent significant hurdles for such efforts. In the present work, we design, implement and validate a method for single-stage phosphopeptide enrichment and stable isotope chemical tagging, or SPECHT, that enables the use of iTRAQ, TMT and/or reductive dimethyl-labeling strategies to be applied to phosphoproteomics experiments performed on primary tissue. We develop and validate our approach using reductive dimethyl-labeling and HeLa cells in culture, and find these results indistinguishable from data generated from more traditional SILAC-labeled HeLa cells mixed at the cell level. We apply the SPECHT approach to the quantitative analysis of insulin signaling in a murine myotube cell line and muscle tissue, identify known as well as new phosphorylation events, and validate these phosphorylation sites using phospho-specific antibodies. Taken together, our work validates chemical tagging post-single-stage phosphoenrichment as a general strategy for studying cellular signaling in primary tissues. Through the use of a quantitatively reproducible, proteome-wide phosphopeptide enrichment strategy, we demonstrated the feasibility of post-phosphopeptide purification chemical labeling and tagging as an enabling approach for quantitative phosphoproteomics of primary tissues. Using reductive dimethyl labeling as a generalized chemical tagging strategy, we compared the performance of post-phosphopeptide purification chemical tagging to the well established community standard, SILAC, in insulin-stimulated tissue culture cells. We then extended our method to the analysis of low-dose insulin signaling in murine muscle tissue, and report on the analytical and biological significance of our results. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Tandem transmission/reflection mode XRD instrument including XRF for in situ measurement of Martian rocks and soils

    NASA Astrophysics Data System (ADS)

    Delhez, Robert; Van der Gaast, S. J.; Wielders, Arno; de Boer, J. L.; Helmholdt, R. B.; van Mechelen, J.; Reiss, C.; Woning, L.; Schenk, H.

    2003-02-01

    The mineralogy of the surface material of Mars is the key to disclose its present and past life and climates. Clay mineral species, carbonates, and ice (water and CO2) are and/or contain their witnesses. X-ray powder diffraction (XRPD) is the most powerful analytical method to identify and quantitatively characterize minerals in complex mixtures. This paper discusses the development of a working model of an instrument consisting of a reflection mode diffractometer and a transmission mode CCD-XRPD instrument, combined with an XRF module. The CCD-XRD/XRF instrument is analogous to the instrument for Mars missions developed by Sarrazin et al. (1998). This part of the tandem instrument enables "quick and dirty" analysis of powdered (!) matter to monitor semi-quantitatively the presence of clay minerals as a group, carbonates, and ices and yields semi-quantitative chemical information from X-ray fluorescence (XRF). The reflection mode instrument (i) enables in-situ measurements of rocks and soils and quantitative information on the compounds identified, (ii) has a high resolution and reveals large spacings for accurate identification, in particular of clay mineral species, and (iii) the shape of the line profiles observed reveals the kind and approximate amounts of lattice imperfections present. It will be shown that the information obtained with the reflection mode diffractometer is crucial for finding signs of life and changes in the climate on Mars. Obviously this instrument can also be used for other extra-terrestrial research.

  16. Detection of Rare Drug Resistance Mutations by Digital PCR in a Human Influenza A Virus Model System and Clinical Samples

    PubMed Central

    Bushell, Claire A.; Grant, Paul R.; Cowen, Simon; Gutierrez-Aguirre, Ion; O'Sullivan, Denise M.; Žel, Jana; Milavec, Mojca; Foy, Carole A.; Nastouli, Eleni; Garson, Jeremy A.; Huggett, Jim F.

    2015-01-01

    Digital PCR (dPCR) is being increasingly used for the quantification of sequence variations, including single nucleotide polymorphisms (SNPs), due to its high accuracy and precision in comparison with techniques such as quantitative PCR (qPCR) and melt curve analysis. To develop and evaluate dPCR for SNP detection using DNA, RNA, and clinical samples, an influenza virus model of resistance to oseltamivir (Tamiflu) was used. First, this study was able to recognize and reduce off-target amplification in dPCR quantification, thereby enabling technical sensitivities down to 0.1% SNP abundance at a range of template concentrations, a 50-fold improvement on the qPCR assay used routinely in the clinic. Second, a method was developed for determining the false-positive rate (background) signal. Finally, comparison of dPCR with qPCR results on clinical samples demonstrated the potential impact dPCR could have on clinical research and patient management by earlier (trace) detection of rare drug-resistant sequence variants. Ultimately this could reduce the quantity of ineffective drugs taken and facilitate early switching to alternative medication when available. In the short term such methods could advance our understanding of microbial dynamics and therapeutic responses in a range of infectious diseases such as HIV, viral hepatitis, and tuberculosis. Furthermore, the findings presented here are directly relevant to other diagnostic areas, such as the detection of rare SNPs in malignancy, monitoring of graft rejection, and fetal screening. PMID:26659206
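
    The partition statistics behind dPCR quantification are Poisson-based; a minimal sketch of how a rare-variant fractional abundance is derived from positive-partition counts in a duplexed experiment (the counts below are illustrative, not from the study):

```python
import math

def copies_per_partition(positives, total):
    """Poisson-corrected mean target copies per dPCR partition:
    lambda = -ln(1 - k/n) for k positive partitions out of n."""
    return -math.log(1.0 - positives / total)

def fractional_abundance(mut_pos, wt_pos, total):
    """Fraction of mutant (e.g. drug-resistant) template among all
    templates, from positive-partition counts for each assay."""
    lam_mut = copies_per_partition(mut_pos, total)
    lam_wt = copies_per_partition(wt_pos, total)
    return lam_mut / (lam_mut + lam_wt)

# 20 mutant-positive and 10,000 wild-type-positive partitions out of
# 20,000 correspond to a mutant abundance of roughly 0.14%.
print(fractional_abundance(20, 10000, 20000))
```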

  17. Detection of Rare Drug Resistance Mutations by Digital PCR in a Human Influenza A Virus Model System and Clinical Samples.

    PubMed

    Whale, Alexandra S; Bushell, Claire A; Grant, Paul R; Cowen, Simon; Gutierrez-Aguirre, Ion; O'Sullivan, Denise M; Žel, Jana; Milavec, Mojca; Foy, Carole A; Nastouli, Eleni; Garson, Jeremy A; Huggett, Jim F

    2016-02-01

    Digital PCR (dPCR) is being increasingly used for the quantification of sequence variations, including single nucleotide polymorphisms (SNPs), due to its high accuracy and precision in comparison with techniques such as quantitative PCR (qPCR) and melt curve analysis. To develop and evaluate dPCR for SNP detection using DNA, RNA, and clinical samples, an influenza virus model of resistance to oseltamivir (Tamiflu) was used. First, this study was able to recognize and reduce off-target amplification in dPCR quantification, thereby enabling technical sensitivities down to 0.1% SNP abundance at a range of template concentrations, a 50-fold improvement on the qPCR assay used routinely in the clinic. Second, a method was developed for determining the false-positive rate (background) signal. Finally, comparison of dPCR with qPCR results on clinical samples demonstrated the potential impact dPCR could have on clinical research and patient management by earlier (trace) detection of rare drug-resistant sequence variants. Ultimately this could reduce the quantity of ineffective drugs taken and facilitate early switching to alternative medication when available. In the short term such methods could advance our understanding of microbial dynamics and therapeutic responses in a range of infectious diseases such as HIV, viral hepatitis, and tuberculosis. Furthermore, the findings presented here are directly relevant to other diagnostic areas, such as the detection of rare SNPs in malignancy, monitoring of graft rejection, and fetal screening. Copyright © 2016 Whale et al.

  18. Predicting loop–helix tertiary structural contacts in RNA pseudoknots

    PubMed Central

    Cao, Song; Giedroc, David P.; Chen, Shi-Jie

    2010-01-01

    Tertiary interactions between loops and helical stems play critical roles in the biological function of many RNA pseudoknots. However, quantitative predictions for RNA tertiary interactions remain elusive. Here we report a statistical mechanical model for the prediction of noncanonical loop–stem base-pairing interactions in RNA pseudoknots. Central to the model is the evaluation of the conformational entropy for the pseudoknotted folds with defined loop–stem tertiary structural contacts. We develop an RNA virtual-bond-based conformational model (Vfold model), which permits a rigorous computation of the conformational entropy for a given fold that contains loop–stem tertiary contacts. With the entropy parameters predicted from the Vfold model and the energy parameters for the tertiary contacts as input parameters, we can then predict the RNA folding thermodynamics, from which we can extract the tertiary contact thermodynamic parameters from theory–experiment comparisons. These comparisons reveal a contact enthalpy (ΔH) of −14 kcal/mol and a contact entropy (ΔS) of −38 cal/mol/K for a protonated C+•(G–C) base triple at pH 7.0, and (ΔH = −7 kcal/mol, ΔS = −19 cal/mol/K) for an unprotonated base triple. Tests of the model for a series of pseudoknots show good theory–experiment agreement. Based on the extracted energy parameters for the tertiary structural contacts, the model enables predictions of the structure, stability, and folding pathways for RNA pseudoknots with known or postulated loop–stem tertiary contacts from the nucleotide sequence alone. PMID:20100813
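
    With the extracted parameters, the stability contribution of each contact follows from ΔG = ΔH − TΔS; a minimal sketch evaluating the reported base-triple parameters (the 37 °C evaluation temperature is my assumption for illustration, not stated in the abstract):

```python
def delta_g(dh_kcal, ds_cal, temp_k=310.15):
    """Free energy of a tertiary contact (kcal/mol) from its enthalpy
    (kcal/mol) and entropy (cal/mol/K): dG = dH - T*dS.
    temp_k = 310.15 K (37 C) is an assumed evaluation temperature."""
    return dh_kcal - temp_k * ds_cal / 1000.0

# Protonated C+.(G-C) triple: dH = -14 kcal/mol, dS = -38 cal/mol/K
print(round(delta_g(-14.0, -38.0), 2))  # -2.21
# Unprotonated triple: dH = -7 kcal/mol, dS = -19 cal/mol/K
print(round(delta_g(-7.0, -19.0), 2))   # -1.11
```

    So at physiological temperature the protonated triple contributes roughly twice the stabilization of the unprotonated one, consistent with the enthalpy-driven picture above.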

  19. "Dip-and-read" paper-based analytical devices using distance-based detection with color screening.

    PubMed

    Yamada, Kentaro; Citterio, Daniel; Henry, Charles S

    2018-05-15

    An improved paper-based analytical device (PAD) using color screening to enhance device performance is described. Current detection methods for PADs relying on the distance-based signalling motif can be slow due to the assay time being limited by capillary flow rates that wick fluid through the detection zone. For traditional distance-based detection motifs, analysis can take up to 45 min for a channel length of 5 cm. By using a color screening method, quantification with a distance-based PAD can be achieved in minutes through a "dip-and-read" approach. A colorimetric indicator line deposited onto a paper substrate using inkjet-printing undergoes a concentration-dependent colorimetric response for a given analyte. This color intensity-based response has been converted to a distance-based signal by overlaying a color filter with a continuous color intensity gradient matching the color of the developed indicator line. As a proof-of-concept, Ni quantification in welding fume was performed as a model assay. The results of multiple independent user testing gave mean absolute percentage error and average relative standard deviations of 10.5% and 11.2% respectively, which were an improvement over analysis based on simple visual color comparison with a read guide (12.2%, 14.9%). In addition to the analytical performance comparison, an interference study and a shelf life investigation were performed to further demonstrate practical utility. The developed system demonstrates an alternative detection approach for distance-based PADs enabling fast (∼10 min), quantitative, and straightforward assays.
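
    The reported figures of merit (10.5% mean absolute percentage error, 11.2% average relative standard deviation) are standard quantities; a minimal sketch of both computations (the reader estimates below are illustrative, not study data):

```python
def mape(estimates, true_value):
    """Mean absolute percentage error of repeated readings vs a reference."""
    return sum(abs(e - true_value) / true_value
               for e in estimates) / len(estimates) * 100.0

def rsd(values):
    """Relative standard deviation (%): sample SD over the mean."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    return (var ** 0.5) / m * 100.0

# Two hypothetical readings of a sample whose true value is 10
print(mape([9.0, 11.0], 10.0))  # 10.0
```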

  20. Diffusion kurtosis imaging of the liver at 3 Tesla: in vivo comparison to standard diffusion-weighted imaging.

    PubMed

    Budjan, Johannes; Sauter, Elke A; Zoellner, Frank G; Lemke, Andreas; Wambsganss, Jens; Schoenberg, Stefan O; Attenberger, Ulrike I

    2018-01-01

    Background Functional techniques like diffusion-weighted imaging (DWI) are gaining more and more importance in liver magnetic resonance imaging (MRI). Diffusion kurtosis imaging (DKI) is an advanced technique that might help to overcome current limitations of DWI. Purpose To evaluate DKI for the differentiation of hepatic lesions in comparison to conventional DWI at 3 Tesla. Material and Methods Fifty-six consecutive patients were examined using a routine abdominal MR protocol at 3 Tesla which included DWI with b-values of 50, 400, 800, and 1000 s/mm². Apparent diffusion coefficient (ADC) maps were calculated applying a standard mono-exponential fit, while a non-Gaussian kurtosis fit was used to obtain DKI maps. ADC as well as kurtosis-corrected diffusion (D) values were quantified by region-of-interest analysis and compared between lesions. Results Sixty-eight hepatic lesions (hepatocellular carcinoma [HCC], n = 25; hepatic adenoma, n = 4; cysts, n = 18; hepatic hemangioma [HH], n = 18; and focal nodular hyperplasia, n = 3) were identified. Differentiation of malignant and benign lesions was possible based on both DWI ADC and DKI D values (P values ranged from 0.04 to < 0.0001). Conclusion In vivo abdominal DKI calculated using standard b-values is feasible and enables quantitative differentiation between malignant and benign liver lesions. Assessment of conventional ADC values leads to similar results when using b-values below 1000 s/mm² for DKI calculation.
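
    The two signal models being compared can be written down directly; a minimal sketch of the mono-exponential and kurtosis representations plus a two-point log-linear ADC estimate (units: b in s/mm², D in mm²/s; the parameter values are hypothetical, not study data):

```python
import math

def signal_mono(s0, b, adc):
    """Mono-exponential DWI model: S(b) = S0 * exp(-b*ADC)."""
    return s0 * math.exp(-b * adc)

def signal_kurtosis(s0, b, d, k):
    """Non-Gaussian kurtosis model: S(b) = S0 * exp(-b*D + b^2*D^2*K/6)."""
    return s0 * math.exp(-b * d + (b ** 2) * (d ** 2) * k / 6.0)

def adc_two_point(s1, b1, s2, b2):
    """ADC from two b-values via the log-linear mono-exponential fit."""
    return math.log(s1 / s2) / (b2 - b1)

# With K = 0 the kurtosis model reduces to the mono-exponential one.
print(signal_kurtosis(100.0, 800.0, 1e-3, 0.0) == signal_mono(100.0, 800.0, 1e-3))
```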
