Sample records for "enable quantitative analysis"

  1. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. A quantitative assessment approach provides useful risk mitigation information.

  2. SCOPA and META-SCOPA: software for the analysis and aggregation of genome-wide association studies of multiple correlated phenotypes.

    PubMed

    Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P

    2017-01-11

    Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1 × 10⁻⁸), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
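
    As a minimal illustration of the "reverse regression" idea described above, the sketch below (Python, simulated data) fits a general linear model with the SNP genotype dosage as the outcome and several correlated phenotypes as predictors, then tests their joint association with an F-test. It is a conceptual sketch only, not the SCOPA implementation; all variable names and values are hypothetical.

      # Minimal sketch of the "reverse regression" concept: genotype dosage as
      # the outcome, correlated phenotypes as predictors, joint F-test for
      # association. Simulated data; not the SCOPA software itself.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      phenotypes = rng.normal(size=(n, 3))                  # e.g. HDL, TG, BMI (simulated)
      dosage = np.clip(1.0 + 0.2 * phenotypes[:, 0]         # genotype dosage in [0, 2],
                       + rng.normal(0, 0.5, n), 0, 2)       # weakly linked to phenotype 1

      fit_full = sm.OLS(dosage, sm.add_constant(phenotypes)).fit()
      fit_null = sm.OLS(dosage, np.ones((n, 1))).fit()

      # Joint test of association between the SNP and all phenotypes together
      f_stat, p_value, df_diff = fit_full.compare_f_test(fit_null)
      print(f"joint F = {f_stat:.2f}, p = {p_value:.3g}")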

  3. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    ERIC Educational Resources Information Center

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  4. Label-free quantitative cell division monitoring of endothelial cells by digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Bauwens, Andreas; Vollmer, Angelika; Ketelhut, Steffi; Langehanenberg, Patrik; Müthing, Johannes; Karch, Helge; von Bally, Gert

    2010-05-01

    Digital holographic microscopy (DHM) enables quantitative multifocus phase contrast imaging for nondestructive technical inspection and live cell analysis. Time-lapse investigations on human brain microvascular endothelial cells demonstrate the use of DHM for label-free dynamic quantitative monitoring of cell division of mother cells into daughter cells. Cytokinetic DHM analysis provides future applications in toxicology and cancer research.

  5. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.

  6. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
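
    For orientation, the sketch below shows the common multiplicative definition of a genetic interaction score computed from single- and double-mutant fitness values (epsilon = W_xy - W_x * W_y). It only illustrates the fitness-based scoring logic referred to above, with made-up values; it is not the matrix-approximation procedure developed in the paper.

      # Multiplicative genetic interaction score from fitness values:
      # epsilon = W_xy - W_x * W_y. Fitness values are made up.
      single_fitness = {"geneA": 0.85, "geneB": 0.90}
      double_fitness = {("geneA", "geneB"): 0.60}

      def interaction_score(gene_x, gene_y):
          expected = single_fitness[gene_x] * single_fitness[gene_y]
          observed = double_fitness[(gene_x, gene_y)]
          return observed - expected   # < 0 negative (aggravating), > 0 positive (alleviating)

      print(interaction_score("geneA", "geneB"))   # -0.165, a negative interaction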

  7. Coupling Reagent for UV/vis Absorbing Azobenzene-Based Quantitative Analysis of the Extent of Functional Group Immobilization on Silica.

    PubMed

    Choi, Ra-Young; Lee, Chang-Hee; Jun, Chul-Ho

    2018-05-18

    A methallylsilane coupling reagent, containing both an N-hydroxysuccinimidyl (NHS) ester group and a UV/vis-absorbing azobenzene linker, undergoes acid-catalyzed immobilization on silica. Analysis of the UV/vis absorption band associated with the azobenzene group in the adduct enables facile quantitative determination of the extent of loading of the NHS groups. Reaction of NHS groups on the silica surface with amine groups of GOx and rhodamine can be employed to generate enzyme- or dye-immobilized silica for quantitative analysis.

  8. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    PubMed Central

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine:citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μmol ml−1/μmol ml−1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented here enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174
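
    A generic internal-standard quantitation sketch, assuming a simple single-point response factor: a calibration mixture of known concentrations gives the response factor, which then converts a peak-area ratio measured in a sap sample into an analyte concentration. The numbers are hypothetical and this is not the authors' exact calibration procedure.

      # Generic internal-standard (IS) quantitation: response factor from a
      # standard of known concentrations, then concentration of the analyte in
      # the sample from the measured peak-area ratio. Hypothetical values.
      def response_factor(area_analyte_std, area_is_std, conc_analyte_std, conc_is_std):
          return (area_analyte_std / area_is_std) / (conc_analyte_std / conc_is_std)

      def quantify(area_analyte, area_is, conc_is, rf):
          return (area_analyte / area_is) * conc_is / rf

      # Calibration standard: 50 uM asparagine with 50 uM internal standard
      rf_asn = response_factor(1.20e5, 1.00e5, 50.0, 50.0)

      # Xylem sap sample spiked with 50 uM internal standard
      conc_asn = quantify(area_analyte=0.90e5, area_is=1.05e5, conc_is=50.0, rf=rf_asn)
      print(f"asparagine ~ {conc_asn:.1f} uM")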

  9. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine:citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (micromol ml(-1)/micromol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented here enable qualitative and quantitative analysis of these metabolically important ureides.

  10. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer ... research, we designed a pilot study utilizing large-scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also ...

  12. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  13. Applications of pathology-assisted image analysis of immunohistochemistry-based biomarkers in oncology.

    PubMed

    Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D

    2014-01-01

    Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.

  14. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is the improvement of qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding, and reverse binarization. Several modifications of known image processing techniques and combinations of the selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed digital image processing program with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
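
    As a rough sketch of one step of such a pipeline, the example below applies Otsu binarization to an SEM image with scikit-image and estimates total crack length per unit area from the skeleton of the crack mask. It is not the authors' program; the file name and pixel calibration are assumptions.

      # Otsu binarization of an SEM micrograph and a rough estimate of total
      # crack length per unit area from the skeleton of the crack mask.
      # The file name and pixel size are assumptions.
      from skimage import io, filters, morphology

      pixel_size_um = 0.05                                   # assumed calibration
      image = io.imread("sem_micrograph.tif", as_gray=True)

      cracks = image < filters.threshold_otsu(image)         # dark cracks on bright coating
      skeleton = morphology.skeletonize(cracks)

      crack_length_um = skeleton.sum() * pixel_size_um       # rough length estimate
      field_area_um2 = image.size * pixel_size_um ** 2
      print("total crack length per unit area:", crack_length_um / field_area_um2, "1/um")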

  15. Investigating the Educational Value of Social Learning Networks: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Dafoulas, Georgios; Shokri, Azam

    2016-01-01

    Purpose: The emergence of Education 2.0 enabled technology-enhanced learning, necessitating new pedagogical approaches, while e-learning has evolved into an instrumental pedagogy of collaboration through affordances of social media. Social learning networks and ubiquitous learning enabled individual and group learning through social engagement and…

  16. Stereological analysis of bacterial load and lung lesions in nonhuman primates (rhesus macaques) experimentally infected with Mycobacterium tuberculosis.

    PubMed

    Luciw, Paul A; Oslund, Karen L; Yang, Xiao-Wei; Adamson, Lourdes; Ravindran, Resmi; Canfield, Don R; Tarara, Ross; Hirst, Linda; Christensen, Miles; Lerche, Nicholas W; Offenstein, Heather; Lewinsohn, David; Ventimiglia, Frank; Brignolo, Laurie; Wisner, Erik R; Hyde, Dallas M

    2011-11-01

    Infection with Mycobacterium tuberculosis primarily produces a multifocal distribution of pulmonary granulomas in which the pathogen resides. Accordingly, quantitative assessment of the bacterial load and pathology is a substantial challenge in tuberculosis. Such assessments are critical for studies of the pathogenesis and for the development of vaccines and drugs in animal models of experimental M. tuberculosis infection. Stereology enables unbiased quantitation of three-dimensional objects from two-dimensional sections and thus is suited to quantify histological lesions. We have developed a protocol for stereological analysis of the lung in rhesus macaques inoculated with a pathogenic clinical strain of M. tuberculosis (Erdman strain). These animals exhibit a pattern of infection and tuberculosis similar to that of naturally infected humans. Conditions were optimized for collecting lung samples in a nonbiased, random manner. Bacterial load in these samples was assessed by a standard plating assay, and granulomas were graded and enumerated microscopically. Stereological analysis provided quantitative data that supported a significant correlation between bacterial load and lung granulomas. Thus this stereological approach enables a quantitative, statistically valid analysis of the impact of M. tuberculosis infection in the lung and will serve as an essential tool for objectively comparing the efficacy of drugs and vaccines.

  17. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  18. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control on an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.

  19. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A method and apparatus for extracting for quantitative analysis ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing on a sample of a charged particle beam, such as an ion beam, along a path length perpendicular to the sample and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of signal to noise ratio is achieved by laser excitation of ions to selected autoionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick film configuration disposed on an insulator substrate for generating predetermined electric field boundary conditions to achieve for analysis the required electric field potential. The spectrometer also is applicable in the fields of SIMS, ISS and electron spectroscopy.

  20. Photo ion spectrometer

    DOEpatents

    Gruen, D.M.; Young, C.E.; Pellin, M.J.

    1989-08-08

    A method and apparatus are described for extracting for quantitative analysis ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing on a sample of a charged particle beam, such as an ion beam, along a path length perpendicular to the sample and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of signal to noise ratio is achieved by laser excitation of ions to selected auto-ionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick film configuration disposed on an insulator substrate for generating predetermined electric field boundary conditions to achieve for analysis the required electric field potential. The spectrometer also is applicable in the fields of SIMS, ISS and electron spectroscopy. 8 figs.

  1. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes, which enables the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses appropriate for the various experimental strategies that can be employed in quantitative proteomic studies. This review also highlights the importance of a robust experimental design and various issues surrounding the design of experiments. The concepts and examples discussed here show how robust design and analysis lead to confident results and help ensure that quantitative proteomics delivers.

  2. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
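
    A minimal sketch of the multivariate view described above: PCA of a replicates-by-spots abundance matrix to check whether replicates separate by biological condition. The data are simulated and the example is not the author's DIGE workflow.

      # PCA of a replicates-by-spots abundance matrix; simulated data.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      control = rng.normal(0.0, 1.0, size=(4, 200))      # 4 biological replicates
      treated = rng.normal(0.5, 1.0, size=(4, 200))      # shifted mean = "signal"
      abundance = np.vstack([control, treated])

      scores = PCA(n_components=2).fit_transform(abundance)
      for label, (pc1, pc2) in zip(["ctrl"] * 4 + ["trt"] * 4, scores):
          print(f"{label}: PC1 = {pc1:+.2f}, PC2 = {pc2:+.2f}")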

  3. Potential use of combining the diffusion equation with the free Schrödinger equation to improve the Optical Coherence Tomography image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.

    2006-03-01

    Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolutions than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system, obtained by combining the diffusion equation with the free Schrödinger equation. In such a formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach has good performance in speckle noise removal, enhancement and segmentation of the various cellular layers of the retina using the OCT system.

  4. Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.

    PubMed

    Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A

    2018-01-08

    For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.

  5. Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.

    PubMed

    Well, Caroline; Frank, Oliver; Hofmann, Thomas

    2013-11-27

    Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana Bertoni (Bertoni), have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed enabling the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A by using the corresponding deuterated 16,17-dihydrosteviol glycosides as suitable internal standards. This quantitation not only enables the analysis of the individual steviol glycosides in foods and beverages but also can support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter tasting Stevia extracts.

  6. Quantitative prediction of phase transformations in silicon during nanoindentation

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Basak, Animesh

    2013-08-01

    This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of pop-out size and depth of the nanoindentation impression. This simple formula enables a fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which has not been possible before.

  7. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
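
    For illustration, the sketch below combines relative uncertainty components by the usual root-sum-of-squares rule; the component values are hypothetical placeholders, not the figures reported in the study.

      # Combined relative uncertainty via root sum of squares; the component
      # values below are hypothetical, not the figures from the study.
      import math

      components = {
          "type of microorganism": 0.20,
          "pharmaceutical product": 0.15,
          "reading/interpretation": 0.10,
      }
      combined = math.sqrt(sum(u ** 2 for u in components.values()))
      print(f"combined relative uncertainty ~ {combined:.0%}")   # ~27%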

  8. A comparative study of quantitative immunohistochemistry and quantum dot immunohistochemistry for mutation carrier identification in Lynch syndrome.

    PubMed

    Barrow, Emma; Evans, D Gareth; McMahon, Ray; Hill, James; Byers, Richard

    2011-03-01

    Lynch Syndrome is caused by mutations in DNA mismatch repair (MMR) genes. Mutation carrier identification is facilitated by immunohistochemical detection of the MMR proteins MLH1 and MSH2 in tumour tissue and is desirable as colonoscopic screening reduces mortality. However, protein detection by conventional immunohistochemistry (IHC) is subjective, and quantitative techniques are required. Quantum dots (QDs) are novel fluorescent labels that enable quantitative multiplex staining. This study compared their use with quantitative 3,3'-diaminobenzidine (DAB) IHC for the diagnosis of Lynch Syndrome. Tumour sections from 36 mutation carriers and six controls were obtained. These were stained with DAB on an automated platform using antibodies against MLH1 and MSH2. Multiplex QD immunofluorescent staining of the sections was performed using antibodies against MLH1, MSH2 and smooth muscle actin (SMA). Multispectral analysis of the slides was performed. The staining intensity of DAB and QDs was measured in multiple colonic crypts, and the mean intensity scores calculated. Receiver operating characteristic (ROC) curves of staining performance for the identification of mutation carriers were evaluated. For quantitative DAB IHC, the area under the MLH1 ROC curve was 0.872 (95% CI 0.763 to 0.981), and the area under the MSH2 ROC curve was 0.832 (95% CI 0.704 to 0.960). For quantitative QD IHC, the area under the MLH1 ROC curve was 0.812 (95% CI 0.681 to 0.943), and the area under the MSH2 ROC curve was 0.598 (95% CI 0.418 to 0.777). Despite the advantage of QD staining in enabling several markers to be measured simultaneously, it is of lower utility than DAB IHC for the identification of MMR mutation carriers. Automated DAB IHC staining and quantitative slide analysis may enable high-throughput IHC.
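
    A sketch of the kind of ROC summary described above, using scikit-learn on simulated data: mean crypt staining intensities are scored against carrier status and reduced to an area under the ROC curve. The numbers are placeholders, not the study data.

      # ROC summary of a staining-intensity score against carrier status;
      # simulated placeholder data.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      carrier = np.array([1] * 36 + [0] * 6)                 # 36 carriers, 6 controls
      # carriers are expected to lose MMR protein, i.e. to stain less intensely,
      # so score the negative intensity to keep "higher score = carrier"
      intensity = np.where(carrier == 1,
                           rng.normal(0.4, 0.15, size=42),
                           rng.normal(0.7, 0.15, size=42))
      print(f"AUC = {roc_auc_score(carrier, -intensity):.3f}")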

  9. In-depth Qualitative and Quantitative Profiling of Tyrosine Phosphorylation Using a Combination of Phosphopeptide Immunoaffinity Purification and Stable Isotope Dimethyl Labeling*

    PubMed Central

    Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.

    2010-01-01

    Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167

  10. Whole-body PET parametric imaging employing direct 4D nested reconstruction and a generalized non-linear Patlak model

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Nicolas A.; Rahmim, Arman

    2014-03-01

    Graphical analysis is employed in the research setting to provide quantitative estimation of PET tracer kinetics from dynamic images at a single bed. Recently, we proposed a multi-bed dynamic acquisition framework enabling clinically feasible whole-body parametric PET imaging by employing post-reconstruction parameter estimation. In addition, by incorporating linear Patlak modeling within the system matrix, we enabled direct 4D reconstruction in order to effectively circumvent noise amplification in dynamic whole-body imaging. However, direct 4D Patlak reconstruction exhibits a relatively slow convergence due to the presence of non-sparse spatial correlations in temporal kinetic analysis. In addition, the standard Patlak model does not account for reversible uptake, thus underestimating the influx rate Ki. We have developed a novel whole-body PET parametric reconstruction framework in the STIR platform, a widely employed open-source reconstruction toolkit, a) enabling accelerated convergence of direct 4D multi-bed reconstruction, by employing a nested algorithm to decouple the temporal parameter estimation from the spatial image update process, and b) enhancing the quantitative performance particularly in regions with reversible uptake, by pursuing a non-linear generalized Patlak 4D nested reconstruction algorithm. A set of published kinetic parameters and the XCAT phantom were employed for the simulation of dynamic multi-bed acquisitions. Quantitative analysis on the Ki images demonstrated considerable acceleration in the convergence of the nested 4D whole-body Patlak algorithm. In addition, our simulated and patient whole-body data in the postreconstruction domain indicated the quantitative benefits of our extended generalized Patlak 4D nested reconstruction for tumor diagnosis and treatment response monitoring.
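
    As background to the reconstruction described above, the sketch below shows standard (linear) Patlak graphical analysis for a single region: the tissue curve normalized by plasma activity is regressed against normalized integrated plasma activity, and the slope estimates the influx rate Ki. The generalized, reversible model used in the paper adds a further term; only the linear case with simulated curves is shown here.

      # Standard (linear) Patlak plot for one region: slope of the late, linear
      # portion estimates the influx rate Ki. Curves below are simulated.
      import numpy as np

      t = np.linspace(1.0, 60.0, 30)                 # minutes post-injection
      cp = 10.0 * np.exp(-0.08 * t) + 1.0            # hypothetical plasma input
      ki_true, v0_true = 0.05, 0.3
      cum_cp = np.concatenate(
          [[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
      ct = ki_true * cum_cp + v0_true * cp           # tissue curve under the Patlak model

      x = cum_cp / cp                                # "Patlak time"
      y = ct / cp
      late = t > 20                                  # fit only the late, linear part
      ki_est, v0_est = np.polyfit(x[late], y[late], 1)
      print(f"estimated Ki = {ki_est:.3f} /min, V0 = {v0_est:.2f}")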

  11. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  12. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.
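
    A toy illustration of the kind of n-gram time series underlying such analyses: yearly counts of a word are normalized by the total word count of that year to give a relative-frequency trend. All counts below are made up.

      # Relative frequency of a 1-gram per year; all counts are made up.
      word_counts = {1900: 120, 1950: 800, 2000: 2600}        # occurrences of the word
      total_counts = {1900: 1.0e6, 1950: 2.5e6, 2000: 6.0e6}  # all 1-grams that year

      for year in sorted(word_counts):
          print(f"{year}: {word_counts[year] / total_counts[year]:.2e}")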

  13. Generation of High-Quality SWATH® Acquisition Data for Label-free Quantitative Proteomics Studies Using TripleTOF® Mass Spectrometers

    PubMed Central

    Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.

    2017-01-01

    Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information-rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable-sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533

  14. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high-resolution non-destructive inspection of technical surfaces and minimally invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.

  15. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Risk manager formula for success: Influencing decision making.

    PubMed

    Midgley, Mike

    2017-10-01

    Providing the ultimate decision makers with a quantitative risk analysis based on thoughtful assessment by the organization's experts enables an efficient decision. © 2017 American Society for Healthcare Risk Management of the American Hospital Association.

  17. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer ... pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image ... analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast ...

  18. Systematic exploration of essential yeast gene function with temperature-sensitive mutants

    PubMed Central

    Li, Zhijian; Vizeacoumar, Franco J; Bahr, Sondra; Li, Jingjing; Warringer, Jonas; Vizeacoumar, Frederick S; Min, Renqiang; VanderSluis, Benjamin; Bellay, Jeremy; DeVit, Michael; Fleming, James A; Stephens, Andrew; Haase, Julian; Lin, Zhen-Yuan; Baryshnikova, Anastasia; Lu, Hong; Yan, Zhun; Jin, Ke; Barker, Sarah; Datti, Alessandro; Giaever, Guri; Nislow, Corey; Bulawa, Chris; Myers, Chad L; Costanzo, Michael; Gingras, Anne-Claude; Zhang, Zhaolei; Blomberg, Anders; Bloom, Kerry; Andrews, Brenda; Boone, Charles

    2012-01-01

    Conditional temperature-sensitive (ts) mutations are valuable reagents for studying essential genes in the yeast Saccharomyces cerevisiae. We constructed 787 ts strains, covering 497 (~45%) of the 1,101 essential yeast genes, with ~30% of the genes represented by multiple alleles. All of the alleles are integrated into their native genomic locus in the S288C common reference strain and are linked to a kanMX selectable marker, allowing further genetic manipulation by synthetic genetic array (SGA)–based, high-throughput methods. We show two such manipulations: barcoding of 440 strains, which enables chemical-genetic suppression analysis, and the construction of arrays of strains carrying different fluorescent markers of subcellular structure, which enables quantitative analysis of phenotypes using high-content screening. Quantitative analysis of a GFP-tubulin marker identified roles for cohesin and condensin genes in spindle disassembly. This mutant collection should facilitate a wide range of systematic studies aimed at understanding the functions of essential genes. PMID:21441928

  19. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    PubMed

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

    Paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of the paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. Subsequently, a hybrid-fluid-mode paper capillary device was proposed, which enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus supports high-throughput, quantitative, and repeatable assays, all by hand operation. With all these merits, multiplex analysis of ions, proteins, and microbes has been realized on this platform, which has paved the way to level-up analysis on μPADs.

  20. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, many efforts have been made to improve MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins, as well as to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification were also developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes the recent advances in such techniques and their applications in qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Urea Biosynthesis Using Liver Slices

    ERIC Educational Resources Information Center

    Teal, A. R.

    1976-01-01

    Presented is a practical scheme to enable introductory biology students to investigate the mechanism by which urea is synthesized in the liver. The tissue-slice technique is discussed, and methods for the quantitative analysis of metabolites are presented. (Author/SL)

  2. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
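
    As a concrete, simplified illustration of the conversion the paper formalizes, the sketch below uses the small-amplitude limit, in which the frequency shift is proportional to the tip-sample force gradient (Δf/f0 = −F′(z)/(2k)), and recovers the force by integrating the frequency shift from large separation inward. The cantilever parameters and force law are made-up values, and the full amplitude-dependent formulae of the paper are not reproduced here.

      # Small-amplitude FM-AFM limit: df/f0 = -F'(z)/(2k), so the force follows
      # by integrating the frequency shift from large separation inward.
      # Cantilever parameters and the force law are assumed toy values.
      import numpy as np
      from scipy.integrate import cumulative_trapezoid

      k = 40.0        # cantilever spring constant, N/m (assumed)
      f0 = 300e3      # resonance frequency, Hz (assumed)

      z = np.linspace(0.3e-9, 10e-9, 400)                    # tip-sample distance, m
      force_true = -1e-28 / z ** 2                           # toy attractive force law
      df = -(f0 / (2.0 * k)) * np.gradient(force_true, z)    # predicted frequency shift

      # F(z) = (2k/f0) * integral_z^zmax df(u) du, taking F at the far end as ~0
      cum = cumulative_trapezoid(df, z, initial=0.0)
      force_rec = (2.0 * k / f0) * (cum[-1] - cum)
      print("max reconstruction error (N):", np.abs(force_rec - force_true).max())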

  3. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. The main window of the program is shown during dynamic analysis of the foot thermal image. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Two-Photon Flow Cytometry

    NASA Technical Reports Server (NTRS)

    Zhong, Cheng Frank; Ye, Jing Yong; Norris, Theodore B.; Myc, Andrzej; Cao, Zhengyi; Bielinska, Anna; Thomas, Thommey; Baker, James R., Jr.

    2004-01-01

    Flow cytometry is a powerful technique for obtaining quantitative information from fluorescence in cells. Quantitation is achieved by assuring a high degree of uniformity in the optical excitation and detection, generally by using a highly controlled flow such as is obtained via hydrodynamic focusing. In this work, we demonstrate a two-beam, two-channel detection and two-photon excitation flow cytometry (T³FC) system that enables multi-dye analysis to be performed very simply, with greatly relaxed requirements on the fluid flow. Two-photon excitation using a femtosecond near-infrared (NIR) laser has the advantages that it enables simultaneous excitation of multiple dyes and achieves very high signal-to-noise ratio through simplified filtering and fluorescence background reduction. By matching the excitation volume to the size of a cell, single-cell detection is ensured. Labeling of cells by targeted nanoparticles with multiple fluorophores enables normalization of the fluorescence signal and thus ratiometric measurements under nonuniform excitation. Quantitative size measurements can also be done even under conditions of nonuniform flow via a two-beam layout. This innovative detection scheme not only considerably simplifies the fluid flow system and the excitation and collection optics, it opens the way to quantitative cytometry in simple and compact microfluidics systems, or in vivo. Real-time detection of fluorescent microbeads in the vasculature of mouse ear demonstrates the ability to do flow cytometry in vivo. The conditions required to perform quantitative in vivo cytometry on labeled cells will be presented.
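
    A sketch of the ratiometric idea mentioned above: dividing the reporter-dye signal by a reference-dye signal from the same particle cancels the unknown, non-uniform excitation each cell receives. The values are simulated and the variable names are hypothetical.

      # Ratiometric normalization with a reference dye on the same particle;
      # simulated values, hypothetical names.
      import numpy as np

      rng = np.random.default_rng(5)
      excitation = rng.uniform(0.3, 1.0, size=200)           # varies cell to cell
      receptor_level = rng.lognormal(mean=0.0, sigma=0.4, size=200)

      reporter = excitation * receptor_level                 # target-dependent channel
      reference = excitation * 1.0                           # constant label per particle

      ratio = reporter / reference                           # excitation cancels out
      print("correlation with true level:",
            np.corrcoef(ratio, receptor_level)[0, 1])        # ~1.0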

  5. Hyperspectral and differential CARS microscopy for quantitative chemical imaging in human adipocytes

    PubMed Central

    Di Napoli, Claudia; Pope, Iestyn; Masia, Francesco; Watson, Peter; Langbein, Wolfgang; Borri, Paola

    2014-01-01

    In this work, we demonstrate the applicability of coherent anti-Stokes Raman scattering (CARS) micro-spectroscopy for quantitative chemical imaging of saturated and unsaturated lipids in human stem-cell derived adipocytes. We compare dual-frequency/differential CARS (D-CARS), which enables rapid imaging and simple data analysis, with broadband hyperspectral CARS microscopy analyzed using an unsupervised phase-retrieval and factorization method recently developed by us for quantitative chemical image analysis. Measurements were taken in the vibrational fingerprint region (1200–2000/cm) and in the CH stretch region (2600–3300/cm) using a home-built CARS set-up, which enables hyperspectral imaging with 10/cm resolution via spectral focussing from a single broadband 5 fs Ti:Sa laser source. Through a ratiometric analysis, both D-CARS and phase-retrieved hyperspectral CARS determine the concentration of unsaturated lipids with comparable accuracy in the fingerprint region, while in the CH stretch region D-CARS provides only a qualitative contrast owing to its non-linear behavior. When analyzing hyperspectral CARS images using the blind factorization into susceptibilities and concentrations of chemical components recently demonstrated by us, we are able to determine vol:vol concentrations of different lipid components and spatially resolve inhomogeneities in lipid composition with superior accuracy compared to state-of-the-art ratiometric methods. PMID:24877002

  6. Image-derived arterial input function for quantitative fluorescence imaging of receptor-drug binding in vivo

    PubMed Central

    Elliott, Jonathan T.; Samkoe, Kimberley S.; Davis, Scott C.; Gunn, Jason R.; Paulsen, Keith D.; Roberts, David W.; Pogue, Brian W.

    2017-01-01

    Receptor concentration imaging (RCI) with targeted-untargeted optical dye pairs has enabled in vivo immunohistochemistry analysis in preclinical subcutaneous tumors. Successful application of RCI to fluorescence guided resection (FGR), so that quantitative molecular imaging of tumor-specific receptors could be performed in situ, would have a high impact. However, assumptions of pharmacokinetics, permeability and retention, as well as the lack of a suitable reference region limit the potential for RCI in human neurosurgery. In this study, an arterial input graphic analysis (AIGA) method is presented which is enabled by independent component analysis (ICA). The percent difference in arterial concentration between the image-derived arterial input function (AIFICA) and that obtained by an invasive method (ICACAR) was 2.0 ± 2.7% during the first hour of circulation of a targeted-untargeted dye pair in mice. Estimates of distribution volume and receptor concentration in tumor bearing mice (n = 5) recovered using the AIGA technique did not differ significantly from values obtained using invasive AIF measurements (p=0.12). The AIGA method, enabled by the subject-specific AIFICA, was also applied in a rat orthotopic model of U-251 glioblastoma to obtain the first reported receptor concentration and distribution volume maps during open craniotomy. PMID:26349671

  7. Automated classification of cell morphology by coherence-controlled holographic microscopy

    NASA Astrophysics Data System (ADS)

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy enabling quantitative phase imaging for the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be a valuable aid in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity.
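
    A minimal sketch of the comparison described above, assuming morphometric and quantitative-phase features have already been extracted into per-cell feature matrices. A scikit-learn random forest stands in for the supervised classifiers used in the study, and the data below are synthetic.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def compare_feature_sets(X_morpho, X_phase, y, cv=5):
        """Cross-validated accuracy for morphometric-only vs. phase-augmented features."""
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        acc_morpho = cross_val_score(clf, X_morpho, y, cv=cv).mean()
        X_both = np.hstack([X_morpho, X_phase])
        acc_both = cross_val_score(clf, X_both, y, cv=cv).mean()
        return acc_morpho, acc_both

    # Synthetic illustration: phase features carry extra class information.
    rng = np.random.default_rng(1)
    y = rng.integers(0, 2, 300)
    X_morpho = rng.normal(size=(300, 5)) + 0.2 * y[:, None]
    X_phase = rng.normal(size=(300, 5)) + 0.8 * y[:, None]
    print(compare_feature_sets(X_morpho, X_phase, y))
    ```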

  8. Automated classification of cell morphology by coherence-controlled holographic microscopy.

    PubMed

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy enabling quantitative phase imaging for the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be a valuable aid in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  9. Visualization and Quantitative Analysis of Crack-Tip Plastic Zone in Pure Nickel

    NASA Astrophysics Data System (ADS)

    Kelton, Randall; Sola, Jalal Fathi; Meletis, Efstathios I.; Huang, Haiying

    2018-05-01

    Changes in surface morphology have long been thought to be associated with crack propagation in metallic materials. We have studied areal surface texture changes around crack tips in an attempt to understand the correlations between surface texture changes and crack growth behavior. Detailed profiling of the fatigue sample surface was carried out at short fatigue intervals. An image processing algorithm was developed to calculate the surface texture changes. Quantitative analysis of the crack-tip plastic zone, crack-arrested sites near triple points, and large surface texture changes associated with crack release from arrested locations was carried out. The results indicate that surface texture imaging enables visualization of the development of plastic deformation around a crack tip. Quantitative analysis of the surface texture changes reveals the effects of local microstructures on the crack growth behavior.

  10. Recent advances in the application of transmission Raman spectroscopy to pharmaceutical analysis.

    PubMed

    Buckley, Kevin; Matousek, Pavel

    2011-06-25

    This article reviews recent advances in transmission Raman spectroscopy and its applications, from the perspective of pharmaceutical analysis. The emerging concepts enable rapid non-invasive volumetric analysis of pharmaceutical formulations and could lead to many important applications in pharmaceutical settings, including quantitative bulk analysis of intact pharmaceutical tablets and capsules in quality and process control. Crown Copyright © 2010. Published by Elsevier B.V. All rights reserved.

  11. Image velocimetry and spectral analysis enable quantitative characterization of larval zebrafish gut motility.

    PubMed

    Ganz, J; Baker, R P; Hamilton, M K; Melancon, E; Diba, P; Eisen, J S; Parthasarathy, R

    2018-05-02

    Normal gut function requires rhythmic and coordinated movements that are affected by developmental processes, physical and chemical stimuli, and many debilitating diseases. The imaging and characterization of gut motility, especially regarding periodic, propagative contractions driving material transport, are therefore critical goals. Previous image analysis approaches have successfully extracted properties related to the temporal frequency of motility modes, but robust measures of contraction magnitude, especially from in vivo image data, remain challenging to obtain. We developed a new image analysis method based on image velocimetry and spectral analysis that reveals temporal characteristics such as frequency and wave propagation speed, while also providing quantitative measures of the amplitude of gut motion. We validate this approach using several challenges to larval zebrafish, imaged with differential interference contrast microscopy. Both acetylcholine exposure and feeding increase frequency and amplitude of motility. Larvae lacking enteric nervous system gut innervation show the same average motility frequency, but reduced and less variable amplitude compared to wild types. Our image analysis approach enables insights into gut dynamics in a wide variety of developmental and physiological contexts and can also be extended to analyze other types of cell movements. © 2018 John Wiley & Sons Ltd.
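
    A hedged sketch of the spectral-analysis step: given a one-dimensional velocity time series from image velocimetry at a fixed position along the gut, a Fourier transform yields the dominant contraction frequency and a corresponding amplitude. This illustrates the general idea, not the published analysis code; the sampling rate and signal are hypothetical.

    ```python
    import numpy as np

    def motility_spectrum(velocity, fs):
        """Return (dominant frequency in Hz, amplitude at that frequency).

        `velocity` is a 1-D time series from image velocimetry sampled at `fs` Hz.
        """
        v = np.asarray(velocity, dtype=float)
        v = v - v.mean()                               # remove the DC component
        spectrum = np.fft.rfft(v)
        freqs = np.fft.rfftfreq(v.size, d=1.0 / fs)
        amplitude = 2.0 * np.abs(spectrum) / v.size    # single-sided amplitude
        peak = np.argmax(amplitude[1:]) + 1            # skip the zero-frequency bin
        return freqs[peak], amplitude[peak]

    # Toy signal: 0.05 Hz contractions of amplitude 3 plus noise, sampled at 2 Hz.
    fs = 2.0
    t = np.arange(0, 600, 1.0 / fs)
    v = 3.0 * np.sin(2 * np.pi * 0.05 * t) + np.random.default_rng(2).normal(0, 0.5, t.size)
    print(motility_spectrum(v, fs))   # approximately (0.05, 3.0)
    ```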

  12. A Data-Processing System for Quantitative Analysis in Speech Production. CLCS Occasional Paper No. 17.

    ERIC Educational Resources Information Center

    Chasaide, Ailbhe Ni; Davis, Eugene

    The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…

  13. Rotorcraft Conceptual Design Environment

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Sinsay, Jeffrey

    2009-01-01

    Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  14. Rotorcraft Conceptual Design Environment

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Sinsay, Jeffrey D.

    2010-01-01

    Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  15. Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.

    PubMed

    Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J

    2018-04-03

    The low limits of detection afforded by second harmonic generation (SHG) microscopy coupled with image analysis algorithms enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here a study of accelerated stability testing on ritonavir, a human immunodeficiency virus (HIV) protease inhibitor, has been conducted. Under the conditions for accelerated stability testing at 50 °C/75%RH and 40 °C/75%RH, ritonavir crystallization kinetics from amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled with image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, which is about 2 orders of magnitude lower than other methods currently available for crystallinity detection in ASDs. The four decade dynamic range of SHG microscopy enabled quantitative modeling with an established Johnson-Mehl-Avrami-Kolmogorov (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
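
    As an illustration (not the authors' code), SHG-derived crystalline fractions over time can be fit with a JMAK expression of the form X(t) = 1 - exp(-(kt)^n) using a standard nonlinear least-squares routine. The time points and fractions below are hypothetical stand-ins for values extracted from SHG images.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def jmak(t, k, n):
        """Johnson-Mehl-Avrami-Kolmogorov crystallized fraction, X(t) = 1 - exp(-(k*t)**n)."""
        return 1.0 - np.exp(-(k * t) ** n)

    # Hypothetical crystalline fractions from SHG image analysis at each time point (hours).
    t_hours = np.array([0.5, 1, 2, 4, 8, 16, 32, 64], dtype=float)
    x_frac = np.array([1e-5, 4e-5, 2e-4, 1e-3, 6e-3, 3e-2, 0.15, 0.55])

    (k_fit, n_fit), _ = curve_fit(jmak, t_hours, x_frac, p0=(0.01, 1.5), bounds=(0, np.inf))
    print(f"rate constant k = {k_fit:.4f} 1/h, Avrami exponent n = {n_fit:.2f}")
    ```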

  16. Quantitative analysis of pork and chicken products by droplet digital PCR.

    PubMed

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited by its amplification efficiency and its reliance on standard curves based on Ct values, which makes it difficult to detect and quantify low-copy-number target DNA, as in some complex mixed meat products. By using the dPCR method, we found that the relationships between raw meat weight and DNA weight, and between DNA weight and DNA copy number, were both close to linear. This enabled us to establish formulae to calculate the raw meat weight based on the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises.
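
    The chained linear relationships described above (raw meat weight to DNA weight, DNA weight to copy number) can be illustrated with a short calibration-and-inversion sketch. The calibration values below are hypothetical and merely stand in for species-specific data.

    ```python
    import numpy as np

    # Hypothetical calibration data for one species (e.g. pork).
    meat_mg = np.array([10, 25, 50, 100, 200], dtype=float)        # raw meat input
    dna_ng = np.array([4.1, 10.3, 19.8, 41.0, 79.5])               # extracted DNA
    copies = np.array([820, 2050, 3990, 8150, 16020], dtype=float) # dPCR copy number

    # Fit the two near-linear relationships reported in the abstract.
    dna_per_meat = np.polyfit(meat_mg, dna_ng, 1)     # DNA weight vs. meat weight
    copies_per_dna = np.polyfit(dna_ng, copies, 1)    # copy number vs. DNA weight

    def meat_from_copies(measured_copies):
        """Invert the chained linear fits: copies -> DNA weight -> raw meat weight."""
        dna_est = (measured_copies - copies_per_dna[1]) / copies_per_dna[0]
        return (dna_est - dna_per_meat[1]) / dna_per_meat[0]

    print(f"{meat_from_copies(6000):.1f} mg of meat estimated from 6000 copies")
    ```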

  17. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  18. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
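
    A minimal sketch of the simplest calibration approach mentioned above (pure external standards): fit a linear calibration of measured intensity against standard concentration and invert it for an unknown sample. Matrix-matched or internal standards would refine this; all isotope names and values below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical external calibration standards for one isotope (e.g. 208Pb).
    std_conc_ppb = np.array([0.0, 1.0, 5.0, 10.0, 50.0])            # known concentrations
    std_counts = np.array([120, 10450, 51800, 103200, 515900.0])    # measured intensities

    slope, intercept = np.polyfit(std_conc_ppb, std_counts, 1)

    def concentration(sample_counts):
        """Convert a measured intensity to concentration using the external calibration."""
        return (sample_counts - intercept) / slope

    print(f"sample: {concentration(76500):.2f} ppb")
    ```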

  19. Quantitative Image Informatics for Cancer Research (QIICR) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.

  20. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    PubMed

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  1. A Novel Quantitative Hemolytic Assay Coupled with Restriction Fragment Length Polymorphisms Analysis Enabled Early Diagnosis of Atypical Hemolytic Uremic Syndrome and Identified Unique Predisposing Mutations in Japan

    PubMed Central

    Yoshida, Yoko; Miyata, Toshiyuki; Matsumoto, Masanori; Shirotani-Ikejima, Hiroko; Uchida, Yumiko; Ohyama, Yoshifumi; Kokubo, Tetsuro; Fujimura, Yoshihiro

    2015-01-01

    For thrombotic microangiopathies (TMAs), the diagnosis of atypical hemolytic uremic syndrome (aHUS) is made by ruling out Shiga toxin-producing Escherichia coli (STEC)-associated HUS and ADAMTS13 activity-deficient thrombotic thrombocytopenic purpura (TTP), often using the exclusion criteria for secondary TMAs. Nowadays, assays for ADAMTS13 activity and evaluation for STEC infection can be performed within a few hours. However, a confident diagnosis of aHUS often requires comprehensive gene analysis of the alternative complement activation pathway, which usually takes at least several weeks. However, predisposing genetic abnormalities are only identified in approximately 70% of aHUS. To facilitate the diagnosis of complement-mediated aHUS, we describe a quantitative hemolytic assay using sheep red blood cells (RBCs) and human citrated plasma, spiked with or without a novel inhibitory anti-complement factor H (CFH) monoclonal antibody. Among 45 aHUS patients in Japan, 24% (11/45) had moderate-to-severe (≥50%) hemolysis, whereas the remaining 76% (34/45) patients had mild or no hemolysis (<50%). The former group is largely attributed to CFH-related abnormalities, and the latter group has C3-p.I1157T mutations (16/34), which were identified by restriction fragment length polymorphism (RFLP) analysis. Thus, a quantitative hemolytic assay coupled with RFLP analysis enabled the early diagnosis of complement-mediated aHUS in 60% (27/45) of patients in Japan within a week of presentation. We hypothesize that this novel quantitative hemolytic assay would be more useful in a Caucasian population, who may have a higher proportion of CFH mutations than Japanese patients. PMID:25951460

  2. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in challenges and differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  3. Attendance at NHS mandatory training sessions.

    PubMed

    Brand, Darren

    2015-02-17

    To identify factors that affect NHS healthcare professionals' attendance at mandatory training sessions. A quantitative approach was used, with a questionnaire sent to 400 randomly selected participants. A total of 122 responses were received, providing a mix of qualitative and quantitative data. Quantitative data were analysed using statistical methods. Open-ended responses were reviewed using thematic analysis. Clinical staff value mandatory training sessions highly. They are aware of the requirement to keep practice up-to-date and ensure patient safety remains a priority. However, changes to the delivery format of mandatory training sessions are required to enable staff to participate more easily, as staff are often unable to attend. The delivery of mandatory training should move from classroom-based sessions into the clinical area to maximise participation. Delivery should be assisted by local 'experts' who are able to customise course content to meet local requirements and the requirements of different staff groups. Improved arrangements to provide staff cover, for those attending training, would enable more staff to attend training sessions.

  4. Fluorescence-labeled methylation-sensitive amplified fragment length polymorphism (FL-MS-AFLP) analysis for quantitative determination of DNA methylation and demethylation status.

    PubMed

    Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko

    2008-04-01

    The PCR-based DNA fingerprinting method called the methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of the differences in methylation level of blood DNA of gastric cancer patients and evaluation of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.

  5. Development of an exposure measurement database on five lung carcinogens (ExpoSYN) for quantitative retrospective occupational exposure assessment.

    PubMed

    Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2012-01-01

    SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.

  6. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating environmental quality (using MAC and MAL values) does not enable complete and objective assessment of working conditions in the construction industry because of multiple confounding elements. A solution to this complicated problem, which involves the analysis of various correlated elements of the system "human--working conditions--environment", may be supported by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment enables the whole situation to be seen and the points at risk to be revealed.

  7. GeLC-MRM quantitation of mutant KRAS oncoprotein in complex biological samples.

    PubMed

    Halvey, Patrick J; Ferrone, Cristina R; Liebler, Daniel C

    2012-07-06

    Tumor-derived mutant KRAS (v-Ki-ras-2 Kirsten rat sarcoma viral oncogene) oncoprotein is a critical driver of cancer phenotypes and a potential biomarker for many epithelial cancers. Targeted mass spectrometry analysis by multiple reaction monitoring (MRM) enables selective detection and quantitation of wild-type and mutant KRAS proteins in complex biological samples. A recently described immunoprecipitation approach (Proc. Natl. Acad. Sci. 2011, 108, 2444-2449) can be used to enrich KRAS for MRM analysis, but requires large protein inputs (2-4 mg). Here, we describe sodium dodecyl sulfate-polyacrylamide gel electrophoresis-based enrichment of KRAS in a low molecular weight (20-25 kDa) protein fraction prior to MRM analysis (GeLC-MRM). This approach reduces background proteome complexity, thus allowing mutant KRAS to be reliably quantified in low protein inputs (5-50 μg). GeLC-MRM detected KRAS mutant variants (G12D, G13D, G12V, G12S) in a panel of cancer cell lines. GeLC-MRM analysis of wild-type and mutant KRAS was linear with respect to protein input and showed low variability across process replicates (CV = 14%). Concomitant analysis of a peptide from the highly similar HRAS and NRAS proteins enabled correction of KRAS-targeted measurements for contributions from these other proteins. KRAS peptides were also quantified in fluid from benign pancreatic cysts and pancreatic cancers at concentrations from 0.08 to 1.1 fmol/μg protein. GeLC-MRM provides a robust, sensitive approach to quantitation of mutant proteins in complex biological samples.

  8. Apparatus enables automatic microanalysis of body fluids

    NASA Technical Reports Server (NTRS)

    Soffen, G. A.; Stuart, J. L.

    1966-01-01

    Apparatus will automatically and quantitatively determine body fluid constituents which are amenable to analysis by fluorometry or colorimetry. The results of the tests are displayed as percentages of full scale deflection on a strip-chart recorder. The apparatus can also be adapted for microanalysis of various other fluids.

  9. The Case for Open Source Software: The Interactional Discourse Lab

    ERIC Educational Resources Information Center

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  10. The application of time series models to cloud field morphology analysis

    NASA Technical Reports Server (NTRS)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.
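
    As an illustrative sketch (not the original implementation), a low-order ARMA model can be fit to a single image scan line treated as a time series. The statsmodels ARIMA class (with d = 0) is assumed here as a stand-in for the Box-Jenkins procedure described, and the scan line is synthetic.

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    def fit_scanline_arma(scanline, order=(2, 0, 1)):
        """Fit an ARMA(p, q) model (ARIMA with d = 0) to one image row treated as a series."""
        result = ARIMA(np.asarray(scanline, dtype=float), order=order).fit()
        return result.params, result.aic

    # Synthetic scan line with short-range correlation standing in for cloud texture.
    rng = np.random.default_rng(4)
    noise = rng.normal(size=512)
    scanline = np.convolve(noise, np.ones(5) / 5, mode="same") + 0.5
    params, aic = fit_scanline_arma(scanline)
    print(params, aic)
    ```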

  11. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  12. Ranking Quantitative Resistance to Septoria tritici Blotch in Elite Wheat Cultivars Using Automated Image Analysis.

    PubMed

    Karisto, Petteri; Hund, Andreas; Yu, Kang; Anderegg, Jonas; Walter, Achim; Mascher, Fabio; McDonald, Bruce A; Mikaberidze, Alexey

    2018-05-01

    Quantitative resistance is likely to be more durable than major gene resistance for controlling Septoria tritici blotch (STB) on wheat. Earlier studies hypothesized that resistance affecting the degree of host damage, as measured by the percentage of leaf area covered by STB lesions, is distinct from resistance that affects pathogen reproduction, as measured by the density of pycnidia produced within lesions. We tested this hypothesis using a collection of 335 elite European winter wheat cultivars that was naturally infected by a diverse population of Zymoseptoria tritici in a replicated field experiment. We used automated image analysis of 21,420 scanned wheat leaves to obtain quantitative measures of conditional STB intensity that were precise, objective, and reproducible. These measures allowed us to explicitly separate resistance affecting host damage from resistance affecting pathogen reproduction, enabling us to confirm that these resistance traits are largely independent. The cultivar rankings based on host damage were different from the rankings based on pathogen reproduction, indicating that the two forms of resistance should be considered separately in breeding programs aiming to increase STB resistance. We hypothesize that these different forms of resistance are under separate genetic control, enabling them to be recombined to form new cultivars that are highly resistant to STB. We found a significant correlation between rankings based on automated image analysis and rankings based on traditional visual scoring, suggesting that image analysis can complement conventional measurements of STB resistance, based largely on host damage, while enabling a much more precise measure of pathogen reproduction. We showed that measures of pathogen reproduction early in the growing season were the best predictors of host damage late in the growing season, illustrating the importance of breeding for resistance that reduces pathogen reproduction in order to minimize yield losses caused by STB. These data can already be used by breeding programs to choose wheat cultivars that are broadly resistant to naturally diverse Z. tritici populations according to the different classes of resistance.
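
    The ranking comparisons described above reduce to rank correlations between per-cultivar measures; a minimal sketch using a Spearman correlation on hypothetical per-cultivar host-damage and pycnidia-density values follows (illustrative only, not the study's analysis code).

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def compare_rankings(host_damage, pycnidia_density):
        """Spearman rank correlation between two per-cultivar resistance measures.

        A low correlation supports treating damage-based and reproduction-based
        resistance as largely independent traits.
        """
        rho, p_value = spearmanr(host_damage, pycnidia_density)
        return rho, p_value

    # Hypothetical per-cultivar means (percent leaf area covered by lesions,
    # pycnidia per cm^2 of lesion), weakly related by construction.
    rng = np.random.default_rng(3)
    damage = rng.uniform(5, 60, 50)
    pycnidia = 0.3 * damage + rng.normal(0, 15, 50)
    print(compare_rankings(damage, pycnidia))
    ```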

  13. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.

  14. Dominant Epistasis Between Two Quantitative Trait Loci Governing Sporulation Efficiency in Yeast Saccharomyces cerevisiae

    PubMed Central

    Bergman, Juraj; Mitrikeski, Petar T.

    2015-01-01

    Sporulation efficiency in the yeast Saccharomyces cerevisiae is a well-established model for studying quantitative traits. A variety of genes and nucleotides causing different sporulation efficiencies in laboratory, as well as in wild strains, has already been extensively characterised (mainly by reciprocal hemizygosity analysis and nucleotide exchange methods). We applied a different strategy in order to analyze the variation in sporulation efficiency of laboratory yeast strains. Coupling classical quantitative genetic analysis with simulations of phenotypic distributions (a method we call phenotype modelling) enabled us to obtain a detailed picture of the quantitative trait loci (QTLs) relationships underlying the phenotypic variation of this trait. Using this approach, we were able to uncover a dominant epistatic inheritance of loci governing the phenotype. Moreover, a molecular analysis of known causative quantitative trait genes and nucleotides allowed for the detection of novel alleles, potentially responsible for the observed phenotypic variation. Based on the molecular data, we hypothesise that the observed dominant epistatic relationship could be caused by the interaction of multiple quantitative trait nucleotides distributed across a 60-kb QTL region located on chromosome XIV and the RME1 locus on chromosome VII. Furthermore, we propose a model of molecular pathways which possibly underlie the phenotypic variation of this trait. PMID:27904371

  15. Garlic (Allium sativum L.) fertility: transcriptome and proteome analyses provide insight into flower and pollen development

    PubMed Central

    Shemesh-Mayer, Einat; Ben-Michael, Tomer; Rotem, Neta; Rabinowitch, Haim D.; Doron-Faigenboim, Adi; Kosmala, Arkadiusz; Perlikowski, Dawid; Sherman, Amir; Kamenetsky, Rina

    2015-01-01

    Commercial cultivars of garlic, a popular condiment, are sterile, making genetic studies and breeding of this plant challenging. However, recent fertility restoration has enabled advanced physiological and genetic research and hybridization in this important crop. Morphophysiological studies, combined with transcriptome and proteome analyses and quantitative PCR validation, enabled the identification of genes and specific processes involved in gametogenesis in fertile and male-sterile garlic genotypes. Both genotypes exhibit normal meiosis at early stages of anther development, but in the male-sterile plants, tapetal hypertrophy after microspore release leads to pollen degeneration. Transcriptome analysis and global gene-expression profiling showed that >16,000 genes are differentially expressed in the fertile vs. male-sterile developing flowers. Proteome analysis and quantitative comparison of 2D-gel protein maps revealed 36 significantly different protein spots, 9 of which were present only in the male-sterile genotype. Bioinformatic and quantitative PCR validation of 10 candidate genes exhibited significant expression differences between male-sterile and fertile flowers. A comparison of morphophysiological and molecular traits of fertile and male-sterile garlic flowers suggests that respiratory restrictions and/or non-regulated programmed cell death of the tapetum can lead to energy deficiency and consequent pollen abortion. Potential molecular markers for male fertility and sterility in garlic are proposed. PMID:25972879

  16. RAPID MEASUREMENT OF AQUEOUS HYDROXYL RADICAL CONCENTRATIONS IN STEADY-STATE HO· FLUX SYSTEMS

    EPA Science Inventory

    The spin-trap compound α-(4-pyridyl-1-oxide)-N-tert-butyl-nitrone (4-POBN) is utilized for the detection and quantitation of the hydroxyl radical (HO·) in aqueous solution. Capillary electrophoresis enables rapid analysis of the probe compound. The thermally unstable HO· radical ...

  17. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are keys to understand complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurement. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those methods in single cell growth studies. Comparing with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
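
    As a rough illustration of quantitative epistasis scoring (the paper's statistical methods are more elaborate), one common convention measures the deviation of the double-perturbation phenotype from a multiplicative expectation built from the single perturbations. The measurements and effect sizes below are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def epistasis_score(wt, single_a, single_b, double_ab):
        """Deviation of the double perturbation from a multiplicative expectation.

        Phenotypes (e.g. body length) are normalized to the wild-type mean; the
        expected double-perturbation value is the product of the two single
        relative phenotypes. Returns (epsilon, p-value of a one-sample t-test of
        the double-perturbation values against the expectation).
        """
        wt, a, b, ab = (np.asarray(x, dtype=float) for x in (wt, single_a, single_b, double_ab))
        rel_a, rel_b, rel_ab = a / wt.mean(), b / wt.mean(), ab / wt.mean()
        expected = rel_a.mean() * rel_b.mean()
        epsilon = rel_ab.mean() - expected
        t_stat, p_value = stats.ttest_1samp(rel_ab, popmean=expected)
        return epsilon, p_value

    # Hypothetical body-length measurements (mm) for wild type, two single
    # perturbations, and the double perturbation.
    rng = np.random.default_rng(5)
    wt = rng.normal(1.00, 0.05, 30)
    a = rng.normal(0.90, 0.05, 30)
    b = rng.normal(0.85, 0.05, 30)
    ab = rng.normal(0.60, 0.05, 30)   # stronger than 0.90 * 0.85 -> negative epistasis
    print(epistasis_score(wt, a, b, ab))
    ```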

  18. Radiomic analysis in prediction of Human Papilloma Virus status.

    PubMed

    Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu

    2017-12-01

    Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status is tested through an invasive laboratory test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. The quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach analyzing CT images to predict HPV status. Various radiomics features were extracted from CT scans, and analyzed using statistical feature selection and prediction methods. Our approach ranked the highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis on the most relevant radiomic features distinguishing HPV positive and negative subjects suggested that HPV positive patients usually have smaller and simpler tumors.
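
    A minimal sketch of the generic radiomics workflow described above (feature matrix, feature selection, prediction), using scikit-learn as an assumed stand-in for the authors' statistical pipeline; the feature matrix and labels are synthetic.

    ```python
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def hpv_status_pipeline(n_features=20):
        """Feature selection followed by a regularized linear classifier."""
        return make_pipeline(
            StandardScaler(),
            SelectKBest(f_classif, k=n_features),
            LogisticRegression(max_iter=1000),
        )

    # Synthetic stand-in for a radiomic feature matrix (patients x features)
    # and binary HPV status labels; a few features are made informative.
    rng = np.random.default_rng(6)
    X = rng.normal(size=(120, 200))
    y = rng.integers(0, 2, 120)
    X[:, :10] += 0.8 * y[:, None]
    print(cross_val_score(hpv_status_pipeline(), X, y, cv=5, scoring="roc_auc").mean())
    ```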

  19. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.

  20. Apricot DNA as an indicator for persipan: detection and quantitation in marzipan using ligation-dependent probe amplification.

    PubMed

    Luber, Florian; Demmel, Anja; Hosken, Anne; Busch, Ulrich; Engel, Karl-Heinz

    2012-06-13

    The confectionery ingredient marzipan is exclusively prepared from almond kernels and sugar. The potential use of apricot kernels, so-called persipan, is an important issue for the quality assessment of marzipan. Therefore, a ligation-dependent probe amplification (LPA) assay was developed that enables a specific and sensitive detection of apricot DNA, as an indicator for the presence of persipan. The limit of detection was determined to be 0.1% persipan in marzipan. The suitability of the method was confirmed by the analysis of 20 commercially available food samples. The integration of a Prunus-specific probe in the LPA assay as a reference allowed for the relative quantitation of persipan in marzipan. The limit of quantitation was determined to be 0.5% persipan in marzipan. The analysis of two self-prepared mixtures of marzipan and persipan demonstrated the applicability of the quantitation method at concentration levels of practical relevance for quality control.

  1. Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood

    NASA Astrophysics Data System (ADS)

    Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver

    2016-09-01

    Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.

  2. SPECHT - single-stage phosphopeptide enrichment and stable-isotope chemical tagging: quantitative phosphoproteomics of insulin action in muscle.

    PubMed

    Kettenbach, Arminja N; Sano, Hiroyuki; Keller, Susanna R; Lienhard, Gustav E; Gerber, Scott A

    2015-01-30

    The study of cellular signaling remains a significant challenge for translational and clinical research. In particular, robust and accurate methods for quantitative phosphoproteomics in tissues and tumors represent significant hurdles for such efforts. In the present work, we design, implement and validate a method for single-stage phosphopeptide enrichment and stable isotope chemical tagging, or SPECHT, that enables the use of iTRAQ, TMT and/or reductive dimethyl-labeling strategies to be applied to phosphoproteomics experiments performed on primary tissue. We develop and validate our approach using reductive dimethyl-labeling and HeLa cells in culture, and find these results indistinguishable from data generated from more traditional SILAC-labeled HeLa cells mixed at the cell level. We apply the SPECHT approach to the quantitative analysis of insulin signaling in a murine myotube cell line and muscle tissue, identify known as well as new phosphorylation events, and validate these phosphorylation sites using phospho-specific antibodies. Taken together, our work validates chemical tagging post-single-stage phosphoenrichment as a general strategy for studying cellular signaling in primary tissues. Through the use of a quantitatively reproducible, proteome-wide phosphopeptide enrichment strategy, we demonstrated the feasibility of post-phosphopeptide purification chemical labeling and tagging as an enabling approach for quantitative phosphoproteomics of primary tissues. Using reductive dimethyl labeling as a generalized chemical tagging strategy, we compared the performance of post-phosphopeptide purification chemical tagging to the well established community standard, SILAC, in insulin-stimulated tissue culture cells. We then extended our method to the analysis of low-dose insulin signaling in murine muscle tissue, and report on the analytical and biological significance of our results. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  4. Populational analysis of suspended microtissues for high-throughput, multiplexed 3D tissue engineering

    PubMed Central

    Chen, Alice A.; Underhill, Gregory H.; Bhatia, Sangeeta N.

    2014-01-01

    Three-dimensional (3D) tissue models have significantly improved our understanding of structure/function relationships and promise to lead to new advances in regenerative medicine. However, despite the expanding diversity of 3D tissue fabrication methods, approaches for functional assessment have been relatively limited. Here, we describe the fabrication of microtissue (μ-tissue) suspensions and their quantitative evaluation with techniques capable of analyzing large sample numbers and performing multiplexed parallel analysis. We applied this platform to 3D μ-tissues representing multiple stages of liver development and disease including: embryonic stem cells, bipotential hepatic progenitors, mature hepatocytes, and hepatoma cells photoencapsulated in polyethylene glycol hydrogels. Multiparametric μ-tissue cytometry enabled quantitation of fluorescent reporter expression within populations of intact μ-tissues (n ≥ 10^2-10^3) and sorting-based enrichment of subsets for subsequent studies. Further, 3D μ-tissues could be implanted in vivo, respond to systemic stimuli, retrieved and quantitatively assessed. In order to facilitate multiplexed ‘pooled’ experimentation, fluorescent labeling strategies were developed and utilized to investigate the impact of μ-tissue composition and exposure to soluble factors. In particular, examination of drug/gene interactions on collections of 3D hepatoma μ-tissues indicated synergistic influence of doxorubicin and knockdown of the anti-apoptotic gene BCL-XL. Collectively, these studies highlight the broad utility of μ-tissue suspensions as an enabling approach for high n, populational analysis of 3D tissue biology in vitro and in vivo. PMID:20820630

  5. A Semi-Automatic Method for Image Analysis of Edge Dynamics in Living Cells

    PubMed Central

    Huang, Lawrence; Helmke, Brian P.

    2011-01-01

    Spatial asymmetry of actin edge ruffling contributes to the process of cell polarization and directional migration, but mechanisms by which external cues control actin polymerization near cell edges remain unclear. We designed a quantitative image analysis strategy to measure the spatiotemporal distribution of actin edge ruffling. Time-lapse images of endothelial cells (ECs) expressing mRFP-actin were segmented using an active contour method. In intensity line profiles oriented normal to the cell edge, peak detection identified the angular distribution of polymerized actin within 1 µm of the cell edge, which was localized to lamellipodia and edge ruffles. Edge features associated with filopodia and peripheral stress fibers were removed. Circular statistical analysis enabled detection of cell polarity, indicated by a unimodal distribution of edge ruffles. To demonstrate the approach, we detected a rapid, nondirectional increase in edge ruffling in serum-stimulated ECs and a change in constitutive ruffling orientation in quiescent, nonpolarized ECs. Error analysis using simulated test images demonstrates robustness of the method to variations in image noise levels, edge ruffle arc length, and edge intensity gradient. These quantitative measurements of edge ruffling dynamics enable investigation at the cellular length scale of the underlying molecular mechanisms regulating actin assembly and cell polarization. PMID:21643526
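
    The circular-statistics step can be illustrated with a short sketch: given the angular positions of detected edge ruffles, the circular mean and resultant vector length indicate whether ruffling is concentrated in one direction (a polarized cell). This is a generic implementation, not the authors' code; the angle data are simulated.

    ```python
    import numpy as np

    def ruffle_polarity(angles_rad):
        """Circular mean direction and resultant vector length of edge-ruffle angles.

        A resultant length near 1 indicates ruffles concentrated around one
        direction (a polarized cell); a value near 0 indicates no preference.
        """
        angles = np.asarray(angles_rad, dtype=float)
        c, s = np.cos(angles).mean(), np.sin(angles).mean()
        mean_direction = np.arctan2(s, c)
        resultant_length = np.hypot(c, s)
        return mean_direction, resultant_length

    # Polarized example: ruffle angles clustered around 45 degrees vs. uniform angles.
    rng = np.random.default_rng(7)
    polarized = np.deg2rad(45) + rng.normal(0, 0.3, 200)
    uniform = rng.uniform(0, 2 * np.pi, 200)
    print(ruffle_polarity(polarized))   # resultant length close to 1
    print(ruffle_polarity(uniform))     # resultant length close to 0
    ```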

  6. A Simple and Computationally Efficient Approach to Multifactor Dimensionality Reduction Analysis of Gene-Gene Interactions for Quantitative Traits

    PubMed Central

    Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane

    2013-01-01

    We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
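
    A rough sketch of the QMDR-style scoring idea, under the simplifying assumption that each two-SNP genotype cell is labeled high or low by comparing its mean trait value with the grand mean, and the pooled groups are then compared with a t-test; the genotypes and trait values are simulated, and this is not the published implementation.

    ```python
    import numpy as np
    from scipy import stats

    def qmdr_score(geno_a, geno_b, trait):
        """T-statistic score for a two-SNP model, in the spirit of QMDR.

        Each (geno_a, geno_b) cell is labeled high or low by comparing its mean
        trait value with the grand mean; samples pooled by label are then
        compared with a two-sample t-test.
        """
        geno_a, geno_b, trait = map(np.asarray, (geno_a, geno_b, trait))
        grand_mean = trait.mean()
        high = np.zeros(trait.size, dtype=bool)
        for a in np.unique(geno_a):
            for b in np.unique(geno_b):
                cell = (geno_a == a) & (geno_b == b)
                if cell.any() and trait[cell].mean() > grand_mean:
                    high[cell] = True
        t_stat, _ = stats.ttest_ind(trait[high], trait[~high], equal_var=False)
        return abs(t_stat)

    # Synthetic example with an interaction effect between two SNPs (0/1/2 coded).
    rng = np.random.default_rng(8)
    snp1, snp2 = rng.integers(0, 3, 500), rng.integers(0, 3, 500)
    trait = rng.normal(size=500) + 0.8 * ((snp1 == 2) & (snp2 == 2))
    print(qmdr_score(snp1, snp2, trait))
    ```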

  7. Quantitative and Comparative Profiling of Protease Substrates through a Genetically Encoded Multifunctional Photocrosslinker.

    PubMed

    He, Dan; Xie, Xiao; Yang, Fan; Zhang, Heng; Su, Haomiao; Ge, Yun; Song, Haiping; Chen, Peng R

    2017-11-13

    A genetically encoded, multifunctional photocrosslinker was developed for quantitative and comparative proteomics. By bearing a bioorthogonal handle and a releasable linker in addition to its photoaffinity warhead, this probe enables the enrichment of transient and low-abundance prey proteins after intracellular photocrosslinking and prey-bait separation, which can be subject to stable isotope dimethyl labeling and mass spectrometry analysis. This quantitative strategy (termed isoCAPP) allowed a comparative proteomic approach to be adopted to identify the proteolytic substrates of an E. coli protease-chaperone dual machinery DegP. Two newly identified substrates were subsequently confirmed by proteolysis experiments. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  9. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  11. Mapping and QTL analysis of drought tolerance in a spring wheat population using AFLP and DArt markers

    USDA-ARS?s Scientific Manuscript database

    Water availability is commonly the most limiting factor to crop production. This study was conducted to map quantitative trait loci (QTL) involved in drought tolerance in wheat (Triticum aestivum L.) to enable their use for marker assisted selection (MAS) in breeding. Using amplified fragment leng...

  12. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis

    PubMed Central

    Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.

    2015-01-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices – a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health. PMID:26531782
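
    The image-derived indices described above are per-pixel ratios of reflectance at two SWIR bands, followed by histogram analysis of the resulting index image. The sketch below illustrates that calculation for the 1529 nm/1416 nm pair named in the abstract; the cube layout, nearest-band matching rule, and synthetic data are assumptions, not the study's processing pipeline.

```python
import numpy as np

def band_ratio_index(cube, wavelengths, num_nm=1529.0, den_nm=1416.0):
    """Per-pixel reflectance ratio index from a hyperspectral cube (sketch).

    cube        : array of shape (rows, cols, bands) of reflectance values
    wavelengths : 1-D array of band-centre wavelengths in nm
    Returns a 2-D index image; the 1529 nm / 1416 nm pair follows the abstract.
    """
    wavelengths = np.asarray(wavelengths)
    i_num = int(np.argmin(np.abs(wavelengths - num_nm)))   # nearest band to numerator
    i_den = int(np.argmin(np.abs(wavelengths - den_nm)))   # nearest band to denominator
    return cube[..., i_num] / np.clip(cube[..., i_den], 1e-6, None)

# Histogram analysis of the index image, e.g. to compare stressed vs control leaves.
rng = np.random.default_rng(1)
cube = rng.uniform(0.1, 0.6, size=(64, 64, 512))   # synthetic reflectance cube
wl = np.linspace(800.0, 1600.0, 512)               # ~1.56 nm sampling over the SWIR range
index_img = band_ratio_index(cube, wl)
hist, edges = np.histogram(index_img, bins=50)
print(index_img.mean(), hist.argmax())
```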

  13. Quantitative carbon detector for enhanced detection of molecules in foods, pharmaceuticals, cosmetics, flavors, and fuels.

    PubMed

    Beach, Connor A; Krumm, Christoph; Spanjers, Charles S; Maduskar, Saurabh; Jones, Andrew J; Dauenhauer, Paul J

    2016-03-07

    Analysis of trace compounds, such as pesticides and other contaminants, within consumer products, fuels, and the environment requires quantification of increasingly complex mixtures of difficult-to-quantify compounds. Many compounds of interest are non-volatile and exhibit poor response in current gas chromatography and flame ionization systems. Here we show the reaction of trimethylsilylated chemical analytes to methane using a quantitative carbon detector (QCD; the Polyarc™ reactor) within a gas chromatograph (GC), thereby enabling enhanced detection (up to 10×) of highly functionalized compounds including carbohydrates, acids, drugs, flavorants, and pesticides. Analysis of a complex mixture of compounds shows that the GC-QCD method exhibits faster and more accurate analysis of complex mixtures commonly encountered in everyday products and the environment.

  14. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  15. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  16. Second Harmonic Generation Guided Raman Spectroscopy for Sensitive Detection of Polymorph Transitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhury, Azhad U.; Ye, Dong Hye; Song, Zhengtian

    Second harmonic generation (SHG) was integrated with Raman spectroscopy for the analysis of pharmaceutical materials. Particulate formulations of clopidogrel bisulfate were prepared in two crystal forms (Form I and Form II). Image analysis approaches enable automated identification of particles by bright field imaging, followed by classification by SHG. Quantitative SHG microscopy enabled discrimination of crystal form on a per particle basis with 99.95% confidence in a total measurement time of ~10 ms per particle. Complementary measurements by Raman and synchrotron XRD are in excellent agreement with the classifications made by SHG, with measurement times of ~1 min and several seconds per particle, respectively. Coupling these capabilities with at-line monitoring may enable real-time feedback for reaction monitoring during pharmaceutical production to favor the more bioavailable but metastable Form I with limits of detection in the ppm regime.

  17. Conductive carbon tape used for support and mounting of both whole animal and fragile heat-treated tissue sections for MALDI MS imaging and quantitation.

    PubMed

    Goodwin, Richard J A; Nilsson, Anna; Borg, Daniel; Langridge-Smith, Pat R R; Harrison, David J; Mackay, C Logan; Iverson, Suzanne L; Andrén, Per E

    2012-08-30

    Analysis of whole animal tissue sections by MALDI MS imaging (MSI) requires effective sample collection and transfer methods to allow the highest quality of in situ analysis of small or hard-to-dissect tissues. We report on the use of double-sided adhesive conductive carbon tape during whole adult rat tissue sectioning of carboxymethyl cellulose (CMC) embedded animals, with samples mounted onto large format conductive glass and conductive plastic MALDI targets, enabling MSI analysis to be performed on both TOF and FT-ICR MALDI mass spectrometers. We show that mounting does not unduly affect small molecule MSI detection by analyzing tiotropium abundance and distribution in rat lung tissues, with direct on-tissue quantitation achieved. Significantly, we use the adhesive tape to support embedded, delicate, heat-stabilized tissues, enabling sectioning and mounting that maintained tissue integrity for samples that it had previously been impossible to prepare adequately for MSI analysis. The mapping of larger peptidomic molecules was not hindered by tape-mounting the samples, and we demonstrate this by mapping the distribution of PEP-19 in both native and heat-stabilized rat brains. Furthermore, we show that without heat stabilization, PEP-19 degradation fragments can be detected and identified directly by MALDI MSI analysis. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. A hybrid approach identifies metabolic signatures of high-producers for chinese hamster ovary clone selection and process optimization.

    PubMed

    Popp, Oliver; Müller, Dirk; Didzus, Katharina; Paul, Wolfgang; Lipsmeier, Florian; Kirchner, Florian; Niklas, Jens; Mauch, Klaus; Beaucamp, Nicola

    2016-09-01

    In-depth characterization of high-producer cell lines and bioprocesses is vital to ensure robust and consistent production of recombinant therapeutic proteins in high quantity and quality for clinical applications. This requires applying appropriate methods during bioprocess development to enable meaningful characterization of CHO clones and processes. Here, we present a novel hybrid approach for supporting comprehensive characterization of metabolic clone performance. The approach combines metabolite profiling with multivariate data analysis and fluxomics to enable a data-driven mechanistic analysis of key metabolic traits associated with desired cell phenotypes. We applied the methodology to quantify and compare metabolic performance in a set of 10 recombinant CHO-K1 producer clones and a host cell line. The comprehensive characterization enabled us to derive an extended set of clone performance criteria that not only captured growth and product formation, but also incorporated information on intracellular clone physiology and on metabolic changes during the process. These criteria served to establish a quantitative clone ranking and allowed us to identify metabolic differences between high-producing CHO-K1 clones yielding comparably high product titers. Through multivariate data analysis of the combined metabolite and flux data we uncovered common metabolic traits characteristic of high-producer clones in the screening setup. This included high intracellular rates of glutamine synthesis, low cysteine uptake, reduced excretion of aspartate and glutamate, and low intracellular degradation rates of branched-chain amino acids and of histidine. Finally, the above approach was integrated into a workflow that enables standardized high-content selection of CHO producer clones in a high-throughput fashion. In conclusion, the combination of quantitative metabolite profiling, multivariate data analysis, and mechanistic network model simulations can identify metabolic traits characteristic of high-performance clones and enables informed decisions on which clones provide a good match for a particular process platform. The proposed approach also provides a mechanistic link between observed clone phenotype, process setup, and feeding regimes, and thereby offers concrete starting points for subsequent process optimization. Biotechnol. Bioeng. 2016;113: 2005-2019. © 2016 Wiley Periodicals, Inc.

  19. Metabolic Mapping: Quantitative Enzyme Cytochemistry and Histochemistry to Determine the Activity of Dehydrogenases in Cells and Tissues.

    PubMed

    Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F

    2018-05-26

    Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western Blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P) + and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
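
    As a rough illustration of the quantification step, absorbance can be computed per pixel from transmitted-light grey values using the Beer-Lambert relation A = -log10(I/I0). The sketch below assumes an 8-bit transmission image and a blank (flat-field) reference; neither the variable names nor the synthetic data come from the protocol itself.

```python
import numpy as np

def formazan_absorbance(image, blank):
    """Convert a monochromatic transmission image to absorbance (sketch).

    image : grey-value image of the stained section (transmitted intensity I)
    blank : flat-field image without tissue (incident intensity I0)
    Absorbance A = -log10(I / I0) per pixel; the mean A over a region serves
    as a relative measure of dehydrogenase activity.
    """
    ratio = np.clip(image.astype(float) / np.clip(blank.astype(float), 1, None), 1e-6, 1.0)
    return -np.log10(ratio)

# Example with synthetic images: darker pixels (more formazan) give higher absorbance.
rng = np.random.default_rng(2)
blank = np.full((128, 128), 240.0)
section = blank * np.exp(-rng.uniform(0.0, 1.0, blank.shape))  # simulated attenuation
A = formazan_absorbance(section, blank)
print("mean absorbance:", A.mean())
```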

  20. A correlative and quantitative imaging approach enabling characterization of primary cell-cell communication: Case of human CD4+ T cell-macrophage immunological synapses.

    PubMed

    Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie

    2018-05-22

    Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4 + T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4 + T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.

  1. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  2. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    ERIC Educational Resources Information Center

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  3. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  4. Land Cover Change and Remote Sensing in the Classroom: An Exercise to Study Urban Growth

    ERIC Educational Resources Information Center

    Delahunty, Tina; Lewis-Gonzales, Sarah; Phelps, Jack; Sawicki, Ben; Roberts, Charles; Carpenter, Penny

    2012-01-01

    The processes and implications of urban growth are studied in a variety of disciplines as urban growth affects both the physical and human landscape. Remote sensing methods provide ways to visualize and mathematically represent urban growth; and resultant land cover change data enable both quantitative and qualitative analysis. This article helps…

  5. Three-dimensional characterization of pigment dispersion in dried paint films using focused ion beam-scanning electron microscopy.

    PubMed

    Lin, Jui-Ching; Heeschen, William; Reffner, John; Hook, John

    2012-04-01

    The combination of integrated focused ion beam-scanning electron microscope (FIB-SEM) serial sectioning and imaging techniques with image analysis provided quantitative characterization of three-dimensional (3D) pigment dispersion in dried paint films. The focused ion beam in a FIB-SEM dual beam system enables great control in slicing paints, and the sectioning process can be synchronized with SEM imaging providing high quality serial cross-section images for 3D reconstruction. Application of Euclidean distance map and ultimate eroded points image analysis methods can provide quantitative characterization of 3D particle distribution. It is concluded that 3D measurement of binder distribution in paints is effective to characterize the order of pigment dispersion in dried paint films.
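
    The Euclidean distance map and ultimate-eroded-points measurements mentioned above can be approximated with standard image-processing tools. The Python sketch below is an illustration, not the authors' workflow; the binder/pigment mask, the 3×3×3 local-maximum rule, and the synthetic volume are all assumptions.

```python
import numpy as np
from scipy import ndimage

def pigment_spacing_stats(binder_mask):
    """Euclidean distance map (EDM) and ultimate eroded points (UEP) sketch.

    binder_mask : boolean 3-D array, True where binder (non-pigment) is present.
    The EDM gives, for every binder voxel, the distance to the nearest pigment;
    UEPs are local maxima of the EDM and approximate the largest binder pockets,
    so their values characterize how evenly the pigment is dispersed.
    """
    edm = ndimage.distance_transform_edt(binder_mask)
    # Local maxima of the EDM (ultimate eroded points).
    local_max = (edm == ndimage.maximum_filter(edm, size=3)) & (edm > 0)
    return edm, edm[local_max]

# Synthetic example: random spherical pigment particles carved out of a binder volume.
rng = np.random.default_rng(3)
vol = np.ones((64, 64, 64), dtype=bool)                 # start as all binder
zz, yy, xx = np.indices(vol.shape)
for _ in range(30):
    c = rng.integers(5, 59, size=3)
    vol &= (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 > 4**2  # remove a pigment sphere
edm, ueps = pigment_spacing_stats(vol)
print("mean UEP distance (voxels):", ueps.mean())
```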

  6. Experimental analysis of bruises in human volunteers using radiometric depth profiling and diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2015-07-01

    We combine pulsed photothermal radiometry (PPTR) depth profiling with diffuse reflectance spectroscopy (DRS) measurements for a comprehensive analysis of bruise evolution in vivo. While PPTR enables extraction of detailed depth distribution and concentration profiles of selected absorbers (e.g. melanin, hemoglobin), DRS provides information in a wide range of visible wavelengths and thus offers an additional insight into dynamics of the hemoglobin degradation products. Combining the two approaches enables us to quantitatively characterize bruise evolution dynamics. Our results indicate temporal variations of the bruise evolution parameters in the course of bruise self-healing process. The obtained parameter values and trends represent a basis for a future development of an objective technique for bruise age determination.

  7. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    PubMed

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  8. HuMOVE: a low-invasive wearable monitoring platform in sexual medicine.

    PubMed

    Ciuti, Gastone; Nardi, Matteo; Valdastri, Pietro; Menciassi, Arianna; Basile Fasolo, Ciro; Dario, Paolo

    2014-10-01

    To investigate an accelerometer-based wearable system, named the Human Movement (HuMOVE) platform, designed to enable quantitative and continuous measurement of sexual performance with minimal invasiveness and inconvenience for users. Design, implementation, and development of HuMOVE, a wearable platform equipped with an accelerometer sensor for monitoring inertial parameters for sexual performance assessment and diagnosis, were performed. The system enables quantitative measurement of movement parameters during sexual intercourse, meeting the requirements of wearability, data storage, sampling rate, and interfacing methods, which are fundamental for human sexual intercourse performance analysis. HuMOVE was validated through characterization using a controlled experimental test bench and evaluated in a human model during simulated sexual intercourse conditions. HuMOVE proved to be a robust, quantitative monitoring platform and a reliable candidate for sexual performance evaluation and diagnosis. Characterization analysis on the controlled experimental test bench demonstrated an accurate correlation between the HuMOVE system and data from a reference displacement sensor. Experimental tests in the human model during simulated intercourse conditions confirmed the accuracy of the sexual performance evaluation platform and the effectiveness of the selected and derived parameters. The results also met the project expectations in terms of usability and comfort, as evidenced by questionnaires highlighting the low invasiveness and good acceptance of the device. To the best of our knowledge, the HuMOVE platform is the first device for human sexual performance analysis compatible with sexual intercourse; the system has the potential to be a helpful tool for physicians to accurately classify sexual disorders, such as premature or delayed ejaculation. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Improving Attachments of Non-Invasive (Type III) Electronic Data Loggers to Cetaceans

    DTIC Science & Technology

    2015-09-30

    animals in human care will be performed to test and validate this approach. The cadaver trials will enable controlled testing to failure or with both...quantitative metrics and analysis tools to assess the impact of a tag on the animal. Here we will present: 1) the characterization of the mechanical...fine scale motion analysis for swimming animals. 2 APPROACH Our approach is divided into four subtasks: Task 1: Forces and failure modes

  10. The quantitative surface analysis of an antioxidant additive in a lubricant oil matrix by desorption electrospray ionization mass spectrometry

    PubMed Central

    Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S

    2013-01-01

    RATIONALE Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm2 additive on spot with relative standard deviations in the range 3–14%. CONCLUSIONS The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
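
    The internal-standard calibration described in the METHODS and RESULTS sections can be outlined in a few lines: the analyte/internal-standard signal ratio is regressed against the spotted amount, and unknowns are back-calculated from the fit. The sketch below uses synthetic placeholder numbers (not the study's data) and a common 3.3σ/slope limit-of-detection estimate; all names and values are illustrative assumptions.

```python
import numpy as np

# Illustrative eight-point calibration of analyte amount (µg/spot) against the
# analyte/internal-standard intensity ratio. Numbers are synthetic placeholders.
amount = np.array([1, 2, 5, 10, 20, 40, 60, 80], dtype=float)                 # µg per spot
ratio = 0.052 * amount + np.random.default_rng(4).normal(0, 0.02, 8)          # analyte/IS signal

slope, intercept = np.polyfit(amount, ratio, 1)

def quantify(sample_ratio):
    """Back-calculate the analyte amount on a spot from its analyte/IS ratio."""
    return (sample_ratio - intercept) / slope

# A common LOD estimate: 3.3 x (SD of regression residuals) / slope.
residuals = ratio - (slope * amount + intercept)
lod = 3.3 * residuals.std(ddof=2) / slope
print(f"slope={slope:.4f}, LOD≈{lod:.2f} µg/spot, ratio 0.3 -> {quantify(0.3):.1f} µg")
```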

  11. Quantitative nephelometry

    MedlinePlus

    ... this page: //medlineplus.gov/ency/article/003545.htm Quantitative nephelometry test. Quantitative nephelometry is a lab test to quickly and ...

  12. A Targeted Q-PCR-Based Method for Point Mutation Testing by Analyzing Circulating DNA for Cancer Management Care.

    PubMed

    Thierry, Alain R

    2016-01-01

    Circulating cell-free DNA (cfDNA) is a valuable source of tumor material obtainable by a simple blood draw, enabling noninvasive quantitative and qualitative analysis of the tumor genome. cfDNA is released by tumor cells and carries the genetic and epigenetic alterations of the tumor of origin, so its analysis constitutes a promising approach to a noninvasive tumor molecular test for cancer patients. Building on basic research into the origin, structure, fragmentation, and size of cfDNA, we revisited Q-PCR-based methods and developed an allele-specific, blocker-based Q-PCR assay (termed Intplex), the first multiplexed test for cfDNA. Intplex(®) enables the simultaneous determination of five parameters: the total cfDNA concentration, the presence of a previously known point mutation, the mutant (tumor-derived) cfDNA concentration (ctDNA), the proportion of mutant cfDNA, and the cfDNA fragmentation index. Intplex(®) enabled the first clinical validation of ctDNA analysis in oncology by detecting KRAS and BRAF point mutations in mCRC patients and demonstrated that a blood test could replace tumor section analysis for the detection of these mutations. The Intplex(®) test can be adapted to any mutation, gene, or cancer and enables rapid, highly sensitive, cost-effective, and repeatable analysis. Regarding mutation detection on cfDNA, Intplex(®) is limited to known hotspot mutations; it is a "targeted approach." Nevertheless, it detects mutations quantitatively and dynamically and could constitute an attractive noninvasive tool for diagnosis, prognosis, theranostics, therapeutic monitoring, and follow-up of cancer patients, expanding the scope of personalized cancer medicine.
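
    The five parameters reported by the assay lend themselves to a simple summary calculation once the individual Q-PCR concentrations are in hand. The sketch below is illustrative only; the fragmentation-index definition used here (ratio of a longer to a shorter amplicon measurement) and all variable names are assumptions, not the published formulas.

```python
def cfdna_summary(total_cfdna, mutant_cfdna, short_amplicon, long_amplicon):
    """Derive summary parameters from Q-PCR concentrations (illustrative sketch).

    total_cfdna    : total cfDNA concentration (e.g. ng/mL plasma)
    mutant_cfdna   : mutant (tumor-derived) cfDNA concentration, same units
    short_amplicon : concentration measured with a short amplicon
    long_amplicon  : concentration measured with a longer amplicon
    The mutant proportion and a fragmentation index (long/short amplicon ratio,
    an assumed definition) are reported alongside the raw concentrations.
    """
    mutant_fraction = mutant_cfdna / total_cfdna if total_cfdna else float("nan")
    fragmentation_index = long_amplicon / short_amplicon if short_amplicon else float("nan")
    return {
        "total_cfDNA": total_cfdna,
        "ctDNA": mutant_cfdna,
        "mutant_fraction": mutant_fraction,
        "fragmentation_index": fragmentation_index,
    }

print(cfdna_summary(total_cfdna=25.0, mutant_cfdna=1.5,
                    short_amplicon=24.0, long_amplicon=6.0))
```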

  13. Automated reagent-dispensing system for microfluidic cell biology assays.

    PubMed

    Ly, Jimmy; Masterman-Smith, Michael; Ramakrishnan, Ravichandran; Sun, Jing; Kokubun, Brent; van Dam, R Michael

    2013-12-01

    Microscale systems that enable measurements of oncological phenomena at the single-cell level have a great capacity to improve therapeutic strategies and diagnostics. Such measurements can reveal unprecedented insights into cellular heterogeneity and its implications into the progression and treatment of complicated cellular disease processes such as those found in cancer. We describe a novel fluid-delivery platform to interface with low-cost microfluidic chips containing arrays of microchambers. Using multiple pairs of needles to aspirate and dispense reagents, the platform enables automated coating of chambers, loading of cells, and treatment with growth media or other agents (e.g., drugs, fixatives, membrane permeabilizers, washes, stains, etc.). The chips can be quantitatively assayed using standard fluorescence-based immunocytochemistry, microscopy, and image analysis tools, to determine, for example, drug response based on differences in protein expression and/or activation of cellular targets on an individual-cell level. In general, automation of fluid and cell handling increases repeatability, eliminates human error, and enables increased throughput, especially for sophisticated, multistep assays such as multiparameter quantitative immunocytochemistry. We report the design of the automated platform and compare several aspects of its performance to manually-loaded microfluidic chips.

  14. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  15. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT.

    PubMed

    Isola, A A; Schmitt, H; van Stevendaal, U; Begemann, P G; Coulon, P; Boussel, L; Grass, M

    2011-09-21

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.

  16. Using Image Modelling to Teach Newton's Laws with the Ollie Trick

    ERIC Educational Resources Information Center

    Dias, Marco Adriano; Carvalho, Paulo Simeão; Vianna, Deise Miranda

    2016-01-01

    Image modelling is a video-based teaching tool that is a combination of strobe images and video analysis. This tool can enable a qualitative and a quantitative approach to the teaching of physics, in a much more engaging and appealing way than the traditional expositive practice. In a specific scenario shown in this paper, the Ollie trick, we…

  17. A Quantitative Analysis of the Role of Social Networks in Educational Contexts

    ERIC Educational Resources Information Center

    Shokri, Azam; Dafoulas, Georgios

    2016-01-01

    Recent advances in Information Technology (IT) and the advent of Web 2.0 created the path for education to ascertain its potential from this phenomenon. The role of e-learning has transformed completely as Web 2.0 technologies enabled the creation of learning content that is no longer based on textbooks and learning guides, but on manageable,…

  18. Theoretical Framework for Interaction Game Design

    DTIC Science & Technology

    2016-05-19

    modeling. We take a data-driven quantitative approach to understand conversational behaviors by measuring conversational behaviors using advanced sensing...current state of the art, human computing is considered to be a reasonable approach to break through the current limitation. To solicit high quality and...proper resources in conversation to enable smooth and effective interaction. The last technique is about conversation measurement, analysis, and

  19. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells.

    PubMed

    Park, Han Sang; Rinehart, Matthew T; Walzer, Katelyn A; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that heavily relies on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors are highly statistically different between infected and uninfected cells, each descriptor does not enable separation of populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy of up to 99.7% in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity with early trophozoites most often mistaken for late trophozoite or schizont stage and late trophozoite and schizont stage most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis.
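
    The classifier comparison described above can be reproduced in outline with scikit-learn. The sketch below uses synthetic stand-ins for the 23 phase-derived morphological descriptors and standard LDA, logistic regression, and k-nearest-neighbour classifiers; it illustrates the workflow only, not the study's data, features, or tuning.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for 23 phase-derived descriptors per cell (the real study
# extracts them from refocused quantitative phase images of erythrocytes).
rng = np.random.default_rng(5)
n, n_features = 400, 23
X_uninfected = rng.normal(0.0, 1.0, (n, n_features))
X_infected = rng.normal(0.6, 1.0, (n, n_features))   # shifted means mimic infection
X = np.vstack([X_uninfected, X_infected])
y = np.array([0] * n + [1] * n)                      # 0 = uninfected, 1 = infected

classifiers = {
    "LDC": LinearDiscriminantAnalysis(),
    "LR": LogisticRegression(max_iter=1000),
    "NNC": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: cross-validated accuracy {acc:.3f}")
```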

  20. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells

    PubMed Central

    Park, Han Sang; Rinehart, Matthew T.; Walzer, Katelyn A.; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that heavily relies on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors are highly statistically different between infected and uninfected cells, each descriptor does not enable separation of populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy of up to 99.7% in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity with early trophozoites most often mistaken for late trophozoite or schizont stage and late trophozoite and schizont stage most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis. PMID:27636719

  1. NanoDrop Microvolume Quantitation of Nucleic Acids

    PubMed Central

    Desjardins, Philippe; Conklin, Deborah

    2010-01-01

    Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of sample. PMID:21189466
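
    For the direct A260 method, the conversion from a microvolume absorbance reading to concentration follows the standard Beer-Lambert factors (about 50 ng/µL per A260 unit at a 10 mm path for double-stranded DNA, about 40 for RNA). The sketch below normalizes a short-path reading to the conventional 10 mm path before applying the factor; the function and parameter names are illustrative, not the instrument's API.

```python
def nucleic_acid_concentration(a260, path_length_mm, factor=50.0, dilution=1.0):
    """Concentration (ng/µL) from a microvolume A260 reading (sketch).

    a260           : measured absorbance at 260 nm over the instrument's path
    path_length_mm : actual optical path length in mm (short paths extend the range)
    factor         : 50 ng/µL per A260 unit at 10 mm for dsDNA (~40 for RNA)
    The reading is normalized to a 10 mm path before applying the standard factor.
    """
    a260_10mm = a260 * (10.0 / path_length_mm)
    return a260_10mm * factor * dilution

# A 1 mm path reading of 0.25 corresponds to 2.5 at 10 mm, i.e. 125 ng/µL dsDNA.
print(nucleic_acid_concentration(0.25, path_length_mm=1.0))
```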

  2. Single molecule quantitation and sequencing of rare translocations using microfluidic nested digital PCR.

    PubMed

    Shuga, Joe; Zeng, Yong; Novak, Richard; Lan, Qing; Tang, Xiaojiang; Rothman, Nathaniel; Vermeulen, Roel; Li, Laiyu; Hubbard, Alan; Zhang, Luoping; Mathies, Richard A; Smith, Martyn T

    2013-09-01

    Cancers are heterogeneous and genetically unstable. New methods are needed that provide the sensitivity and specificity to query single cells at the genetic loci that drive cancer progression, thereby enabling researchers to study the progression of individual tumors. Here, we report the development and application of a bead-based hemi-nested microfluidic droplet digital PCR (dPCR) technology to achieve 'quantitative' measurement and single-molecule sequencing of somatically acquired carcinogenic translocations at extremely low levels (<10^-6) in healthy subjects. We use this technique in our healthy study population to determine the overall concentration of the t(14;18) translocation, which is strongly associated with follicular lymphoma. The nested dPCR approach improves the detection limit to 1×10^-7 or lower while maintaining the analysis efficiency and specificity. Further, the bead-based dPCR enabled us to isolate and quantify the relative amounts of the various clonal forms of t(14;18) translocation in these subjects, and the single-molecule sensitivity and resolution of dPCR led to the discovery of new clonal forms of t(14;18) that were otherwise masked by the conventional quantitative PCR measurements. In this manner, we created a quantitative map for this carcinogenic mutation in this healthy population and identified the positions on chromosomes 14 and 18 where the vast majority of these t(14;18) events occur.
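
    Digital PCR copy estimates of this kind rest on Poisson statistics: if a fraction p of partitions is positive, the mean number of template copies per partition is λ = -ln(1 - p). The sketch below shows this general dPCR relation; it is not the hemi-nested, bead-based workflow itself, and the droplet volume in the example is an assumed value.

```python
import math

def dpcr_copies(positive, total, partition_volume_nl=None):
    """Poisson-corrected copy estimate for digital PCR (general sketch).

    positive : number of partitions (droplets) scored positive
    total    : total number of partitions analyzed
    Returns the mean copies per partition and the total copies; if a partition
    volume is given, also the concentration in copies per µL.
    """
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per partition
    copies = lam * total
    if partition_volume_nl is None:
        return lam, copies
    conc_per_ul = lam / (partition_volume_nl * 1e-3)   # nL -> µL
    return lam, copies, conc_per_ul

# Example: 38 positives out of 20,000 droplets of 0.85 nL each.
print(dpcr_copies(38, 20000, partition_volume_nl=0.85))
```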

  3. Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.

    PubMed

    Hamlet, Stephen M

    2010-01-01

    The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibody with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10^2 (using rtPCR) to 10^4 (using ELISA) periodontopathogens in dental plaque samples.
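
    Real-time PCR quantitation of this kind typically rests on a standard curve of Ct against log10 copy number, from which unknowns are back-calculated and the amplification efficiency is derived as E = 10^(-1/slope) - 1. The sketch below uses synthetic Ct values for a serial dilution; it illustrates the general approach rather than any specific assay described in the chapter.

```python
import numpy as np

# Standard-curve quantitation sketch for real-time PCR (generic, illustrative).
# Ct values are synthetic placeholders for a serial dilution of a
# periodontopathogen standard (10^2 .. 10^7 copies per reaction).
log_copies = np.arange(2, 8, dtype=float)              # log10 copies per reaction
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])    # synthetic Ct values

slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                 # ~1.0 means 100% efficiency

def copies_from_ct(sample_ct):
    """Back-calculate the copy number in a plaque sample from its Ct value."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.0%}, Ct 28 -> {copies_from_ct(28.0):.0f} copies")
```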

  4. Statistical representative elementary volumes of porous media determined using greyscale analysis of 3D tomograms

    NASA Astrophysics Data System (ADS)

    Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

    2017-09-01

    Digital rock physics carries the dogmatic concept of having to segment volume images for quantitative analysis but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area and diffusive tortuosity, by using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel level porosity estimate using a Gaussian mixture model. A simple model assumption was made that allowed formulating a two point correlation function for surface area estimates using Bayes' theory. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed in binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e. resolution dependence and intersample and intrasample variance. Although we cannot make any claims on the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
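
    One way to turn grey levels into voxel-level porosity with a Gaussian mixture model, as described above, is to treat the posterior probability of the darker (pore) component as the porosity estimate, so partial-volume voxels contribute fractionally instead of being forced to 0 or 1. The sketch below uses scikit-learn's GaussianMixture on a synthetic volume; the two-component assumption and the "darker mean = pore" rule are simplifications, not the authors' exact procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def voxelwise_porosity(volume_grey):
    """Per-voxel porosity from grey levels via a two-component GMM (sketch).

    volume_grey : 3-D array of reconstructed grey values (pore phase darker).
    The posterior probability of the pore component serves as a voxel-level
    porosity estimate, avoiding the information loss of hard segmentation.
    """
    flat = volume_grey.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(flat)
    pore_component = int(np.argmin(gmm.means_.ravel()))     # darker mean = pore
    porosity = gmm.predict_proba(flat)[:, pore_component]
    return porosity.reshape(volume_grey.shape)

# Synthetic chalk-like volume: two grey-level populations with overlapping noise.
rng = np.random.default_rng(6)
labels = rng.random((32, 32, 32)) < 0.3                      # 30% pore voxels
grey = np.where(labels, rng.normal(60, 15, labels.shape), rng.normal(160, 15, labels.shape))
phi = voxelwise_porosity(grey)
print("estimated total porosity:", phi.mean())
```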

  5. Quantitation without Calibration: Response Profile as an Indicator of Target Amount.

    PubMed

    Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V

    2018-06-21

    Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration to correlate a measured signal with a target amount. This step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of the response profile, rather than on an absolute signal value, for assessment of a target's amount. To enable this capability we develop a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable toward a variety of biomarkers such as nucleic acids, proteins, peptides, and others.

  6. Quantitative 3D Analysis of Nuclear Morphology and Heterochromatin Organization from Whole-Mount Plant Tissue Using NucleusJ.

    PubMed

    Desset, Sophie; Poulet, Axel; Tatout, Christophe

    2018-01-01

    Image analysis is a classical way to study nuclear organization. While nuclear organization used to be investigated by colorimetric or fluorescent labeling of DNA or specific nuclear compartments, new methods in microscopy imaging now enable qualitative and quantitative analyses of chromatin pattern and of nuclear size and shape. Several procedures have been developed to prepare samples in order to collect 3D images for the analysis of spatial chromatin organization, but only a few preserve the positional information of the cell within its tissue context. Here, we describe a whole-mount tissue preparation procedure coupled to DNA staining with the PicoGreen ® intercalating agent, suitable for image analysis of the nucleus in living and fixed tissues. 3D image analysis is then performed using NucleusJ, an open source ImageJ plugin, which allows for quantifying variations in nuclear morphology such as nuclear volume, sphericity, elongation, and flatness as well as in heterochromatin content and position with respect to the nuclear periphery.
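
    The morphological parameters listed above can be expressed with standard shape formulas. The sketch below computes sphericity from volume and surface area, and takes elongation and flatness as principal-axis ratios; these are common textbook definitions and may not match NucleusJ's exact formulas, and all names are illustrative.

```python
import math

def shape_descriptors(volume, surface_area, axis_lengths):
    """Nuclear shape descriptors of the kind reported by NucleusJ (sketch).

    volume       : nuclear volume (e.g. µm^3)
    surface_area : nuclear surface area (e.g. µm^2)
    axis_lengths : principal axis lengths (any order)
    Sphericity uses the standard pi^(1/3) * (6V)^(2/3) / A definition;
    elongation and flatness are taken here as a/b and b/c for sorted axes,
    which may differ from the plugin's exact definitions.
    """
    a, b, c = sorted(axis_lengths, reverse=True)
    sphericity = math.pi ** (1.0 / 3.0) * (6.0 * volume) ** (2.0 / 3.0) / surface_area
    return {"sphericity": sphericity, "elongation": a / b, "flatness": b / c}

# A perfect sphere of radius 5 has sphericity 1 and unit axis ratios.
r = 5.0
print(shape_descriptors(4 / 3 * math.pi * r**3, 4 * math.pi * r**2, (2 * r, 2 * r, 2 * r)))
```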

  7. Quantitative allochem compositional analysis of Lochkovian-Pragian boundary sections in the Prague Basin (Czech Republic)

    NASA Astrophysics Data System (ADS)

    Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich

    2017-06-01

    Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate ramp environment of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) of point-counted thin section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed-nature allochem percentages and their centred log-ratio (clr) coordinates were used. Both these approaches allow for distinguishing of lowstand, transgressive and highstand system tracts within the Praha Formation, which show gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of deep-water environment below the storm wave base. Quantitative compositional data also indicate progradative-retrogradative trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition in comparison to the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
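
    The centred log-ratio (clr) coordinates used here are a standard transform for compositional (closed) data: each component is log-ratioed against the geometric mean of the closed composition, clr_i = ln(x_i / g(x)). The sketch below applies it to a single point-count sample; the simple zero-replacement rule and the example categories are assumptions, not the authors' procedure.

```python
import numpy as np

def clr(composition, zero_replacement=0.5):
    """Centred log-ratio (clr) coordinates for point-count compositions (sketch).

    composition      : 1-D array of allochem counts or percentages for one sample
    zero_replacement : small count substituted for zero categories before closure
                       (a simple convention; formal zero-replacement methods exist)
    clr_i = ln(x_i / g(x)), with g(x) the geometric mean of the closed composition.
    """
    x = np.asarray(composition, dtype=float)
    x = np.where(x == 0, zero_replacement, x)
    x = x / x.sum()                                   # closure to proportions
    geometric_mean = np.exp(np.mean(np.log(x)))
    return np.log(x / geometric_mean)

# Example: crinoid / dacryoconarid / peloid / other point counts for one thin section.
counts = np.array([220, 35, 60, 0])
print(clr(counts))                                    # clr coordinates sum to ~0
```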

  8. DIGE Analysis of Human Tissues.

    PubMed

    Gelfi, Cecilia; Capitanio, Daniele

    2018-01-01

    Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed to not interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, once a proper labeling protocol is followed. The two-dye or three-dye systems can be adopted and their choice depends on specific applications. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generates robust quantitative results making data validation by independent technologies successful.

  9. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  10. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.

  11. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
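    For context, the relation below is the internal-calibrant equation commonly used in qHNMR to convert integrals into mass fractions; it is quoted here as general background rather than as the exact expression used in the study.

```latex
% Standard qHNMR internal-calibrant relation (background, not from the paper):
% I = signal integral, N = number of protons in the integrated signal,
% M = molar mass, m = weighed mass, P = purity of the calibrant.
\[
  w_{\mathrm{analyte}} \;=\;
  \frac{I_{\mathrm{analyte}}}{I_{\mathrm{cal}}}\cdot
  \frac{N_{\mathrm{cal}}}{N_{\mathrm{analyte}}}\cdot
  \frac{M_{\mathrm{analyte}}}{M_{\mathrm{cal}}}\cdot
  \frac{m_{\mathrm{cal}}}{m_{\mathrm{sample}}}\cdot P_{\mathrm{cal}}
\]
```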

  12. Enabling Analysis of Big, Thick, Long, and Wide Data: Data Management for the Analysis of a Large Longitudinal and Cross-National Narrative Data Set.

    PubMed

    Winskell, Kate; Singleton, Robyn; Sabben, Gaelle

    2018-03-01

    Distinctive longitudinal narrative data, collected during a critical 18-year period in the history of the HIV epidemic, offer a unique opportunity to examine how young Africans are making sense of evolving developments in HIV prevention and treatment. More than 200,000 young people from across sub-Saharan Africa took part in HIV-themed scriptwriting contests held at eight discrete time points between 1997 and 2014, creating more than 75,000 narratives. This article describes the data reduction and management strategies developed for our cross-national and longitudinal study of these qualitative data. The study aims to inform HIV communication practice by identifying cultural meanings and contextual factors that inform sexual behaviors and social practices, and also to help increase understanding of processes of sociocultural change. We describe our sampling strategies and our triangulating methodologies, combining in-depth narrative analysis, thematic qualitative analysis, and quantitative analysis, which are designed to enable systematic comparison without sacrificing ethnographic richness.

  13. Network analysis for the visualization and analysis of qualitative data.

    PubMed

    Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D

    2018-03-01

    We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
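    The sketch below illustrates the general idea of turning a chronologically ordered code sequence into a directed network and computing simple network indices. It is an assumed workflow using networkx with invented code names, not the authors' implementation.

```python
# Minimal sketch: consecutive qualitative codes become weighted directed edges,
# and network indices are computed for use in downstream statistics.
# Hypothetical codes; not the authors' software or data.
import networkx as nx

codes = ["stress", "coping", "support", "coping", "insight"]   # chronological order

G = nx.DiGraph()
for a, b in zip(codes, codes[1:]):
    w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1        # count transitions
    G.add_edge(a, b, weight=w)

print(nx.density(G))            # one possible network index per transcript
print(nx.degree_centrality(G))  # per-code centrality
```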

  14. Recent advances in mass spectrometry-based proteomics of gastric cancer.

    PubMed

    Kang, Changwon; Lee, Yejin; Lee, J Eugene

    2016-10-07

    The last decade has witnessed remarkable technological advances in mass spectrometry-based proteomics. The development of proteomics techniques has enabled the reliable analysis of complex proteomes, leading to the identification and quantification of thousands of proteins in gastric cancer cells, tissues, and sera. This quantitative information has been used to profile the anomalies in gastric cancer and provide insights into the pathogenic mechanism of the disease. In this review, we mainly focus on the advances in mass spectrometry and quantitative proteomics that were achieved in the last five years and how these up-and-coming technologies are employed to track biochemical changes in gastric cancer cells. We conclude by presenting a perspective on quantitative proteomics and its future applications in the clinic and translational gastric cancer research.

  15. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra relies on assumptions that are discussed in this work. We emphasize in particular the variation of the material absorptivity, which should be taken into account to access accurate thermodynamical properties of the carriers, especially when varying the excitation power. The proposed method yields more accurate thermodynamical properties by relying on a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
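    As background to the role of absorptivity in such fits, the generalized Planck relation below is the expression commonly used to model band-to-band photoluminescence of a carrier population with temperature T and chemical potential Δμ; it is quoted here as standard background, not as the paper's exact formulation.

```latex
% Generalized Planck (Lasher-Stern-Wurfel) form often used for PL analysis;
% A(E) is the absorptivity whose variation the abstract emphasizes.
\[
  I_{\mathrm{PL}}(E)\;\propto\;
  A(E)\,\frac{E^{2}}{\exp\!\left(\frac{E-\Delta\mu}{k_{B}T}\right)-1},
\]
% so that, where A(E) is properly accounted for, a fit of the high-energy tail
% yields the carrier temperature T and the chemical potential \Delta\mu.
```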

  16. Lipid Informed Quantitation and Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Crowell, PNNL

    2014-07-21

    LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, a lipid common name (e.g., PE(16:0/18:2)), and the associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to show the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time is also provided. Typically, a global analysis for 15,000 lipid targets…

  17. Deformation analysis of MEMS structures by modified digital moiré methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwei; Lou, Xinhao; Gao, Jianxin

    2010-11-01

    Quantitative deformation analysis of micro-fabricated electromechanical systems is of importance for the design and functional control of microsystems. In this paper, two modified digital moiré processing methods, a Gaussian blurring algorithm combined with digital phase shifting and a geometrical phase analysis (GPA) technique based on the digital moiré method, are developed to quantitatively analyse the deformation behaviour of micro-electro-mechanical system (MEMS) structures. Measuring principles and experimental procedures of the two methods are described in detail. A digital moiré fringe pattern is generated by superimposing a specimen grating etched directly on a microstructure surface with a digital reference grating (DRG). Most of the grating noise is removed from the digital moiré fringes, which enables the phase distribution of the moiré fringes to be obtained directly. Strain measurement results for a MEMS structure demonstrate the feasibility of the two methods.

  18. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de

    2015-03-31

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which enables important quantitative information to be gained for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  19. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  20. Dynamic contrast optical coherence tomography images transit time and quantifies microvascular plasma volume and flow in the retina and choriocapillaris

    PubMed Central

    Merkle, Conrad W.; Leahy, Conor; Srinivasan, Vivek J.

    2016-01-01

    Despite the prevalence of optical imaging techniques to measure hemodynamics in large retinal vessels, quantitative measurements of retinal capillary and choroidal hemodynamics have traditionally been challenging. Here, a new imaging technique called dynamic contrast optical coherence tomography (DyC-OCT) is applied in the rat eye to study microvascular blood flow in individual retinal and choroidal layers in vivo. DyC-OCT is based on imaging the transit of an intravascular tracer dynamically as it passes through the field-of-view. Hemodynamic parameters can be determined through quantitative analysis of tracer kinetics. In addition to enabling depth-resolved transit time, volume, and flow measurements, the injected tracer also enhances OCT angiograms and enables clear visualization of the choriocapillaris, particularly when combined with a post-processing method for vessel enhancement. DyC-OCT complements conventional OCT angiography through quantification of tracer dynamics, similar to fluorescence angiography, but with the important added benefit of laminar resolution. PMID:27867732

  1. Dynamic contrast optical coherence tomography images transit time and quantifies microvascular plasma volume and flow in the retina and choriocapillaris.

    PubMed

    Merkle, Conrad W; Leahy, Conor; Srinivasan, Vivek J

    2016-10-01

    Despite the prevalence of optical imaging techniques to measure hemodynamics in large retinal vessels, quantitative measurements of retinal capillary and choroidal hemodynamics have traditionally been challenging. Here, a new imaging technique called dynamic contrast optical coherence tomography (DyC-OCT) is applied in the rat eye to study microvascular blood flow in individual retinal and choroidal layers in vivo. DyC-OCT is based on imaging the transit of an intravascular tracer dynamically as it passes through the field-of-view. Hemodynamic parameters can be determined through quantitative analysis of tracer kinetics. In addition to enabling depth-resolved transit time, volume, and flow measurements, the injected tracer also enhances OCT angiograms and enables clear visualization of the choriocapillaris, particularly when combined with a post-processing method for vessel enhancement. DyC-OCT complements conventional OCT angiography through quantification of tracer dynamics, similar to fluorescence angiography, but with the important added benefit of laminar resolution.

  2. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind.
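    The sketch below illustrates the kind of time-series comparison and clustering described above, using a plain Euclidean distance and hierarchical clustering as stand-ins for the similarity measures evaluated in the paper; the traces are invented.

```python
# Minimal sketch: cluster phenotype time-series by pairwise distance.
# Euclidean distance and average linkage are illustrative choices, and the
# per-parasite motility traces below are hypothetical.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

series = np.array([[1.0, 0.90, 0.80, 0.70],   # rows = parasites,
                   [1.0, 0.50, 0.20, 0.10],   # columns = time points
                   [1.0, 0.95, 0.85, 0.75],
                   [1.0, 0.40, 0.15, 0.05]])

dist = pdist(series, metric="euclidean")            # pairwise response distances
tree = linkage(dist, method="average")              # hierarchical clustering
print(fcluster(tree, t=2, criterion="maxclust"))    # two response groups
```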

  3. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind. PMID:22369037

  4. Transient deformation of a droplet near a microfluidic constriction: A quantitative analysis

    NASA Astrophysics Data System (ADS)

    Trégouët, Corentin; Salez, Thomas; Monteux, Cécile; Reyssat, Mathilde

    2018-05-01

    We report on experiments in which a collection of monodisperse droplets produced by a microfluidic chip is deformed through a flow-focusing device. We show that proper numerical modeling of the flow is necessary to access the stress it applies to the droplet along its trajectory through the chip. This crucial step enables the full integration of the differential equation governing the dynamical deformation, and consequently the robust measurement of the interfacial tension by fitting the experiments with the calculated deformation. Our study thus demonstrates the feasibility of quantitative in situ rheology in microfluidic flows involving, e.g., droplets, capsules, or cells.
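    The sketch below illustrates the overall fitting strategy: integrate a deformation equation driven by a known stress history and fit the interfacial tension to the measured deformation. The first-order relaxation model, geometry and numbers are placeholders, not the equation or values used by the authors.

```python
# Minimal sketch, assuming a toy first-order relaxation model for the droplet
# deformation D(t) driven by a stress history sigma(t) taken from simulation;
# all symbols and numbers are placeholders, not the authors' model.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

t_obs = np.linspace(0.0, 0.02, 40)                                   # s
sigma = lambda t: 50.0 * np.exp(-((t - 0.01) / 0.004) ** 2)          # Pa (assumed)
a, eta = 20e-6, 0.05                            # droplet radius (m), viscosity (Pa s)

def deformation(t_eval, gamma):
    tau = eta * a / gamma                       # relaxation time
    rhs = lambda t, D: (a * sigma(t) / gamma - D) / tau
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [0.0], t_eval=t_eval)
    return sol.y[0]

D_obs = deformation(t_obs, 5e-3) + np.random.normal(0, 0.002, t_obs.size)
gamma_fit, _ = curve_fit(deformation, t_obs, D_obs, p0=[1e-3],
                         bounds=(1e-5, 1.0))
print(gamma_fit)                                # recovered interfacial tension (N/m)
```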

  5. The A-Like Faker Assay for Measuring Yeast Chromosome III Stability.

    PubMed

    Novoa, Carolina A; Ang, J Sidney; Stirling, Peter C

    2018-01-01

    The ability to rapidly assess chromosome instability (CIN) has enabled profiling of most yeast genes for potential effects on genome stability. The A-like faker (ALF) assay is one of several qualitative and quantitative marker loss assays that indirectly measure loss or conversion of genetic material using a counterselection step. The ALF assay relies on the ability to count spurious mating events that occur upon loss of the MATα locus of haploid Saccharomyces cerevisiae strains. Here, we describe the deployment of the ALF assay both for rapid and simple qualitative analysis and for more in-depth quantitative analysis allowing determination of absolute ALF frequencies.

  6. Wires in the soup: quantitative models of cell signaling

    PubMed Central

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, and unraveling them requires sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  7. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework.

    PubMed

    Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.

  8. Evaporative light scattering detector in normal-phase high-performance liquid chromatography determination of FAME oxidation products.

    PubMed

    Morales, Arturo; Marmesat, Susana; Dobarganes, M Carmen; Márquez-Ruiz, Gloria; Velasco, Joaquín

    2012-09-07

    The use of an ELS detector in NP-HPLC for quantitative analysis of oxidation products in FAME obtained from oils is evaluated in this study. The results show that the ELS detector enables the quantitative determination of the hydroperoxides of oleic and linoleic acid methyl esters as a whole and, connected in series with a UV detector, makes it possible to determine both groups of compounds by difference, providing useful complementary information. The limits of detection (LOD) and quantification (LOQ) found for hydroperoxides were 2.5 and 5.7 μg mL⁻¹, respectively, and the precision of quantitation, expressed as coefficient of variation, was lower than 10%. Owing to its low sensitivity, the ELS detector is limited in determining the low contents of secondary oxidation products in the direct analysis of FAME oxidized at low or moderate temperature. Analysis of FAME samples obtained either from high linoleic sunflower oil (HLSO) or high oleic sunflower oil (HOSO) and oxidized at 80 °C showed that only ketodienes formed from methyl linoleate can be determined in samples with relatively high oxidation, with LOD and LOQ of 0.2 and 0.4 mg/g FAME, respectively, under the analytical conditions applied. The ELS detector also enabled the determination of methyl cis-9,10-epoxystearate and methyl trans-9,10-epoxystearate, which were resolved under the chromatographic conditions applied. Results showed that these compounds, which are formed from methyl oleate, were not detected in the high-linoleic sample, but occurred at non-negligible levels in the oxidized FAME obtained from HOSO. Copyright © 2012 Elsevier B.V. All rights reserved.
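    For reference, the LOD and LOQ quoted above are conventionally derived from the calibration data as shown below; these are the standard definitions, not necessarily the exact criterion applied in the paper.

```latex
% Conventional calibration-based definitions: s is the standard deviation of
% the blank (or of the regression residuals) and b the calibration slope.
\[
  \mathrm{LOD} = \frac{3.3\,s}{b}, \qquad \mathrm{LOQ} = \frac{10\,s}{b}
\]
```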

  9. Optical barcoding of PLGA for multispectral analysis of nanoparticle fate in vivo.

    PubMed

    Medina, David X; Householder, Kyle T; Ceton, Ricki; Kovalik, Tina; Heffernan, John M; Shankar, Rohini V; Bowser, Robert P; Wechsler-Reya, Robert J; Sirianni, Rachael W

    2017-05-10

    Understanding of the mechanisms by which systemically administered nanoparticles achieve delivery across biological barriers remains incomplete, due in part to the challenge of tracking nanoparticle fate in the body. Here, we develop a new approach for "barcoding" nanoparticles composed of poly(lactic-co-glycolic acid) (PLGA) with bright, spectrally defined quantum dots (QDs) to enable direct, fluorescent detection of nanoparticle fate with subcellular resolution. We show that QD labeling does not affect major biophysical properties of nanoparticles or their interaction with cells and tissues. Live cell imaging enabled simultaneous visualization of the interaction of control and targeted nanoparticles with bEnd.3 cells in a flow chamber, providing direct evidence that surface modification of nanoparticles with the cell-penetrating peptide TAT increases their biophysical association with cell surfaces over very short time periods under convective current. We next developed this technique for quantitative biodistribution analysis in vivo. These studies demonstrate that nanoparticle surface modification with the cell-penetrating peptide TAT facilitates brain-specific delivery that is restricted to brain vasculature. Although nanoparticle entry into the healthy brain parenchyma is minimal, with no evidence for movement of nanoparticles across the blood-brain barrier (BBB), we observed that nanoparticles are able to enter the central nervous system (CNS) through regions of altered BBB permeability, for example into circumventricular organs in the brain or the leaky vasculature of late-stage intracranial tumors. In sum, these data demonstrate a new, multispectral approach for barcoding PLGA, which enables simultaneous, quantitative analysis of the fate of multiple nanoparticle formulations in vivo. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented using the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging, as an example.

  11. Quantitative 3D investigation of Neuronal network in mouse spinal cord model

    NASA Astrophysics Data System (ADS)

    Bukreeva, I.; Campi, G.; Fratini, M.; Spanò, R.; Bucci, D.; Battaglia, G.; Giove, F.; Bravin, A.; Uccelli, A.; Venturi, C.; Mastrogiacomo, M.; Cedola, A.

    2017-01-01

    The investigation of the neuronal network in mouse spinal cord models represents the basis for research on neurodegenerative diseases. In this framework, the quantitative analysis of the single elements in different districts is a crucial task. However, conventional 3D imaging techniques do not have enough spatial resolution and contrast to allow a quantitative investigation of the neuronal network. Exploiting the high coherence and high flux of synchrotron sources, X-ray phase-contrast multiscale tomography allows the 3D investigation of the neuronal microanatomy without any aggressive sample preparation or sectioning. We investigated healthy-mouse neuronal architecture by imaging the 3D distribution of the neuronal network with a spatial resolution of 640 nm. The high quality of the obtained images enables a quantitative study of the neuronal structure on a subject-by-subject basis. We developed and applied a spatial statistical analysis of the motor neurons to obtain quantitative information on their 3D arrangement in the healthy-mouse spinal cord. We then compared the obtained results with a mouse model of multiple sclerosis. Our approach paves the way to the creation of a “database” for the characterization of the main features of the neuronal network for a comparative investigation of neurodegenerative diseases and therapies.
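    One simple example of the kind of spatial statistic that can be computed on the extracted neuron positions is the nearest-neighbour distance distribution, sketched below with hypothetical 3D coordinates; this is an illustration, not the authors' analysis pipeline.

```python
# Minimal sketch: nearest-neighbour distances of motor-neuron centroids,
# a summary statistic that could be compared between healthy and diseased mice.
# Coordinates are randomly generated placeholders.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
neurons = rng.uniform(0, 500, size=(200, 3))   # um, hypothetical centroids

tree = cKDTree(neurons)
d, _ = tree.query(neurons, k=2)                # k=1 would be the point itself
nn = d[:, 1]                                   # nearest-neighbour distances

print(nn.mean(), nn.std())                     # summary statistics per subject
```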

  12. Combinatorial modification of human histone H4 quantitated by two-dimensional liquid chromatography coupled with top down mass spectrometry.

    PubMed

    Pesavento, James J; Bullock, Courtney R; LeDuc, Richard D; Mizzen, Craig A; Kelleher, Neil L

    2008-05-30

    Quantitative proteomics has focused heavily on correlating protein abundances, ratios, and dynamics by developing methods that are protein expression-centric (e.g. isotope coded affinity tag, isobaric tag for relative and absolute quantification, etc.). These methods effectively detect changes in protein abundance but fail to provide a comprehensive perspective of the diversity of proteins such as histones, which are regulated by post-translational modifications. Here, we report the characterization of modified forms of HeLa cell histone H4 with a dynamic range >10(4) using a strictly Top Down mass spectrometric approach coupled with two dimensions of liquid chromatography. This enhanced dynamic range enabled the precise characterization and quantitation of 42 forms uniquely modified by combinations of methylation and acetylation, including those with trimethylated Lys-20, monomethylated Arg-3, and the novel dimethylated Arg-3 (each <1% of all H4 forms). Quantitative analyses revealed distinct trends in acetylation site occupancy depending on Lys-20 methylation state. Because both modifications are dynamically regulated through the cell cycle, we simultaneously investigated acetylation and methylation kinetics through three cell cycle phases and used these data to statistically assess the robustness of our quantitative analysis. This work represents the most comprehensive analysis of histone H4 forms present in human cells reported to date.

  13. Quantitation of influenza virus using field flow fractionation and multi-angle light scattering for quantifying influenza A particles

    PubMed Central

    Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James

    2017-01-01

    Recent advances in instrumentation and data analysis in field flow fractionation and multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9%. The method's sensitivity and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with an R² of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and the allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed a similar distribution of monomeric and oligomeric virions; however, the monomer fraction of cell-grown virus showed greater size variability. Notably, β-propiolactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation analysis by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR), and the limitations of each assay are discussed. PMID:23916678

  14. Software for Automated Image-to-Image Co-registration

    NASA Technical Reports Server (NTRS)

    Benkelman, Cody A.; Hughes, Heidi

    2007-01-01

    The project objectives are: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.
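    One common way to achieve subpixel image-to-image co-registration is upsampled phase correlation, sketched below with scikit-image; this is a generic illustration of the technique, not necessarily the algorithm implemented in the NASA software.

```python
# Minimal sketch of subpixel co-registration by phase correlation (assumes a
# reasonably recent scikit-image providing skimage.registration).
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

reference = np.random.default_rng(1).random((256, 256))     # stand-in ortho image
moving = nd_shift(reference, shift=(3.4, -1.7), order=3)     # known misregistration

offset, error, _ = phase_cross_correlation(reference, moving,
                                            upsample_factor=100)  # 1/100 px steps
print(offset)   # about [-3.4, 1.7]: the shift needed to re-align `moving`
```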

  15. iss047e066248

    NASA Image and Video Library

    2016-04-19

    ISS047e066248 (04/19/2016) --- NASA astronaut and Expedition 47 Flight Engineer Jeff Williams works with the Wet Lab RNA SmartCycler on-board the International Space Station. Wetlab RNA SmartCycler is a research platform for conducting real-time quantitative gene expression analysis aboard the ISS. The system enables spaceflight genomic studies involving a wide variety of biospecimen types in the unique microgravity environment of space.

  16. Live-cell confocal microscopy and quantitative 4D image analysis of anchor cell invasion through the basement membrane in C. elegans

    PubMed Central

    Kelley, Laura C.; Wang, Zheng; Hagedorn, Elliott J.; Wang, Lin; Shen, Wanqing; Lei, Shijun; Johnson, Sam A.; Sherwood, David R.

    2018-01-01

    Cell invasion through basement membrane (BM) barriers is crucial during development, leukocyte trafficking, and the spread of cancer. Despite its importance in normal and diseased states, the mechanisms that direct invasion are poorly understood, in large part because of the inability to visualize dynamic cell-basement membrane interactions in vivo. This protocol describes multi-channel time-lapse confocal imaging of anchor cell invasion in live C. elegans. Methods presented include an outline of slide preparation and worm growth synchronization (15 min), mounting (20 min), image acquisition (20-180 min), image processing (20 min), and quantitative analysis (variable timing). The images acquired enable direct measurement of invasive dynamics including invadopodia formation, cell membrane protrusions, and BM removal. This protocol can be combined with genetic analysis, molecular activity probes, and optogenetic approaches to uncover the molecular mechanisms underlying cell invasion. These methods can also be readily adapted for real-time analysis of cell migration, basement membrane turnover, and cell membrane dynamics by any worm laboratory. PMID:28880279

  17. Combination of nano-material enrichment and dead-end filtration for uniform and rapid sample preparation in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Wu, Zengnan; Khan, Mashooq; Mao, Sifeng; Lin, Ling; Lin, Jin-Ming

    2018-05-01

    Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a fast analysis tool for the detection of a wide range of analytes. However, the heterogeneous distribution of matrix/analyte cocrystals, variation in signal intensity, and poor experimental reproducibility at different locations of the same spot make quantitative analysis difficult. In this work, carbon nanotubes (CNTs) were employed as an adsorbent for both analyte and matrix on a conductive porous membrane serving as a novel mass target plate. The sample pretreatment step was achieved by enrichment and dead-end filtration, followed by drying via solid-liquid separation. This approach enables the homogeneous distribution of the analyte in the matrix, good shot-to-shot signal reproducibility, and quantitative detection of a peptide and a protein at different concentrations with correlation coefficients (R²) of 0.9920 and 0.9909, respectively. The simple and rapid sample preparation, uniform distribution of the analyte, easy quantitative detection, and high reproducibility make this technique useful and may diversify the application of MALDI-MS for quantitative detection of a variety of proteins. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    PubMed Central

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615

  19. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    PubMed

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  1. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Affinity Proteomics for Fast, Sensitive, Quantitative Analysis of Proteins in Plasma.

    PubMed

    O'Grady, John P; Meyer, Kevin W; Poe, Derrick N

    2017-01-01

    The improving efficacy of many biological therapeutics and identification of low-level biomarkers are driving the analytical proteomics community to deal with extremely high levels of sample complexity relative to their analytes. Many protein quantitation and biomarker validation procedures utilize an immunoaffinity enrichment step to purify the sample and maximize the sensitivity of the corresponding liquid chromatography tandem mass spectrometry measurements. In order to generate surrogate peptides with better mass spectrometric properties, protein enrichment is followed by a proteolytic cleavage step. This is often a time-consuming multistep process. Presented here is a workflow which enables rapid protein enrichment and proteolytic cleavage to be performed in a single, easy-to-use reactor. Using this strategy Klotho, a low-abundance biomarker found in plasma, can be accurately quantitated using a protocol that takes under 5 h from start to finish.

  3. Nanoparticle surface characterization and clustering through concentration-dependent surface adsorption modeling.

    PubMed

    Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E

    2014-09-23

    Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper advances the approach beyond the five descriptors of the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis of the obtained adsorption data was performed with three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling represent a promising development in quantitative predictive modeling for biological applications, nanomedicine, and the environmental safety assessment of nanomaterials.

  4. Tandem transmission/reflection mode XRD instrument including XRF for in situ measurement of Martian rocks and soils

    NASA Astrophysics Data System (ADS)

    Delhez, Robert; Van der Gaast, S. J.; Wielders, Arno; de Boer, J. L.; Helmholdt, R. B.; van Mechelen, J.; Reiss, C.; Woning, L.; Schenk, H.

    2003-02-01

    The mineralogy of the surface material of Mars is key to disclosing its present and past life and climates. Clay mineral species, carbonates, and ices (water and CO2) are, or contain, witnesses to these. X-ray powder diffraction (XRPD) is the most powerful analytical method to identify and quantitatively characterize minerals in complex mixtures. This paper discusses the development of a working model of an instrument consisting of a reflection mode diffractometer and a transmission mode CCD-XRPD instrument, combined with an XRF module. The CCD-XRD/XRF instrument is analogous to the instrument for Mars missions developed by Sarrazin et al. (1998). This part of the tandem instrument enables "quick and dirty" analysis of powdered (!) matter to monitor semi-quantitatively the presence of clay minerals as a group, carbonates, and ices, and yields semi-quantitative chemical information from X-ray fluorescence (XRF). The reflection mode instrument (i) enables in-situ measurements of rocks and soils and gives quantitative information on the compounds identified, (ii) has a high resolution and reveals large spacings for accurate identification, in particular of clay mineral species, and (iii) yields line profiles whose shapes reveal the kind and approximate amounts of lattice imperfections present. It will be shown that the information obtained with the reflection mode diffractometer is crucial for finding signs of life and changes in the climate on Mars. Obviously this instrument can also be used for other extra-terrestrial research.

  5. Quantitative analysis of the local phase transitions induced by the laser heating

    DOE PAGES

    Levlev, Anton V.; Susner, Michael A.; McGuire, Michael A.; ...

    2015-11-04

    Functional imaging enabled by scanning probe microscopy (SPM) allows investigations of nanoscale material properties under a wide range of external conditions, including temperature. However, a number of shortcomings preclude the use of the most common material heating techniques, thereby limiting precise temperature measurements. Here we discuss an approach to local laser heating on the micron scale and its applicability for SPM. We applied local heating coupled with piezoresponse force microscopy and confocal Raman spectroscopy for nanoscale investigations of a ferroelectric-paraelectric phase transition in the copper indium thiophosphate layered ferroelectric. Bayesian linear unmixing applied to experimental results allowed extraction of the Raman spectra of different material phases and enabled temperature calibration in the heated region. Lastly, the obtained results enable a systematic approach for studying temperature-dependent material functionalities in heretofore unavailable temperature regimes.

  6. A Century of Enzyme Kinetic Analysis, 1913 to 2013

    PubMed Central

    Johnson, Kenneth A.

    2013-01-01

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. PMID:23850893
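    To illustrate the progress-curve approach mentioned above, the sketch below numerically integrates the irreversible Michaelis-Menten rate equation and fits Vmax and Km to a full time course; the data are simulated placeholders, not the historical measurements.

```python
# Minimal sketch: fit a full progress curve by numerical integration of
# d[S]/dt = -Vmax*[S]/(Km + [S]); simulated data, illustrative only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

t_obs = np.linspace(0, 60, 30)                          # min

def progress(t_eval, vmax, km, s0=1.0):
    rhs = lambda t, s: -vmax * s / (km + s)             # rate equation
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [s0], t_eval=t_eval)
    return sol.y[0]

s_obs = progress(t_obs, 0.05, 0.3) + np.random.normal(0, 0.005, t_obs.size)
(vmax, km), _ = curve_fit(progress, t_obs, s_obs, p0=[0.1, 0.5])
print(vmax, km)                                         # recovered parameters
```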

  7. Analyzing the texture changes in the quantitative phase maps of adipocytes

    NASA Astrophysics Data System (ADS)

    Roitshtain, Darina; Sharabani-Yosef, Orna; Gefen, Amit; Shaked, Natan T.

    2016-03-01

    We present a new analysis tool for studying texture changes in the quantitative phase maps of live cells acquired by wide-field interferometry. The sensitivity of wide-field interferometry systems to small changes in refractive index enables visualizing cells and inner cell organelles without using fluorescent dyes or other cell-invasive approaches, which may affect the measurement and require external labeling. Our label-free texture-analysis tool is based directly on the optical path delay profile of the sample and does not necessitate decoupling refractive index and thickness in the cell quantitative phase profile; thus, relevant parameters can be calculated from a single-frame acquisition. Our experimental system includes a low-coherence wide-field interferometer combined with a simultaneous fluorescence microscopy system for validation. We used this system and analysis tool to study lipid droplet formation in adipocytes. The latter demonstration is relevant to various cellular functions, ranging from lipid metabolism and protein storage and degradation to viral replication. These processes are functionally linked to several physiological and pathological conditions, including obesity and metabolic diseases. Quantification of these biological phenomena based on the texture changes in the cell phase map has potential as a new cellular diagnostic tool.

  8. Application of pheB as a Reporter Gene for Geobacillus spp., Enabling Qualitative Colony Screening and Quantitative Analysis of Promoter Strength

    PubMed Central

    Bartosiak-Jentys, Jeremy; Eley, Kirstin

    2012-01-01

    The pheB gene from Geobacillus stearothermophilus DSM6285 has been exploited as a reporter gene for Geobacillus spp. The gene product, catechol 2,3-dioxygenase (C23O), catalyzes the formation of 2-hydroxymuconic semialdehyde, which can be readily assayed. The reporter was used to examine expression from the ldh promoter associated with fermentative metabolism. PMID:22685159

  9. ELITE S2 - A Facility for Quantitative Human Movement Analysis on Board the ISS

    NASA Astrophysics Data System (ADS)

    Neri, Gianluca; Mascetti, Gabriele; Zolesi, Valfredo

    2014-11-01

    This paper describes the activities for utilization and control of ELITE S2 on board the International Space Station (ISS). ELITE S2 is a payload of the Italian Space Agency (ASI) for quantitative human movement analysis in weightlessness. Within the framework of a bilateral agreement with NASA, ASI has funded a number of facilities enabling different scientific experiments on board the ISS. ELITE S2 was developed by the ASI contractor Kayser Italia, delivered to the Kennedy Space Center in 2006 for pre-flight processing, launched in 2007 by the Space Shuttle Endeavour (STS-118), integrated in the U.S. lab, and used during Increments 16/17 (2008) and 33/34 (2012/2013). The ELITE S2 flight segment comprises equipment mounted into an Express Rack and a number of stowed items to be deployed for experiment performance (video cameras and accessories). The ground segment consists of a User Support Operations Center (based at Kayser Italia) enabling real-time payload control and a number of User Home Bases (located at the ASI and PIs' premises) for the scientific assessment of the experiment performance. Two scientific protocols on reaching and cognitive processing have been successfully performed in eight sessions involving three ISS crewmembers: IMAGINE 2 and MOVE.

  10. Boolean logic analysis for flow regime recognition of gas-liquid horizontal flow

    NASA Astrophysics Data System (ADS)

    Ramskill, Nicholas P.; Wang, Mi

    2011-10-01

    In order to develop a flowmeter for the accurate measurement of multiphase flows, it is of the utmost importance to correctly identify the flow regime present, to enable selection of the optimal metering method. In this study, the horizontal flow of air and water in a pipeline was studied under a multitude of conditions using electrical resistance tomography, but the flow regimes presented in this paper are limited to plug and bubble air-water flows. This study proposes a novel method for recognition of the prevalent flow regime using only a fraction of the data, thus rendering the analysis more efficient. By considering the average conductivity of five zones along the central axis of the tomogram, key features can be identified, thus enabling recognition of the prevalent flow regime. Boolean logic and frequency spectrum analysis have been applied for flow regime recognition. Visualization of the flow using the reconstructed images provides a qualitative comparison between different flow regimes. Application of the Boolean logic scheme enables a quantitative comparison of the flow patterns, thus reducing the subjectivity in the identification of the prevalent flow regime.
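    The sketch below gives a schematic of the zone-based Boolean classification idea: average the conductivity in five zones along the central axis and apply simple logical rules. The threshold and rules are illustrative placeholders, not the published criteria.

```python
# Minimal sketch of zone-based Boolean flow-regime recognition; the threshold
# and decision rules are invented for illustration.
import numpy as np

def classify_frame(tomogram, threshold=0.5):
    """tomogram: 2D map of normalised local conductivity (1 = water, 0 = air)."""
    mid_row = tomogram[tomogram.shape[0] // 2]          # central axis
    zones = np.array_split(mid_row, 5)                  # five axial zones
    zone_mean = np.array([z.mean() for z in zones])
    water = zone_mean > threshold                       # Boolean features

    if water.all():
        return "bubble flow (liquid continuous along the axis)"
    if water.any() and (~water).any():
        return "plug flow (alternating gas plugs and liquid)"
    return "unclassified"

frame = np.ones((32, 32)); frame[:, 10:20] = 0.1        # toy frame with a gas plug
print(classify_frame(frame))
```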

  11. Systems-Level Analysis of Innate Immunity

    PubMed Central

    Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan

    2014-01-01

    Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298

  12. Investigating the quality of mental models deployed by undergraduate engineering students in creating explanations: The case of thermally activated phenomena

    NASA Astrophysics Data System (ADS)

    Fazio, Claudio; Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2013-12-01

    This paper describes a method aimed at pointing out the quality of the mental models undergraduate engineering students deploy when asked to create explanations for phenomena or processes and/or use a given model in the same context. Student responses to a specially designed written questionnaire are quantitatively analyzed using researcher-generated categories of reasoning, based on the physics education research literature on student understanding of the relevant physics content. The use of statistical implicative analysis tools allows us to successfully identify clusters of students with respect to the similarity to the reasoning categories, defined as “practical or everyday,” “descriptive,” or “explicative.” Through the use of similarity and implication indexes our method also enables us to study the consistency in students’ deployment of mental models. A qualitative analysis of interviews conducted with students after they had completed the questionnaire is used to clarify some aspects which emerged from the quantitative analysis and validate the results obtained. Some implications of this joint use of quantitative and qualitative analysis for the design of a learning environment focused on the understanding of some aspects of the world at the level of causation and mechanisms of functioning are discussed.

  13. Quantitative mass spectrometry of unconventional human biological matrices

    NASA Astrophysics Data System (ADS)

    Dutkiewicz, Ewelina P.; Urban, Pawel L.

    2016-10-01

    The development of sensitive and versatile mass spectrometric methodology has fuelled interest in the analysis of metabolites and drugs in unconventional biological specimens. Here, we discuss the analysis of eight human matrices (hair, nail, breath, saliva, tears, meibum, nasal mucus and skin excretions, including sweat) by mass spectrometry (MS). The use of such specimens brings a number of advantages, the most important being non-invasive sampling, the limited risk of adulteration and the ability to obtain information that complements blood and urine tests. The most often studied matrices are hair, breath and saliva. This review primarily focuses on endogenous (e.g. potential biomarkers, hormones) and exogenous (e.g. drugs, environmental contaminants) small molecules. The majority of analytical methods used chromatographic separation prior to MS; however, such a hyphenated methodology greatly limits analytical throughput. On the other hand, the mass spectrometric methods that exclude chromatographic separation are fast but suffer from matrix interferences. To enable development of quantitative assays for unconventional matrices, it is desirable to standardize the protocols for the analysis of each specimen and create appropriate certified reference materials. Overcoming these challenges will make analysis of unconventional human biological matrices more common in a clinical setting. This article is part of the themed issue 'Quantitative mass spectrometry'.

  14. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Digital pathology and image analysis for robust high-throughput quantitative assessment of Alzheimer disease neuropathologic changes.

    PubMed

    Neltner, Janna Hackett; Abner, Erin Lynn; Schmitt, Frederick A; Denison, Stephanie Kay; Anderson, Sonya; Patel, Ela; Nelson, Peter T

    2012-12-01

    Quantitative neuropathologic methods provide information that is important for both research and clinical applications. The technologic advancement of digital pathology and image analysis offers new solutions to enable valid quantification of pathologic severity that is reproducible between raters regardless of experience. Using an Aperio ScanScope XT and its accompanying image analysis software, we designed algorithms for quantitation of amyloid and tau pathologies on 65 β-amyloid (6F/3D antibody) and 48 phospho-tau (PHF-1)-immunostained sections of human temporal neocortex. Quantitative digital pathologic data were compared with manual pathology counts. There were excellent correlations between manually counted and digitally analyzed neuropathologic parameters (R² = 0.56-0.72). Data were highly reproducible among 3 participants with varying degrees of expertise in neuropathology (intraclass correlation coefficient values, >0.910). Digital quantification also provided additional parameters, including average plaque area, which shows statistically significant differences when samples are stratified according to apolipoprotein E allele status (average plaque area, 380.9 μm² in apolipoprotein E ε4 carriers vs 274.4 μm² for noncarriers; p < 0.001). Thus, digital pathology offers a rigorous and reproducible method for quantifying Alzheimer disease neuropathologic changes and may provide additional insights into morphologic characteristics that were previously more challenging to assess because of technical limitations.

  16. Use of the pH sensitive fluorescence probe pyranine to monitor internal pH changes in Escherichia coli membrane vesicles.

    PubMed

    Damiano, E; Bassilana, M; Rigaud, J L; Leblanc, G

    1984-01-23

    Measurements of the fluorescent properties of 8-hydroxy-1,3,6-pyrenetrisulfonate (pyranine) enclosed within the internal space of Escherichia coli membrane vesicles enable recordings and quantitative analysis of: (i) changes in intravesicular pH taking place during oxidation of electron donors by the membrane respiratory chain; (ii) transient alkalization of the internal aqueous space resulting from the creation of outwardly directed acetate diffusion gradients across the vesicular membrane. Quantitation of the fluorescence variations recorded during the creation of transmembrane acetate gradients shows a close correspondence between the measured shifts in internal pH value and those expected from the amplitude of the imposed acetate gradients.
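
    The "expected" internal pH shift referred to in the last sentence follows from the standard transmembrane distribution of a weak acid whose neutral form is membrane-permeant. As a hedged reconstruction (not necessarily the authors' exact formulation), at quasi-equilibrium the acetate anion ratio mirrors the proton ratio across the membrane:

```latex
\mathrm{pH}_{\mathrm{in}} - \mathrm{pH}_{\mathrm{out}}
  = \log_{10}\!\frac{[\mathrm{Ac}^-]_{\mathrm{in}}}{[\mathrm{Ac}^-]_{\mathrm{out}}}
```

    so a ten-fold outwardly directed acetate gradient predicts a transient internal alkalinization on the order of one pH unit.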

  17. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  19. Visual Aggregate Analysis of Eligibility Features of Clinical Trials

    PubMed Central

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-01-01

    Objective: To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods: Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results: We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions: We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940

  20. Visual aggregate analysis of eligibility features of clinical trials.

    PubMed

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-04-01

    To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions "hypertension" and "Type 2 diabetes", respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. Copyright © 2015 Elsevier Inc. All rights reserved.
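
    The kind of aggregate view VITTA provides over boundary values and permissible range widths can be imitated with a few lines of Python over hypothetical eligibility ranges; the BMI bounds below are invented for illustration, and VITTA itself is a web system backed by the COMPACT database, not this snippet:

```python
import numpy as np

# Hypothetical permissible BMI ranges (kg/m^2) extracted from the eligibility
# criteria of several trials for the same condition; None = no bound stated.
bmi_ranges = [(18.5, 40.0), (20.0, 35.0), (None, 30.0), (25.0, 45.0), (18.5, 35.0)]

lower = np.array([lo for lo, hi in bmi_ranges if lo is not None])
upper = np.array([hi for lo, hi in bmi_ranges if hi is not None])
widths = np.array([hi - lo for lo, hi in bmi_ranges if lo is not None and hi is not None])

# Distribution of boundary values and of permissible range widths across studies
print("lower bounds used:", np.unique(lower, return_counts=True))
print("upper bounds used:", np.unique(upper, return_counts=True))
print("median range width:", np.median(widths))
```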

  1. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions still needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast, mouse, and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Spin-polarized scanning tunneling microscopy with quantitative insights into magnetic probes

    NASA Astrophysics Data System (ADS)

    Phark, Soo-hyon; Sander, Dirk

    2017-04-01

    Spin-polarized scanning tunneling microscopy and spectroscopy (spin-STM/S) have been successfully applied to the magnetic characterization of individual nanostructures. Spin-STM/S is often performed in magnetic fields of up to several tesla, which may strongly influence the tip state. In spite of the pivotal role of the tip in spin-STM/S, the contribution of the tip to the differential conductance dI/dV signal in an external field has rarely been investigated in detail. In this review, an advanced analysis of spin-STM/S data measured on magnetic nanoislands, which relies on a quantitative magnetic characterization of tips, is discussed. Taking advantage of the uniaxial out-of-plane magnetic anisotropy of Co bilayer nanoislands on Cu(111), in-field spin-STM on this system has enabled a quantitative determination, and thereby a categorization, of the magnetic states of the tips. The resulting in-depth and conclusive magnetic characterization of the tip opens new avenues for a clear-cut determination of the sub-nanometer-scale spin ordering and spin-dependent electronic structure of the non-collinear magnetic state in bilayer-high Fe nanoislands on Cu(111).

  3. Hd6, a rice quantitative trait locus involved in photoperiod sensitivity, encodes the α subunit of protein kinase CK2

    PubMed Central

    Takahashi, Yuji; Shomura, Ayahiko; Sasaki, Takuji; Yano, Masahiro

    2001-01-01

    Hd6 is a quantitative trait locus involved in rice photoperiod sensitivity. It was detected in backcross progeny derived from a cross between the japonica variety Nipponbare and the indica variety Kasalath. To isolate a gene at Hd6, we used a large segregating population for the high-resolution and fine-scale mapping of Hd6 and constructed genomic clone contigs around the Hd6 region. Linkage analysis with P1-derived artificial chromosome clone-derived DNA markers delimited Hd6 to a 26.4-kb genomic region. We identified a gene encoding the α subunit of protein kinase CK2 (CK2α) in this region. The Nipponbare allele of CK2α contains a premature stop codon, and the resulting truncated product is undoubtedly nonfunctional. Genetic complementation analysis revealed that the Kasalath allele of CK2α increases days-to-heading. Map-based cloning with advanced backcross progeny enabled us to identify a gene underlying a quantitative trait locus even though it exhibited a relatively small effect on the phenotype. PMID:11416158

  4. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system and compares a steady-state-analysis-based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state-based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower, compared with the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided in the quantitative estimation of material properties.
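
    For context, the dissipative property mentioned above is commonly extracted in amplitude-modulation AFM from the observed amplitude A and phase lag φ. One textbook relation (a hedged reconstruction, not necessarily the formulation used in this article) for the average power dissipated by the tip-sample interaction is

```latex
\bar{P}_{\mathrm{ts}}
  = \frac{k\,\omega\,A^{2}}{2Q}
    \left( \frac{A_{0}}{A}\sin\varphi - \frac{\omega}{\omega_{0}} \right)
```

    where k, Q, ω0 and A0 are the cantilever stiffness, quality factor, resonance frequency and free amplitude, and ω is the drive frequency.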

  5. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
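
    One of the simplest "optimal scaling" ideas mentioned above can be written down directly: when the data are only relative, the model prediction is rescaled by the least-squares factor that best matches the data before the residual is computed. A minimal sketch using generic vectors, not any specific scheme reviewed in the paper:

```python
import numpy as np

def scaled_residual(model, data):
    """Sum of squared residuals after optimally scaling the model to the data.

    Useful when the data are only relative (e.g. normalized fluorescence),
    so the absolute units of the model prediction carry no information.
    """
    model = np.asarray(model, dtype=float)
    data = np.asarray(data, dtype=float)
    beta = np.dot(model, data) / np.dot(model, model)   # least-squares scale factor
    return np.sum((beta * model - data) ** 2), beta

# Example: a model prediction in arbitrary units vs. normalized measurements
model_pred = np.array([1.0, 2.0, 4.0, 8.0])
measured   = np.array([0.12, 0.26, 0.49, 1.00])
sse, beta = scaled_residual(model_pred, measured)
print(f"optimal scale = {beta:.3f}, SSE = {sse:.4f}")
```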

  6. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
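
    The core of IDMS quantitation is a closed-form relation between the isotope ratio measured in the equilibrated sample-spike blend and the known amount of spike. A minimal single-point sketch with purely illustrative numbers, ignoring the statistical isotopic-overlap corrections the paper emphasizes for molecular IDMS:

```python
def idms_amount(n_spike, a_spike, b_spike, a_sample, b_sample, r_mix):
    """Single-point isotope dilution: amount of analyte originally in the sample.

    n_spike   : amount of isotopically enriched spike added (e.g. nmol)
    a_*, b_*  : fractional abundances of the labelled (a) and unlabelled (b)
                isotopologue in the spike and in the natural-abundance sample
    r_mix     : labelled/unlabelled ratio measured in the equilibrated blend
    """
    return n_spike * (a_spike - r_mix * b_spike) / (r_mix * b_sample - a_sample)

# Illustrative numbers only (not from the cited study): 10 nmol of a 99 %-labelled
# spike blended with a natural-abundance analyte; measured blend ratio = 2.0
print(idms_amount(10.0, a_spike=0.99, b_spike=0.01,
                  a_sample=0.0099, b_sample=0.9901, r_mix=2.0))
```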

  7. Comparison of three‐dimensional analysis and stereological techniques for quantifying lithium‐ion battery electrode microstructures

    PubMed Central

    TAIWO, OLUWADAMILOLA O.; FINEGAN, DONAL P.; EASTWOOD, DAVID S.; FIFE, JULIE L.; BROWN, LEON D.; DARR, JAWWAD A.; LEE, PETER D.; BRETT, DANIEL J.L.

    2016-01-01

    Summary Lithium‐ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium‐ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3‐D imaging techniques, quantitative assessment of 3‐D microstructures from 2‐D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two‐dimensional (2‐D) data sets. In this study, stereological prediction and three‐dimensional (3‐D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium‐ion battery electrodes were imaged using synchrotron‐based X‐ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2‐D image sections generated from tomographic imaging, whereas direct 3‐D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2‐D image sections is bound to be associated with ambiguity and that volume‐based 3‐D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially‐dependent parameters, such as tortuosity and pore‐phase connectivity. PMID:26999804

  8. Comparison of three-dimensional analysis and stereological techniques for quantifying lithium-ion battery electrode microstructures.

    PubMed

    Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R

    2016-09-01

    Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
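
    The contrast between stereological prediction and direct 3-D analysis can be illustrated with the classical Delesse relation, in which the area fraction of a phase on 2-D sections estimates its 3-D volume fraction. A minimal sketch on a synthetic binary volume (illustrative only, not the authors' tomography workflow):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic segmented electrode volume: True = active-material phase, False = pore.
# Spherical particles placed at random stand in for a reconstructed tomogram.
shape = (64, 64, 64)
volume = np.zeros(shape, dtype=bool)
zz, yy, xx = np.indices(shape)
for _ in range(40):
    c = rng.integers(8, 56, size=3)
    volume |= (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 <= 6**2

# Direct 3-D measurement of the volume fraction
vv_3d = volume.mean()

# Stereological estimate: mean area fraction over a handful of 2-D sections
sections = volume[::16, :, :]            # every 16th slice along one axis
aa_2d = sections.mean()

print(f"3-D volume fraction : {vv_3d:.3f}")
print(f"2-D area fraction   : {aa_2d:.3f}  (Delesse estimate of the volume fraction)")
```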

  9. Quantitative RNA-seq analysis of the Campylobacter jejuni transcriptome

    PubMed Central

    Chaudhuri, Roy R.; Yu, Lu; Kanji, Alpa; Perkins, Timothy T.; Gardner, Paul P.; Choudhary, Jyoti; Maskell, Duncan J.

    2011-01-01

    Campylobacter jejuni is the most common bacterial cause of foodborne disease in the developed world. Its general physiology and biochemistry, as well as the mechanisms enabling it to colonize and cause disease in various hosts, are not well understood, and new approaches are required to understand its basic biology. High-throughput sequencing technologies provide unprecedented opportunities for functional genomic research. Recent studies have shown that direct Illumina sequencing of cDNA (RNA-seq) is a useful technique for the quantitative and qualitative examination of transcriptomes. In this study we report RNA-seq analyses of the transcriptomes of C. jejuni (NCTC11168) and its rpoN mutant. This has allowed the identification of hitherto unknown transcriptional units, and further defines the regulon that is dependent on rpoN for expression. The analysis of the NCTC11168 transcriptome was supplemented by additional proteomic analysis using liquid chromatography-MS. The transcriptomic and proteomic datasets represent an important resource for the Campylobacter research community. PMID:21816880

  10. General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.

    PubMed

    Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng

    2017-05-02

    As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.

  11. Single and two-shot quantitative phase imaging using Hilbert-Huang Transform based fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Trusiak, Maciej; Micó, Vicente; Patorski, Krzysztof; García-Monreal, Javier; Sluzewski, Lukasz; Ferreira, Carlos

    2016-08-01

    In this contribution we propose two Hilbert-Huang Transform based algorithms for fast and accurate single-shot and two-shot quantitative phase imaging, applicable in both on-axis and off-axis configurations. In the first scheme, a single fringe pattern containing information about the biological phase sample under study is adaptively pre-filtered using an empirical mode decomposition based approach. It is then phase demodulated by the Hilbert Spiral Transform, aided by Principal Component Analysis for local fringe orientation estimation. The orientation calculation enables efficient analysis of closed fringes and can be avoided by using an arbitrarily phase-shifted two-shot Gram-Schmidt orthonormalization scheme aided by Hilbert-Huang Transform pre-filtering. This two-shot approach is a trade-off between single-frame and temporal phase-shifting demodulation. The robustness of the proposed techniques is corroborated using experimental digital holographic microscopy studies of polystyrene micro-beads and red blood cells. Both algorithms compare favorably with the temporal phase-shifting scheme, which is used as a reference method.
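
    The Hilbert-based phase demodulation step can be illustrated in one dimension with SciPy's analytic-signal routine. This is a drastically simplified 1-D analogue (a single fringe cross-section, with no empirical mode decomposition, Hilbert spiral transform or PCA orientation estimation) of the 2-D algorithms proposed in the paper:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic 1-D fringe cross-section: off-axis carrier plus a smooth phase object
x = np.linspace(0, 1, 2000)
true_phase = 6 * np.exp(-((x - 0.5) / 0.15) ** 2)        # "sample-induced" phase, rad
carrier = 2 * np.pi * 40 * x                              # off-axis carrier fringes
fringes = 0.5 + 0.5 * np.cos(carrier + true_phase)

# Remove the background (a crude stand-in for the adaptive pre-filtering step),
# then demodulate with the analytic signal and subtract the carrier.
ac = fringes - fringes.mean()
analytic = hilbert(ac)
wrapped = np.angle(analytic)
recovered = np.unwrap(wrapped) - carrier
recovered -= recovered[0]                                  # remove constant offset

print("max abs error (rad):",
      np.max(np.abs(recovered - (true_phase - true_phase[0]))))
```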

  12. Quantitative, equal carbon response HSQC experiment, QEC-HSQC

    NASA Astrophysics Data System (ADS)

    Mäkelä, Valtteri; Helminen, Jussi; Kilpeläinen, Ilkka; Heikkinen, Sami

    2016-10-01

    Quantitative NMR has become increasingly useful and popular in recent years, with many new and emerging applications in metabolomics, quality control, reaction monitoring and other types of mixture analysis. Although 1D 1H NMR spectra are sensitive and simple to acquire, their low resolving power can be a limiting factor when analyzing complex mixtures. This drawback can be solved by observing a different type of nucleus offering improved resolution or with multidimensional experiments, such as HSQC. In this paper, we present a novel Quantitative, Equal Carbon HSQC (QEC-HSQC) experiment providing an equal response across different types of carbons regardless of the number of attached protons, in addition to a uniform response over a wide range of 1JCH couplings. This enables rapid quantification and integration over multiple signals without the need for complete resonance assignments and simplifies the integration of overlapping signals.

  13. Global, quantitative and dynamic mapping of protein subcellular localization.

    PubMed

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh

    2016-06-09

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.

  14. Quantitative Image Restoration in Bright Field Optical Microscopy.

    PubMed

    Gutiérrez-Medina, Braulio; Sánchez Miranda, Manuel de Jesús

    2017-11-07

    Bright field (BF) optical microscopy is regarded as a poor method to observe unstained biological samples due to intrinsically low image contrast. We introduce quantitative image restoration in bright field (QRBF), a digital image processing method that restores out-of-focus BF images of unstained cells. Our procedure is based on deconvolution, using a point spread function modeled from theory. By comparing with reference images of bacteria observed in fluorescence, we show that QRBF faithfully recovers shape and enables quantifying the size of individual cells, even from a single input image. We applied QRBF in a high-throughput image cytometer to assess shape changes in Escherichia coli during hyperosmotic shock, finding size heterogeneity. We demonstrate that QRBF is also applicable to eukaryotic cells (yeast). Altogether, digital restoration emerges as a straightforward alternative to methods designed to generate contrast in BF imaging for quantitative analysis. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
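
    The deconvolution step at the heart of QRBF can be sketched with a generic Richardson-Lucy restoration. The snippet below uses scikit-image and a Gaussian stand-in for the point spread function, so it illustrates the principle rather than reproducing the authors' theoretically modeled bright-field PSF:

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.restoration import richardson_lucy

# Synthetic out-of-focus bright-field frame: a low-contrast rod-shaped "cell"
image = np.ones((96, 96))
image[40:56, 20:76] = 0.6                       # dark object on a bright background

y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 3.0**2))     # Gaussian stand-in for the BF PSF
psf /= psf.sum()

blurred = convolve(image, psf, mode="nearest")

# Richardson-Lucy deconvolution restores object contrast from the blurred frame
restored = richardson_lucy(blurred, psf, 30, clip=False)
print("contrast before:", round(blurred.max() - blurred.min(), 3),
      "after:", round(restored.max() - restored.min(), 3))
```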

  15. Quantitative evaluation of bone resorption activity of osteoclast-like cells by measuring calcium phosphate resorbing area using incubator-facilitated and video-enhanced microscopy.

    PubMed

    Morimoto, Yoshitaka; Hoshino, Hironobu; Sakurai, Takashi; Terakawa, Susumu; Nagano, Akira

    2009-04-01

    Quantitative evaluation of the bone resorption activity of live osteoclast-like cells (OCLs) has not yet been reported. In this study, we observed the sequential morphological changes of OCLs and measured the calcium phosphate (CP) area resorbed by OCLs alone and with the addition of elcatonin, utilizing incubator-facilitated, video-enhanced microscopy. OCLs, which were obtained from a coculture of ddy-mouse osteoblastic cells and bone marrow cells, were cultured on CP-coated quartz cover slips. The CP-free area increased constantly for OCLs alone, whereas it did not increase after the addition of elcatonin. This study showed that analysis of the resorbed areas under the OCL body using this method enables sequential quantitative evaluation of bone resorption activity and of the effect of therapeutic agents on bone resorption in vitro.

  16. High-Throughput RT-PCR for small-molecule screening assays

    PubMed Central

    Bittker, Joshua A.

    2012-01-01

    Quantitative measurement of the levels of mRNA expression using real-time reverse transcription polymerase chain reaction (RT-PCR) has long been used for analyzing expression differences in tissue or cell lines of interest. This method has been used somewhat less frequently to measure the changes in gene expression due to perturbagens such as small molecules or siRNA. The availability of new instrumentation for liquid handling and real-time PCR analysis as well as the commercial availability of start-to-finish kits for RT-PCR has enabled the use of this method for high-throughput small-molecule screening on a scale comparable to traditional high-throughput screening (HTS) assays. This protocol focuses on the special considerations necessary for using quantitative RT-PCR as a primary small-molecule screening assay, including the different methods available for mRNA isolation and analysis. PMID:23487248
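
    When RT-PCR is used as a screening readout, compound-induced expression changes are typically reported as fold changes computed with the comparative Ct (ΔΔCt) method. A minimal sketch with illustrative Ct values, assuming roughly 100% amplification efficiency and not tied to any particular kit mentioned in the protocol:

```python
def fold_change_ddct(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Relative expression (treated vs. control) by the comparative Ct method.

    Each argument is the threshold cycle (Ct) for the target gene or the
    reference (housekeeping) gene in treated or control wells.
    """
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Illustrative values: the compound lowers target expression about 4-fold
print(fold_change_ddct(ct_target_treated=26.0, ct_ref_treated=18.0,
                       ct_target_control=24.0, ct_ref_control=18.0))
```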

  17. Energetic Passivity of the Human Ankle Joint.

    PubMed

    Lee, Hyunglae; Hogan, Neville

    2016-12-01

    Understanding the passive or nonpassive behavior of the neuromuscular system is important for the design and control of robots that physically interact with humans, since it provides quantitative information to secure coupled stability while maximizing performance. This has become more important than ever with the increasing demand for robotic technologies in neurorehabilitation. This paper presents a quantitative characterization of the passive and nonpassive behavior of the ankle in young healthy subjects, which provides a baseline for future studies in persons with neurological impairments and information for future developments of rehabilitation robots, such as exoskeletal devices and powered prostheses. Measurements using a wearable ankle robot actuating 2 degrees of freedom of the ankle, combined with curl analysis and passivity analysis, enabled characterization of both the quasi-static and steady-state dynamic behavior of the ankle, unavailable from single-DOF studies. Despite active neuromuscular control over a wide range of muscle activation, passive or dissipative ankle behavior predominated in young healthy subjects.

  18. Quantitative proteomics in cardiovascular research: global and targeted strategies

    PubMed Central

    Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun

    2014-01-01

    Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501

  19. Real-Time Mapping Spectroscopy on the Ground, in the Air, and in Space

    NASA Astrophysics Data System (ADS)

    Thompson, D. R.; Allwood, A.; Chien, S.; Green, R. O.; Wettergreen, D. S.

    2016-12-01

    Real-time data interpretation can benefit both remote in situ exploration and remote sensing. Basic analyses at the sensor can monitor instrument performance and reveal invisible science phenomena in real time. This promotes situational awareness for remote robotic explorers or campaign decision makers, enabling adaptive data collection, reduced downlink requirements, and coordinated multi-instrument observations. Fast analysis is ideal for mapping spectrometers providing unambiguous, quantitative geophysical measurements. This presentation surveys recent computational advances in real-time spectroscopic analysis for Earth science and planetary exploration. Spectral analysis at the sensor enables new operations concepts that significantly improve science yield. Applications include real-time detection of fugitive greenhouse emissions by airborne monitoring, real-time cloud screening and mineralogical mapping by orbital spectrometers, and adaptive measurement by the PIXL instrument on the Mars 2020 rover. Copyright 2016 California Institute of Technology. All Rights Reserved. We acknowledge support of the US Government, NASA, the Earth Science Division and Terrestrial Ecology program.

  20. The identification and characterization of non-coding and coding RNAs and their modified nucleosides by mass spectrometry

    PubMed Central

    Gaston, Kirk W; Limbach, Patrick A

    2014-01-01

    The analysis of ribonucleic acids (RNA) by mass spectrometry has been a valuable analytical approach for more than 25 years. In fact, mass spectrometry has become a method of choice for the analysis of modified nucleosides from RNA isolated out of biological samples. This review summarizes recent progress that has been made in both nucleoside and oligonucleotide mass spectral analysis. Applications of mass spectrometry in the identification, characterization and quantification of modified nucleosides are discussed. At the oligonucleotide level, advances in modern mass spectrometry approaches combined with the standard RNA modification mapping protocol enable the characterization of RNAs of varying lengths ranging from low molecular weight short interfering RNAs (siRNAs) to the extremely large 23 S rRNAs. New variations and improvements to this protocol are reviewed, including top-down strategies, as these developments now enable qualitative and quantitative measurements of RNA modification patterns in a variety of biological systems. PMID:25616408

  1. The identification and characterization of non-coding and coding RNAs and their modified nucleosides by mass spectrometry.

    PubMed

    Gaston, Kirk W; Limbach, Patrick A

    2014-01-01

    The analysis of ribonucleic acids (RNA) by mass spectrometry has been a valuable analytical approach for more than 25 years. In fact, mass spectrometry has become a method of choice for the analysis of modified nucleosides from RNA isolated out of biological samples. This review summarizes recent progress that has been made in both nucleoside and oligonucleotide mass spectral analysis. Applications of mass spectrometry in the identification, characterization and quantification of modified nucleosides are discussed. At the oligonucleotide level, advances in modern mass spectrometry approaches combined with the standard RNA modification mapping protocol enable the characterization of RNAs of varying lengths ranging from low molecular weight short interfering RNAs (siRNAs) to the extremely large 23 S rRNAs. New variations and improvements to this protocol are reviewed, including top-down strategies, as these developments now enable qualitative and quantitative measurements of RNA modification patterns in a variety of biological systems.

  2. 75 FR 373 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...

  3. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    PubMed

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.

  4. Fast Metabolic Response to Drug Intervention through Analysis on a Miniaturized, Highly Integrated Molecular Imaging System

    PubMed Central

    Wang, Jun; Hwang, Kiwook; Braas, Daniel; Dooraghi, Alex; Nathanson, David; Campbell, Dean O.; Gu, Yuchao; Sandberg, Troy; Mischel, Paul; Radu, Caius; Chatziioannou, Arion F.; Phelps, Michael E.; Christofk, Heather; Heath, James R.

    2014-01-01

    We report on a radiopharmaceutical imaging platform designed to capture the kinetics of cellular responses to drugs. Methods: A portable in vitro molecular imaging system, comprising a microchip and a beta-particle imaging camera, permits routine cell-based radioassays on small numbers of either suspension or adherent cells. We investigate the kinetics of [18F]fluorodeoxyglucose ([18F]FDG) uptake in model lymphoma and glioblastoma cancer cell lines following drug exposure. Those responses are correlated with kinetic changes in the cell cycle or with changes in receptor-tyrosine kinase signaling. Results: The platform enables radioassays directly on multiple cell types and yields results comparable to conventional approaches, but uses smaller sample sizes, permits a higher level of quantitation, and does not require cell lysis. Conclusion: The kinetic analysis enabled by the platform provides a rapid (~1 hour) drug screening assay. PMID:23978446

  5. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-08

    In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  6. Quantitative Imaging in Cancer Evolution and Ecology

    PubMed Central

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral Darwinian dynamics before and during therapy. Advances in image analysis will place clinical imaging in an increasingly central role in the development of evolution-based patient-specific cancer therapy. © RSNA, 2013 PMID:24062559

  7. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
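
    The regression-based quantitation referred to above can be sketched in a few lines: a calibration curve is fit to the peak-area ratio of endogenous (light) to SIS (heavy) peptide at known spiked concentrations, and the concentration in an unknown sample is read back from the fit. The numbers below are illustrative, not values from the kits described:

```python
import numpy as np

# Calibration standards: known concentrations (ng/mL) and measured
# light/heavy peak-area ratios for a given proteotypic peptide.
conc  = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
ratio = np.array([0.021, 0.098, 0.195, 1.02, 1.98])

# Linear standard curve: ratio = slope * concentration + intercept
slope, intercept = np.polyfit(conc, ratio, deg=1)

def quantify(measured_ratio):
    """Back-calculate concentration from a measured light/heavy ratio."""
    return (measured_ratio - intercept) / slope

print(f"slope={slope:.4f}, intercept={intercept:.4f}")
print("sample at ratio 0.50 ->", round(quantify(0.50), 1), "ng/mL")
```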

  8. Chiral Analysis of Isopulegol by Fourier Transform Molecular Rotational Spectroscopy

    NASA Astrophysics Data System (ADS)

    Evangelisti, Luca; Seifert, Nathan A.; Spada, Lorenzo; Pate, Brooks

    2016-06-01

    Chiral analysis of molecules with multiple chiral centers can be performed using pulsed-jet Fourier transform rotational spectroscopy. This analysis includes quantitative measurement of diastereomer products and, with the three wave mixing methods developed by Patterson, Schnell, and Doyle (Nature 497, 475-477 (2013)), quantitative determination of the enantiomeric excess of each diastereomer. The high-resolution capability makes it possible to perform the analysis directly on complex samples without the need for chromatographic separation. Isopulegol has been chosen to show the capabilities of Fourier transform rotational spectroscopy for chiral analysis. Broadband rotational spectroscopy produces spectra with signal-to-noise ratios exceeding 1000:1. The ability to identify low-abundance (0.1-1%) diastereomers in the sample will be described. Methods to rapidly identify rotational spectra from isotopologues at natural abundance will be shown, and the molecular structures obtained from this analysis will be compared to theory. The role that quantum chemistry calculations play in identifying structural minima and estimating their spectroscopic properties to aid spectral analysis will be described. Finally, the implementation of three wave mixing techniques to measure the enantiomeric excess of each diastereomer and determine the absolute configuration of the enantiomer in excess will be described.

  9. MATtrack: A MATLAB-Based Quantitative Image Analysis Platform for Investigating Real-Time Photo-Converted Fluorescent Signals in Live Cells.

    PubMed

    Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W

    2015-01-01

    We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.

  10. MATtrack: A MATLAB-Based Quantitative Image Analysis Platform for Investigating Real-Time Photo-Converted Fluorescent Signals in Live Cells

    PubMed Central

    Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W.; Gautier, Virginie W.

    2015-01-01

    We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip. PMID:26485569

  11. In vivo estimation of target registration errors during augmented reality laparoscopic surgery.

    PubMed

    Thompson, Stephen; Schneider, Crispin; Bosi, Michele; Gurusamy, Kurinchi; Ourselin, Sébastien; Davidson, Brian; Hawkes, David; Clarkson, Matthew J

    2018-06-01

    Successful use of augmented reality for laparoscopic surgery requires that the surgeon has a thorough understanding of the likely accuracy of any overlay. Whilst the accuracy of such systems can be estimated in the laboratory, it is difficult to extend such methods to the in vivo clinical setting. Herein we describe a novel method that enables the surgeon to estimate in vivo errors during use. We show that the method enables quantitative evaluation of in vivo data gathered with the SmartLiver image guidance system. The SmartLiver system utilises an intuitive display to enable the surgeon to compare the positions of landmarks visible in both a projected model and in the live video stream. From this the surgeon can estimate the system accuracy when using the system to locate subsurface targets not visible in the live video. Visible landmarks may be either point or line features. We test the validity of the algorithm using an anatomically representative liver phantom, applying simulated perturbations to achieve clinically realistic overlay errors. We then apply the algorithm to in vivo data. The phantom results show that using projected errors of surface features provides a reliable predictor of subsurface target registration error for a representative human liver shape. Applying the algorithm to in vivo data gathered with the SmartLiver image-guided surgery system shows that the system is capable of accuracies around 12 mm; however, achieving this reliably remains a significant challenge. We present an in vivo quantitative evaluation of the SmartLiver image-guided surgery system, together with a validation of the evaluation algorithm. This is the first quantitative in vivo analysis of an augmented reality system for laparoscopic surgery.
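
    A minimal sketch of the core idea, under assumed data structures (this is not the SmartLiver code): the error for each visible landmark is the distance between its position in the projected model overlay and its position identified in the live video, and the distribution of these errors is what the surgeon uses as a surrogate for the registration error at subsurface targets.

      import numpy as np

      def landmark_errors(projected_xy, video_xy):
          """Both arguments are (N, 2) arrays of matched landmark coordinates."""
          return np.linalg.norm(np.asarray(projected_xy, float) - np.asarray(video_xy, float), axis=1)

      # Hypothetical pixel coordinates for two landmarks.
      errors = landmark_errors([[100, 120], [210, 95]], [[104, 118], [205, 99]])
      print(errors.mean())  # mean visible-landmark overlay error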

  12. The design analysis of a rechargeable lithium cell for space applications

    NASA Technical Reports Server (NTRS)

    Subba Rao, S.; Shen, D. H.; Yen, S. P. S.; Somoano, R. B.

    1986-01-01

    Ambient temperature rechargeable lithium batteries are needed by NASA for advanced space power applications for future missions. Specific energies of not less than 100 Wh/kg and long cycle life are critical performance goals. A design analysis of a 35 Ah Li-TiS2 cell was carried out using literature and experimental data to identify key design parameters governing specific energy. It is found that high specific energies are achievable in prismatic cells, especially with the use of advanced hardware materials. There is a serious need for a greatly expanded engineering database in order to enable more quantitative design analysis.

  13. A joint analysis of the Drake equation and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Prantzos, Nikos

    2013-07-01

    I propose a unified framework for a joint analysis of the Drake equation and the Fermi paradox, which enables a simultaneous, quantitative study of both of them. The analysis is based on a simplified form of the Drake equation and on a fairly simple scheme for the colonization of the Milky Way. It appears that for sufficiently long-lived civilizations, colonization of the Galaxy is the only reasonable option to gain knowledge about other life forms. This argument allows one to define a region in the parameter space of the Drake equation, where the Fermi paradox definitely holds (`Strong Fermi paradox').
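
    For readers who want the starting point spelled out, the classic Drake equation is N = R* · fp · ne · fl · fi · fc · L (the paper works with a simplified form of it). The snippet below is purely illustrative, with arbitrary placeholder parameter values, and only shows how strongly the expected number of communicating civilizations N scales with the civilization lifetime L, the quantity central to the colonization argument.

      # Illustrative only: classic Drake equation with placeholder parameters.
      def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
          """Expected number of communicating civilizations in the Galaxy."""
          return R_star * f_p * n_e * f_l * f_i * f_c * L

      for L in (1e2, 1e4, 1e6):  # civilization lifetime in years
          print(L, drake(R_star=1.0, f_p=0.5, n_e=1.0, f_l=0.1, f_i=0.01, f_c=0.1, L=L))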

  14. A Description of the Clinical Proteomic Tumor Analysis Consortium (CPTAC) Common Data Analysis Pipeline

    PubMed Central

    Rudnick, Paul A.; Markey, Sanford P.; Roth, Jeri; Mirokhin, Yuri; Yan, Xinjian; Tchekhovskoi, Dmitrii V.; Edwards, Nathan J.; Thangudu, Ratna R.; Ketchum, Karen A.; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E.

    2016-01-01

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) has produced large proteomics datasets from the mass spectrometric interrogation of tumor samples previously analyzed by The Cancer Genome Atlas (TCGA) program. The availability of the genomic and proteomic data is enabling proteogenomic study for both reference (i.e., contained in major sequence databases) and non-reference markers of cancer. The CPTAC labs have focused on colon, breast, and ovarian tissues in the first round of analyses; spectra from these datasets were produced from 2D LC-MS/MS analyses and represent deep coverage. To reduce the variability introduced by disparate data analysis platforms (e.g., software packages, versions, parameters, sequence databases, etc.), the CPTAC Common Data Analysis Platform (CDAP) was created. The CDAP produces both peptide-spectrum-match (PSM) reports and gene-level reports. The pipeline processes raw mass spectrometry data according to the following: (1) Peak-picking and quantitative data extraction, (2) database searching, (3) gene-based protein parsimony, and (4) false discovery rate (FDR)-based filtering. The pipeline also produces localization scores for the phosphopeptide enrichment studies using the PhosphoRS program. Quantitative information for each of the datasets is specific to the sample processing, with PSM and protein reports containing the spectrum-level or gene-level (“rolled-up”) precursor peak areas and spectral counts for label-free or reporter ion log-ratios for 4plex iTRAQ™. The reports are available in simple tab-delimited formats and, for the PSM-reports, in mzIdentML. The goal of the CDAP is to provide standard, uniform reports for all of the CPTAC data, enabling comparisons between different samples and cancer types as well as across the major ‘omics fields. PMID:26860878

  15. A Description of the Clinical Proteomic Tumor Analysis Consortium (CPTAC) Common Data Analysis Pipeline.

    PubMed

    Rudnick, Paul A; Markey, Sanford P; Roth, Jeri; Mirokhin, Yuri; Yan, Xinjian; Tchekhovskoi, Dmitrii V; Edwards, Nathan J; Thangudu, Ratna R; Ketchum, Karen A; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E

    2016-03-04

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) has produced large proteomics data sets from the mass spectrometric interrogation of tumor samples previously analyzed by The Cancer Genome Atlas (TCGA) program. The availability of the genomic and proteomic data is enabling proteogenomic study for both reference (i.e., contained in major sequence databases) and nonreference markers of cancer. The CPTAC laboratories have focused on colon, breast, and ovarian tissues in the first round of analyses; spectra from these data sets were produced from 2D liquid chromatography-tandem mass spectrometry analyses and represent deep coverage. To reduce the variability introduced by disparate data analysis platforms (e.g., software packages, versions, parameters, sequence databases, etc.), the CPTAC Common Data Analysis Platform (CDAP) was created. The CDAP produces both peptide-spectrum-match (PSM) reports and gene-level reports. The pipeline processes raw mass spectrometry data according to the following: (1) peak-picking and quantitative data extraction, (2) database searching, (3) gene-based protein parsimony, and (4) false-discovery rate-based filtering. The pipeline also produces localization scores for the phosphopeptide enrichment studies using the PhosphoRS program. Quantitative information for each of the data sets is specific to the sample processing, with PSM and protein reports containing the spectrum-level or gene-level ("rolled-up") precursor peak areas and spectral counts for label-free or reporter ion log-ratios for 4plex iTRAQ. The reports are available in simple tab-delimited formats and, for the PSM-reports, in mzIdentML. The goal of the CDAP is to provide standard, uniform reports for all of the CPTAC data to enable comparisons between different samples and cancer types as well as across the major omics fields.
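
    Step (4), false-discovery-rate filtering, is commonly implemented with a target-decoy strategy; the sketch below illustrates that generic approach and is not the CDAP's actual code (the 1% threshold and the simple running estimate are assumptions for the example).

      def filter_psms(psms, fdr_threshold=0.01):
          """psms: iterable of (score, is_decoy). Accept target PSMs from the top of the
          score-ranked list until the running decoy/target ratio exceeds the FDR threshold."""
          accepted, targets, decoys = [], 0, 0
          for score, is_decoy in sorted(psms, key=lambda p: p[0], reverse=True):
              decoys += is_decoy
              targets += not is_decoy
              if targets and decoys / targets > fdr_threshold:
                  break
              if not is_decoy:
                  accepted.append(score)
          return accepted

      psms = [(95.2, False), (93.1, False), (90.4, True), (88.7, False), (70.0, True)]
      print(len(filter_psms(psms)))  # number of target PSMs retained in this toy example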

  16. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    PubMed Central

    Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028

  17. Advances in multiplexed MRM-based protein biomarker quantitation toward clinical utility.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Hardie, Darryl B; Borchers, Christoph H

    2014-05-01

    Accurate and rapid protein quantitation is essential for screening biomarkers for disease stratification and monitoring, and for validating the hundreds of putative markers in human biofluids, including blood plasma. An analytical method that utilizes stable isotope-labeled standard (SIS) peptides and selected/multiple reaction monitoring-mass spectrometry (SRM/MRM-MS) has emerged as a promising technique for determining protein concentrations. This targeted approach has analytical merit, but its true potential (in terms of sensitivity and multiplexing) has yet to be realized. Described herein is a method that extends the multiplexing ability of the MRM method to enable the quantitation of 142 high-to-moderate-abundance proteins (from 31 mg/mL to 44 ng/mL) in undepleted and non-enriched human plasma in a single run. The proteins have been reported to be associated with a wide variety of non-communicable diseases (NCDs), from cardiovascular disease (CVD) to diabetes. The concentrations of these proteins in human plasma are inferred from interference-free peptides functioning as molecular surrogates (2 peptides per protein, on average). A revised data analysis strategy, involving the linear regression equation of normal control plasma, has been instituted to enable facile application to patient samples, as demonstrated in separate nutrigenomics and CVD studies. The exceptional robustness of the LC/MS platform and the quantitative method, as well as its high throughput, makes the assay suitable for application to patient samples for the verification of a condensed or complete protein panel. This article is part of a Special Issue entitled: Biomarkers: A Proteomic Challenge. © 2013.
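
    As a generic illustration of how a concentration is read off from an SIS-based MRM measurement (this is a textbook standard-curve calculation, not the revised strategy described in the abstract): the endogenous/SIS peak-area ratio of a surrogate peptide is converted to a concentration through a linear calibration fitted to standards of known concentration. All numbers below are hypothetical.

      import numpy as np

      def fit_calibration(known_conc, measured_ratio):
          """Least-squares fit of ratio = slope * concentration + intercept."""
          slope, intercept = np.polyfit(known_conc, measured_ratio, deg=1)
          return slope, intercept

      def infer_concentration(ratio, slope, intercept):
          return (ratio - intercept) / slope

      slope, intercept = fit_calibration([1, 10, 100, 1000], [0.011, 0.098, 1.02, 9.8])
      print(infer_concentration(0.5, slope, intercept))  # concentration in the standards' units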

  18. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.

    Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
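
    As a rough illustration of this kind of image analysis (not the published algorithm), the sketch below estimates a surface-fouling fraction from a photograph of a stained coupon by local contrast enhancement and automatic thresholding; scikit-image and a grayscale image scaled to [0, 1] are assumed.

      from skimage import exposure, filters

      def fouling_fraction(gray_image):
          """gray_image: 2D float array in [0, 1] of a stained-surface photograph."""
          enhanced = exposure.equalize_adapthist(gray_image)  # boost stain contrast
          threshold = filters.threshold_otsu(enhanced)        # separate stain from background
          return float((enhanced > threshold).mean())         # fraction of pixels classified as fouled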

  19. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

    Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  20. Quantitation of heat-shock proteins in clinical samples using mass spectrometry.

    PubMed

    Kaur, Punit; Asea, Alexzander

    2011-01-01

    Mass spectrometry (MS) is a powerful analytical tool for proteomics research and drug and biomarker discovery. MS enables identification and quantification of known and unknown compounds by revealing their structural and chemical properties. Proper sample preparation for MS-based analysis is a critical step in the proteomics workflow because the quality and reproducibility of sample extraction and preparation for downstream analysis significantly impact the separation and identification capabilities of mass spectrometers. The highly expressed proteins represent potential biomarkers that could aid in diagnosis, therapy, or drug development. Because the proteome is so complex, there is no one standard method for preparing protein samples for MS analysis. Protocols differ depending on the type of sample, source, experiment, and method of analysis. Molecular chaperones play significant roles in almost all biological functions due to their capacity for detecting intracellular denatured/unfolded proteins, initiating refolding or denaturation of such malfolded protein sequences and more recently for their role in the extracellular milieu as chaperokines. In this chapter, we describe the latest techniques for quantitating the expression of molecular chaperones in human clinical samples.

  1. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.

  2. Simultaneous achiral-chiral analysis of pharmaceutical compounds using two-dimensional reversed phase liquid chromatography-supercritical fluid chromatography.

    PubMed

    Venkatramani, C J; Al-Sayah, Mohammad; Li, Guannan; Goel, Meenakshi; Girotti, James; Zang, Lisa; Wigman, Larry; Yehl, Peter; Chetwyn, Nik

    2016-02-01

    A new interface was designed to enable the coupling of reversed phase liquid chromatography (RPLC) and supercritical fluid chromatography (SFC). This online two-dimensional chromatographic system utilizing RPLC in the first dimension and SFC in the second was developed to achieve simultaneous achiral and chiral analysis of pharmaceutical compounds. The interface consists of an eight-port, dual-position switching valve with small-volume C-18 trapping columns. The peaks of interest eluting from the first-dimension RPLC column were effectively focused as sharp concentration pulses on small-volume C-18 trapping column(s) and then injected onto the second-dimension SFC column. The first-dimension RPLC separation provides the achiral purity result, and the second-dimension SFC separation provides the chiral purity result (enantiomeric excess). The results are quantitative, enabling simultaneous achiral and chiral analysis of compounds. The interface design and a proof-of-concept demonstration are presented. Additionally, comparative studies against conventional SFC and case studies of the application of 2D LC-SFC in pharmaceutical analysis are presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Metal speciation of environmental samples using SPE and SFC-AED analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, S.C.; Burford, M.D.; Robson, M.

    1995-12-31

    Due to growing public concern over heavy metals in the environment, soil, water and air particulate samples are now routinely screened for their metal content. Conventional metal analysis typically involves acid digestion extraction and results in the generation of large volumes of aqueous and organic solvent waste. This harsh extraction process is usually used to obtain the total metal content of the sample, the extract being analysed by atomic emission or absorption spectroscopy techniques. A more selective method of metal extraction has been investigated which uses a supercritical fluid modified with a complexing agent. The relatively mild extraction method enables both organometallic and inorganic metal species to be recovered intact. The various components from the supercritical fluid extract can be chromatographically separated using supercritical fluid chromatography (SFC) and positive identification of the metals achieved using atomic emission detection (AED). The aim of the study is to develop an analytical extraction procedure which enables a rapid, sensitive and quantitative analysis of metals in environmental samples, using just one extraction (e.g. SFE) and one analysis (e.g. SFC-AED) procedure.

  4. 3D Material Response Analysis of PICA Pyrolysis Experiments

    NASA Technical Reports Server (NTRS)

    Oliver, A. Brandon

    2017-01-01

    The PICA decomposition experiments of Bessire and Minton are investigated using 3D material response analysis. The steady thermoelectric equations have been added to the CHAR code to enable analysis of the Joule-heated experiments and the DAKOTA optimization code is used to define the voltage boundary condition that yields the experimentally observed temperature response. This analysis has identified a potential spatial non-uniformity in the PICA sample temperature driven by the cooled copper electrodes and thermal radiation from the surface of the test article (Figure 1). The non-uniformity leads to a variable heating rate throughout the sample volume that has an effect on the quantitative results of the experiment. Averaging the results of integrating a kinetic reaction mechanism with the heating rates seen across the sample volume yield a shift of peak species production to lower temperatures that is more significant for higher heating rates (Figure 2) when compared to integrating the same mechanism at the reported heating rate. The analysis supporting these conclusions will be presented along with a proposed analysis procedure that permits quantitative use of the existing data. Time permitting, a status on the in-development kinetic decomposition mechanism based on this data will be presented as well.

  5. Quantitative computer-aided diagnostic algorithm for automated detection of peak lesion attenuation in differentiating clear cell from papillary and chromophobe renal cell carcinoma, oncocytoma, and fat-poor angiomyolipoma on multiphasic multidetector computed tomography.

    PubMed

    Coy, Heidi; Young, Jonathan R; Douek, Michael L; Brown, Matthew S; Sayre, James; Raman, Steven S

    2017-07-01

    To evaluate the performance of a novel, quantitative computer-aided diagnostic (CAD) algorithm on four-phase multidetector computed tomography (MDCT) to detect peak lesion attenuation to enable differentiation of clear cell renal cell carcinoma (ccRCC) from chromophobe RCC (chRCC), papillary RCC (pRCC), oncocytoma, and fat-poor angiomyolipoma (fp-AML). We queried our clinical databases to obtain a cohort of histologically proven renal masses with preoperative MDCT with four phases [unenhanced (U), corticomedullary (CM), nephrographic (NP), and excretory (E)]. A whole lesion 3D contour was obtained in all four phases. The CAD algorithm determined a region of interest (ROI) of peak lesion attenuation within the 3D lesion contour. For comparison, a manual ROI was separately placed in the most enhancing portion of the lesion by visual inspection for a reference standard, and in uninvolved renal cortex. Relative lesion attenuation for both CAD and manual methods was obtained by normalizing the CAD peak lesion attenuation ROI (and the reference standard manually placed ROI) to uninvolved renal cortex with the formula [(peak lesion attenuation ROI - cortex ROI)/cortex ROI] × 100%. ROC analysis and area under the curve (AUC) were used to assess diagnostic performance. Bland-Altman analysis was used to compare peak ROI between CAD and manual method. The study cohort comprised 200 patients with 200 unique renal masses: 106 (53%) ccRCC, 32 (16%) oncocytomas, 18 (9%) chRCCs, 34 (17%) pRCCs, and 10 (5%) fp-AMLs. In the CM phase, CAD-derived ROI enabled characterization of ccRCC from chRCC, pRCC, oncocytoma, and fp-AML with AUCs of 0.850 (95% CI 0.732-0.968), 0.959 (95% CI 0.930-0.989), 0.792 (95% CI 0.716-0.869), and 0.825 (95% CI 0.703-0.948), respectively. On Bland-Altman analysis, there was excellent agreement of CAD and manual methods with mean differences between 14 and 26 HU in each phase. A novel, quantitative CAD algorithm enabled robust peak HU lesion detection and discrimination of ccRCC from other renal lesions with similar performance compared to the manual method.
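
    The normalization quoted in the abstract can be transcribed directly as a small helper; the Hounsfield-unit values in the usage line are hypothetical.

      def relative_attenuation(peak_lesion_hu, cortex_hu):
          """Relative lesion attenuation: [(peak lesion ROI - cortex ROI) / cortex ROI] * 100%."""
          return (peak_lesion_hu - cortex_hu) / cortex_hu * 100.0

      print(relative_attenuation(peak_lesion_hu=180.0, cortex_hu=150.0))  # -> 20.0 (%)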

  6. Microscopic optical path length difference and polarization measurement system for cell analysis

    NASA Astrophysics Data System (ADS)

    Satake, H.; Ikeda, K.; Kowa, H.; Hoshiba, T.; Watanabe, E.

    2018-03-01

    In recent years, noninvasive, nonstaining, and nondestructive quantitative cell measurement techniques have become increasingly important in the medical field. These cell measurement techniques enable the quantitative analysis of living cells, and are therefore applied to various cell identification processes, such as those determining the passage number limit during cell culturing in regenerative medicine. To enable cell measurement, we developed a quantitative microscopic phase imaging system based on a Mach-Zehnder interferometer that measures the optical path length difference distribution without phase unwrapping using optical phase locking. The applicability of our phase imaging system was demonstrated by successful identification of breast cancer cells amongst normal cells. However, the cell identification method using this phase imaging system exhibited a false identification rate of approximately 7%. In this study, we implemented a polarimetric imaging system by introducing a polarimetric module to one arm of the Mach-Zehnder interferometer of our conventional phase imaging system. This module was comprised of a quarter wave plate and a rotational polarizer on the illumination side of the sample, and a linear polarizer on the optical detector side. In addition, we developed correction methods for the measurement errors of the optical path length and birefringence phase differences that arose through the influence of elements other than cells, such as the Petri dish. As the Petri dish holding the fluid specimens was transparent, it did not affect the amplitude information; however, the optical path length and birefringence phase differences were affected. Therefore, we proposed correction of the optical path length and birefringence phase for the influence of elements other than cells, as a prerequisite for obtaining highly precise phase and polarimetric images.

  7. Full-field optical coherence microscopy is a novel technique for imaging enteric ganglia in the gastrointestinal tract

    PubMed Central

    CORON, E.; AUKSORIUS, E.; PIERETTI, A.; MAHÉ, M. M.; LIU, L.; STEIGER, C.; BROMBERG, Y.; BOUMA, B.; TEARNEY, G.; NEUNLIST, M.; GOLDSTEIN, A. M.

    2013-01-01

    Background Noninvasive methods are needed to improve the diagnosis of enteric neuropathies. Full-field optical coherence microscopy (FFOCM) is a novel optical microscopy modality that can acquire 1 μm resolution images of tissue. The objective of this research was to demonstrate FFOCM imaging for the characterization of the enteric nervous system (ENS). Methods Normal mice and EdnrB−/− mice, a model of Hirschsprung’s disease (HD), were imaged in three-dimensions ex vivo using FFOCM through the entire thickness and length of the gut. Quantitative analysis of myenteric ganglia was performed on FFOCM images obtained from whole-mount tissues and compared with immunohistochemistry imaged by confocal microscopy. Key Results Full-field optical coherence microscopy enabled visualization of the full thickness gut wall from serosa to mucosa. Images of the myenteric plexus were successfully acquired from the stomach, duodenum, colon, and rectum. Quantification of ganglionic neuronal counts on FFOCM images revealed strong interobserver agreement and identical values to those obtained by immunofluorescence microscopy. In EdnrB−/− mice, FFOCM analysis revealed a significant decrease in ganglia density along the colorectum and a significantly lower density of ganglia in all colorectal segments compared with normal mice. Conclusions & Inferences Full-field optical coherence microscopy enables optical microscopic imaging of the ENS within the bowel wall along the entire intestine. FFOCM is able to differentiate ganglionic from aganglionic colon in a mouse model of HD, and can provide quantitative assessment of ganglionic density. With further refinements that enable bowel wall imaging in vivo, this technology has the potential to revolutionize the characterization of the ENS and the diagnosis of enteric neuropathies. PMID:23106847

  8. Digital biology and chemistry.

    PubMed

    Witters, Daan; Sun, Bing; Begolo, Stefano; Rodriguez-Manzano, Jesus; Robles, Whitney; Ismagilov, Rustem F

    2014-09-07

    This account examines developments in "digital" biology and chemistry within the context of microfluidics, from a personal perspective. Using microfluidics as a frame of reference, we identify two areas of research within digital biology and chemistry that are of special interest: (i) the study of systems that switch between discrete states in response to changes in chemical concentration of signals, and (ii) the study of single biological entities such as molecules or cells. In particular, microfluidics accelerates analysis of switching systems (i.e., those that exhibit a sharp change in output over a narrow range of input) by enabling monitoring of multiple reactions in parallel over a range of concentrations of signals. Conversely, such switching systems can be used to create new kinds of microfluidic detection systems that provide "analog-to-digital" signal conversion and logic. Microfluidic compartmentalization technologies for studying and isolating single entities can be used to reconstruct and understand cellular processes, study interactions between single biological entities, and examine the intrinsic heterogeneity of populations of molecules, cells, or organisms. Furthermore, compartmentalization of single cells or molecules in "digital" microfluidic experiments can induce switching in a range of reaction systems to enable sensitive detection of cells or biomolecules, such as with digital ELISA or digital PCR. This "digitizing" offers advantages in terms of robustness, assay design, and simplicity because quantitative information can be obtained with qualitative measurements. While digital formats have been shown to improve the robustness of existing chemistries, we anticipate that in the future they will enable new chemistries to be used for quantitative measurements, and that digital biology and chemistry will continue to provide further opportunities for measuring biomolecules, understanding natural systems more deeply, and advancing molecular and cellular analysis. Microfluidics will impact digital biology and chemistry and will also benefit from them if it becomes massively distributed.
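
    The point that qualitative (positive/negative) read-outs yield quantitative information can be made concrete with the Poisson correction routinely used in digital PCR and digital ELISA; the partition counts below are hypothetical.

      import math

      def mean_occupancy(n_positive, n_total):
          """Mean number of molecules per partition from the positive fraction: -ln(1 - p)."""
          return -math.log(1.0 - n_positive / n_total)

      lam = mean_occupancy(n_positive=6000, n_total=20000)
      print(lam, lam * 20000)  # per-partition occupancy and estimated total molecules loaded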

  9. Customized Molecular Phenotyping by Quantitative Gene Expression and Pattern Recognition Analysis

    PubMed Central

    Akilesh, Shreeram; Shaffer, Daniel J.; Roopenian, Derry

    2003-01-01

    Description of the molecular phenotypes of pathobiological processes in vivo is a pressing need in genomic biology. We have implemented a high-throughput real-time PCR strategy to establish quantitative expression profiles of a customized set of target genes. It enables rapid, reproducible data acquisition from limited quantities of RNA, permitting serial sampling of mouse blood during disease progression. We developed an easy to use statistical algorithm—Global Pattern Recognition—to readily identify genes whose expression has changed significantly from healthy baseline profiles. This approach provides unique molecular signatures for rheumatoid arthritis, systemic lupus erythematosus, and graft versus host disease, and can also be applied to defining the molecular phenotype of a variety of other normal and pathological processes. PMID:12840047
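
    A simplified, hypothetical reading of the Global Pattern Recognition idea (this sketch is not the authors' implementation): each gene serves in turn as the normalizer, every other gene's normalized Ct values are compared between control and disease groups, and a gene's score is the fraction of normalizers against which its expression changes significantly.

      import numpy as np
      from scipy import stats

      def gpr_scores(ct_control, ct_disease, alpha=0.05):
          """ct_*: (samples, genes) arrays of raw Ct values from real-time PCR."""
          n_genes = ct_control.shape[1]
          scores = np.zeros(n_genes)
          for norm in range(n_genes):
              d_ctrl = ct_control - ct_control[:, [norm]]  # normalize every gene to gene `norm`
              d_dis = ct_disease - ct_disease[:, [norm]]
              for g in range(n_genes):
                  if g != norm:
                      _, p = stats.ttest_ind(d_ctrl[:, g], d_dis[:, g])
                      scores[g] += p < alpha
          return scores / (n_genes - 1)  # fraction of normalizers flagging each gene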

  10. Methodology for determining the investment attractiveness of construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana

    2018-03-01

    The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.

  11. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  12. Focus characterization at an X-ray free-electron laser by coherent scattering and speckle analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sikorski, Marcin; Song, Sanghoon; Schropp, Andreas

    2015-04-14

    X-ray focus optimization and characterization based on coherent scattering and quantitative speckle size measurements were demonstrated at the Linac Coherent Light Source. Its performance as a single-pulse free-electron laser beam diagnostic was tested for two typical focusing configurations. The results derived from the speckle size/shape analysis show the effectiveness of this technique in finding the location, size and shape of the focus. In addition, unlike techniques that require scanning and averaging, its single-pulse compatibility enables users to capture pulse-to-pulse fluctuations in focus properties.

  13. Atlas of computerized blood flow analysis in bone disease.

    PubMed

    Gandsman, E J; Deutsch, S D; Tyson, I B

    1983-11-01

    The role of computerized blood flow analysis in routine bone scanning is reviewed. Cases illustrating the technique include proven diagnoses of toxic synovitis, Legg-Perthes disease, arthritis, avascular necrosis of the hip, fractures, benign and malignant tumors, Paget's disease, cellulitis, osteomyelitis, and shin splints. Several examples also show the use of the technique in monitoring treatment. The use of quantitative data from the blood flow, bone uptake phase, and static images suggests specific diagnostic patterns for each of the diseases presented in this atlas. Thus, this technique enables increased accuracy in the interpretation of the radionuclide bone scan.

  14. Analysis of fatty acids by graphite plate laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Park, K H; Kim, H J

    2001-01-01

    Fatty acids obtained from triglycerides (triolein, tripalmitin), foods (milk, corn oil), and phospholipids (phosphatidylcholine, phosphatidylserine, phosphatidic acid) upon alkaline hydrolysis were observed directly, without derivatization, by graphite plate laser desorption/ionization time-of-flight mass spectrometry (GPLDI-TOFMS). Mass-to-charge ratios predicted for sodium adducts of expected fatty acids (e.g. palmitic, oleic, linoleic and arachidonic acids) were observed without interference. Although at present no quantitation is possible, the graphite plate method enables a simple and rapid qualitative analysis of fatty acids. Copyright 2001 John Wiley & Sons, Ltd.
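
    As a worked check of the predicted mass-to-charge ratios, assuming the sodium adducts are of the [M+Na]+ type: m/z is the neutral monoisotopic mass plus the mass of a sodium cation (about 22.989). The fatty-acid masses below are standard monoisotopic values.

      NA_CATION = 22.98922  # mass of Na (22.98977) minus one electron

      fatty_acids = {          # neutral monoisotopic masses
          "palmitic (C16:0)": 256.2402,
          "oleic (C18:1)": 282.2559,
          "linoleic (C18:2)": 280.2402,
          "arachidonic (C20:4)": 304.2402,
      }
      for name, mass in fatty_acids.items():
          print(name, round(mass + NA_CATION, 4))  # predicted [M+Na]+ m/z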

  15. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  16. Label-free protein profiling of formalin-fixed paraffin-embedded (FFPE) heart tissue reveals immediate mitochondrial impairment after ionising radiation.

    PubMed

    Azimzadeh, Omid; Scherthan, Harry; Yentrapalli, Ramesh; Barjaktarovic, Zarko; Ueffing, Marius; Conrad, Marcus; Neff, Frauke; Calzada-Wack, Julia; Aubele, Michaela; Buske, Christian; Atkinson, Michael J; Hauck, Stefanie M; Tapio, Soile

    2012-04-18

    Qualitative proteome profiling of formalin-fixed, paraffin-embedded (FFPE) tissue is advancing the field of clinical proteomics. However, quantitative proteome analysis of FFPE tissue is hampered by the lack of an efficient labelling method. The usage of conventional protein labelling on FFPE tissue has turned out to be inefficient. Classical labelling targets lysine residues that are blocked by the formalin treatment. The aim of this study was to establish a quantitative proteomics analysis of FFPE tissue by combining the label-free approach with optimised protein extraction and separation conditions. As a model system we used FFPE heart tissue of control and exposed C57BL/6 mice after total body irradiation using a gamma ray dose of 3 gray. We identified 32 deregulated proteins (p≤0.05) in irradiated hearts 24h after the exposure. The proteomics data were further evaluated and validated by bioinformatics and immunoblotting investigation. In good agreement with our previous results using fresh-frozen tissue, the analysis indicated radiation-induced alterations in three main biological pathways: respiratory chain, lipid metabolism and pyruvate metabolism. The label-free approach enables the quantitative measurement of radiation-induced alterations in FFPE tissue and facilitates retrospective biomarker identification using clinical archives. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  18. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to simultaneously detect the isotopically labeled peptides in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing any variation caused by separate experiments. However, only a few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins with only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summarization using classification-based methodologies, such as Gaussian mixture models fitted with EM algorithms, their Bayesian counterparts, and K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of the number of peptide hits per protein, rescuing many proteins that would otherwise be discarded. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
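
    A minimal sketch of the mixture-model idea these classifiers build on (not the published method, which also covers Bayesian estimation, K-means and PSO-driven fits): fit a two-component Gaussian mixture to the log2 SILAC ratios and flag proteins assigned to the component shifted away from zero. The component count and the "nearest to zero means unchanged" rule are illustrative assumptions.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def classify_ratios(log2_ratios):
          """Return True for proteins assigned to the 'changed' mixture component."""
          x = np.asarray(log2_ratios, dtype=float).reshape(-1, 1)
          gmm = GaussianMixture(n_components=2, random_state=0).fit(x)
          unchanged = np.argmin(np.abs(gmm.means_.ravel()))  # component centred nearest zero
          return gmm.predict(x) != unchanged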

  19. Multidimensional electrostatic repulsion-hydrophilic interaction chromatography (ERLIC) for quantitative analysis of the proteome and phosphoproteome in clinical and biomedical research.

    PubMed

    Loroch, Stefan; Schommartz, Tim; Brune, Wolfram; Zahedi, René Peiman; Sickmann, Albert

    2015-05-01

    Quantitative proteomics and phosphoproteomics have become key disciplines in understanding cellular processes. Fundamental research can be done using cell culture, which provides researchers with virtually infinite sample amounts. In contrast, clinical, pre-clinical and biomedical research is often restricted to minute sample amounts and requires an efficient analysis with only micrograms of protein. To address this issue, we generated a highly sensitive workflow for combined LC-MS-based quantitative proteomics and phosphoproteomics by refining an ERLIC-based 2D phosphoproteomics workflow into an ERLIC-based 3D workflow covering the global proteome as well. The resulting 3D strategy was successfully used for an in-depth quantitative analysis of both the proteome and the phosphoproteome of murine cytomegalovirus-infected mouse fibroblasts, a model system for host cell manipulation by a virus. In a 2-plex SILAC experiment with 150 μg of a tryptic digest per condition, the 3D strategy enabled the quantification of ~75% more proteins and even ~134% more peptides compared to the 2D strategy. Additionally, we could quantify ~50% more phosphoproteins by non-phosphorylated peptides, concurrently yielding insights into changes at the levels of protein expression and phosphorylation. Besides its sensitivity, our novel three-dimensional ERLIC strategy has the potential for semi-automated sample processing, making it a promising option for clinical, pre-clinical and biomedical research. Copyright © 2015. Published by Elsevier B.V.

  20. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    PubMed

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation for the analysis of quantitative proteomics data. We provide key statistical concepts that help researchers design proteomics experiments, and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots allow easy inspection and detection of anomalies in the data and flaws in the data analysis, enabling deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher-quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users who aim to automate MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.
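
    MSqRob itself is an R package built around peptide-level robust ridge regression; as a language-neutral illustration of the underlying idea (and not MSqRob's actual model), the sketch below estimates a shrunken condition effect jointly with peptide-specific effects using ordinary ridge regression. A robust fit that down-weights outlying peptides would bring this closer to the published method.

      import numpy as np
      from sklearn.linear_model import Ridge

      def protein_log2fc(log2_intensity, peptide_id, condition):
          """log2_intensity: 1D array of peptide intensities for one protein;
          peptide_id: integer labels; condition: 0/1 group labels.
          Returns the ridge-shrunken condition effect (log2 fold change)."""
          peptide_id = np.asarray(peptide_id)
          condition = np.asarray(condition, dtype=float)
          peptides = np.unique(peptide_id)
          X = np.column_stack(
              [condition] + [(peptide_id == p).astype(float) for p in peptides])
          model = Ridge(alpha=1.0).fit(X, log2_intensity)
          return model.coef_[0]  # coefficient of the condition column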

  1. Systems cell biology

    PubMed Central

    Mast, Fred D.; Ratushny, Alexander V.

    2014-01-01

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336

  2. Experimental verification of Pyragas-Schöll-Fiedler control.

    PubMed

    von Loewenich, Clemens; Benner, Hartmut; Just, Wolfram

    2010-09-01

    We present an experimental realization of time-delayed feedback control proposed by Schöll and Fiedler. The scheme enables us to stabilize torsion-free periodic orbits in autonomous systems, and to overcome the so-called odd number limitation. The experimental control performance is in quantitative agreement with the bifurcation analysis of simple model systems. The results uncover some general features of the control scheme which are deemed to be relevant for a large class of setups.

  3. Quantitative Imaging In Pathology (QUIP) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    This site hosts web-accessible applications, tools and data designed to support analysis, management, and exploration of whole slide tissue images for cancer research. The following tools are included: caMicroscope: A digital pathology data management and visualization platform that enables interactive viewing of whole slide tissue images and segmentation results. caMicroscope can also be used independently of QUIP. FeatureExplorer: An interactive tool to allow patient-level feature exploration across multiple dimensions.

  4. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi aggregate samples. The effect of the orientation of the dimer with respect to the polarization state of the laser light and the effect of the particle gap size on the Raman signal intensity is observed. Additionally, calculations are performed to simulate the electric near field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.
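
    For reference, the enhancement factor quoted above is conventionally defined as the per-molecule SERS signal relative to the per-molecule normal Raman signal, EF = (I_SERS/N_SERS)/(I_ref/N_ref); the numbers in the example are hypothetical, and the exact normalization used by the authors may differ.

      def sers_enhancement_factor(i_sers, n_sers, i_ref, n_ref):
          """Per-molecule SERS intensity divided by per-molecule reference intensity."""
          return (i_sers / n_sers) / (i_ref / n_ref)

      print(sers_enhancement_factor(i_sers=5.0e4, n_sers=1.0e6, i_ref=1.0e3, n_ref=1.0e10))  # ~5e5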

  5. Amorphous solid dispersions of piroxicam and Soluplus(®): Qualitative and quantitative analysis of piroxicam recrystallization during storage.

    PubMed

    Lust, Andres; Strachan, Clare J; Veski, Peep; Aaltonen, Jaakko; Heinämäki, Jyrki; Yliruusi, Jouko; Kogermann, Karin

    2015-01-01

    The conversion of an active pharmaceutical ingredient (API) from amorphous to crystalline form is the primary stability issue in formulating amorphous solid dispersions (SDs). The aim of the present study was to carry out qualitative and quantitative analysis of the physical solid-state stability of the SDs of poorly water-soluble piroxicam (PRX) and polyvinyl caprolactam-polyvinyl acetate-polyethylene glycol graft copolymer (Soluplus(®)). The SDs were prepared by a solvent evaporation method and stored for six months at 0% RH/6 °C, 0% RH/25 °C, 40% RH/25 °C and 75% RH/25 °C. Fourier transform infrared spectroscopy equipped with an attenuated total reflection accessory (ATR-FTIR) and Raman spectroscopy were used for characterizing the physical solid-state changes and drug-polymer interactions. Principal component analysis (PCA) and multivariate curve resolution-alternating least squares (MCR-ALS) were used for the qualitative and quantitative analysis of Raman spectra collected during storage. When stored at 0% RH/6 °C and at 0% RH/25 °C, PRX in SDs remained in an amorphous form, since no recrystallization was observed by ATR-FTIR and Raman spectroscopy. Raman spectroscopy coupled with PCA and MCR-ALS, together with ATR-FTIR spectroscopy, enabled detection of the recrystallization of amorphous PRX in the samples stored at higher humidity. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Quantitative determination of low-Z elements in single atmospheric particles on boron substrates by automated scanning electron microscopy-energy-dispersive X-ray spectrometry.

    PubMed

    Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René

    2005-09-01

    Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that highly accurate results can be obtained using the automatic data processing described here compared to conventional methods.

  7. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  8. Nanoscale nuclear architecture for cancer diagnosis beyond pathology via spatial-domain low-coherence quantitative phase microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Pin; Bista, Rajan K.; Khalbuss, Walid E.; Qiu, Wei; Uttam, Shikhar; Staton, Kevin; Zhang, Lin; Brentnall, Teresa A.; Brand, Randall E.; Liu, Yang

    2010-11-01

    Definitive diagnosis of malignancy is often challenging due to limited availability of human cell or tissue samples and morphological similarity with certain benign conditions. Our recently developed technology, spatial-domain low-coherence quantitative phase microscopy (SL-QPM), overcomes these technical difficulties and enables us to obtain quantitative information about cell nuclear architectural characteristics with nanoscale sensitivity. We explore its ability to improve the identification of malignancy, especially in cytopathologically non-cancerous-appearing cells. We perform proof-of-concept experiments with an animal model of colorectal carcinogenesis, the APC(Min) mouse model, and with human cytology specimens of colorectal cancer. We show the ability of in situ nanoscale nuclear architectural characteristics to identify cancerous cells, especially those labeled as "indeterminate or normal" by expert cytopathologists. Our approach is based on the quantitative analysis of the cell nucleus on the original cytology slides without additional processing, which can be readily applied in a conventional clinical setting. Our simple and practical optical microscopy technique may lead to the development of novel methods for early detection of cancer.

  9. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.

    1989-07-01

    To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) to be used in quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine. SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients, respectively, with SDAT and all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake.
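
    As a minimal illustration of the categorization rule quoted above, the sketch below applies the two reported ratio cut-offs (parietal/cerebellar < 0.60 and parietal/mean cortical < 0.90) to hypothetical ROI uptake values; the variable names and numbers are placeholders, not data from the study.

```python
import numpy as np

# Hypothetical per-subject uptake values (arbitrary units); placeholders only.
cortical_rois = np.array([1.05, 0.98, 1.10, 0.95] * 6)  # 24 cortical ROIs
parietal_uptake = 0.55                                   # mean parietal ROI uptake
cerebellar_uptake = 1.00                                 # mean cerebellar ROI uptake

mean_cortical = cortical_rois.mean()

ratio_parietal_cerebellar = parietal_uptake / cerebellar_uptake
ratio_parietal_cortical = parietal_uptake / mean_cortical

# Cut-offs reported in the abstract; each ratio is used as a separate criterion.
flag_cerebellar = ratio_parietal_cerebellar < 0.60
flag_cortical = ratio_parietal_cortical < 0.90

print(ratio_parietal_cerebellar, flag_cerebellar)
print(round(ratio_parietal_cortical, 3), flag_cortical)
```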

  10. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    ERIC Educational Resources Information Center

    Walters, Charles David

    2017-01-01

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008)…

  11. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5 ). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.

  12. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). These phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy, at >90% in the G1 and G2 phases and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
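
    A minimal sketch of the analysis strategy described above, using scikit-learn: t-SNE to embed the multi-dimensional biophysical phenotypes and linear discriminant analysis as a simple stand-in for the MANOVA-based discriminant classification; the data are random placeholders, not measurements from the reported system.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 24))            # 24 biophysical phenotypes per cell
y = rng.integers(0, 3, size=3000)          # placeholder labels: 0=G1, 1=S, 2=G2

# Two-dimensional embedding for visualizing cell-cycle progression.
embedding = TSNE(n_components=2, perplexity=30).fit_transform(X)

# Supervised prediction of cell-cycle phase from the phenotypes.
lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()
print(embedding.shape, round(accuracy, 3))
```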

  13. The Focinator v2-0 - Graphical Interface, Four Channels, Colocalization Analysis and Cell Phase Identification.

    PubMed

    Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena

    2017-07-01

    The quantitative analysis of foci plays an important role in various cell biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by the analysis of protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly due to a graphical interface and further features. Thus, we included an R-script-based mode for automated image opening, file naming, progress monitoring and error reporting. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and evaluation of protein-protein colocalization by comparison of up to three foci channels. This enables, for example, the quantification of foci in cells of a specific cell cycle phase.
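
    The core foci-counting step can be sketched with generic image-processing operations as below; this is not the Focinator's actual implementation, only an illustration of counting foci inside a nucleus mask, with hypothetical function and parameter names.

```python
import numpy as np
from skimage import filters, measure, morphology

def count_foci(nucleus_mask, foci_channel, min_area=4):
    """Count foci inside one nucleus by thresholding and labelling (sketch)."""
    signal = np.where(nucleus_mask, foci_channel, 0.0)
    # Otsu threshold computed only on pixels inside the nucleus.
    thresh = filters.threshold_otsu(signal[nucleus_mask])
    foci = signal > thresh
    foci = morphology.remove_small_objects(foci, min_size=min_area)
    return int(measure.label(foci).max())
```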

  14. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analysis of the variation and reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important for quickly assessing the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.

  15. Retrieval of complex χ(2) parts for quantitative analysis of sum-frequency generation intensity spectra

    PubMed Central

    Hofmann, Matthias J.; Koelsch, Patrick

    2015-01-01

    Vibrational sum-frequency generation (SFG) spectroscopy has become an established technique for in situ surface analysis. While spectral recording procedures and hardware have been optimized, unique data analysis routines have yet to be established. The SFG intensity is related to probing geometries and properties of the system under investigation, such as the absolute square of the second-order susceptibility, |χ(2)|². A conventional SFG intensity measurement does not grant access to the complex parts of χ(2) unless further assumptions have been made. It is therefore difficult, sometimes impossible, to establish a unique fitting solution for SFG intensity spectra. Recently, interferometric phase-sensitive SFG or heterodyne detection methods have been introduced to measure real and imaginary parts of χ(2) experimentally. Here, we demonstrate that iterative phase-matching between complex spectra retrieved from maximum entropy method analysis and fitting of intensity SFG spectra (iMEMfit) leads to a unique solution for the complex parts of χ(2) and enables quantitative analysis of SFG intensity spectra. A comparison between complex parts retrieved by iMEMfit applied to intensity spectra and phase-sensitive experimental data shows excellent agreement between the two methods. PMID:26450297
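
    For context, the measured intensity is commonly written in terms of an effective second-order susceptibility with a non-resonant term plus Lorentzian resonances; this is the standard textbook parameterization, not necessarily the exact model fitted by iMEMfit:

```latex
I_{\mathrm{SFG}} \propto \left|\chi^{(2)}_{\mathrm{eff}}\right|^{2} I_{\mathrm{vis}}\, I_{\mathrm{IR}},
\qquad
\chi^{(2)}_{\mathrm{eff}} = \chi^{(2)}_{\mathrm{NR}} + \sum_{q} \frac{A_{q}}{\omega_{\mathrm{IR}} - \omega_{q} + i\,\Gamma_{q}}
```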

  16. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
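
    The noise-power-spectrum portion of such a framework can be sketched with a plain DFT-based estimator, as below; the scaling follows the usual NPS definition, but the detrending and binning details are simplified placeholders rather than the framework's exact implementation.

```python
import numpy as np

def nps_2d(rois, pixel_size_mm):
    """2-D NPS from an ensemble of uniform-region ROIs (simplified sketch)."""
    rois = np.asarray(rois, dtype=float)                       # (n_roi, ny, nx)
    n_roi, ny, nx = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)   # mean removal only
    dft2 = np.abs(np.fft.fft2(detrended)) ** 2
    return (pixel_size_mm ** 2) / (nx * ny) * dft2.mean(axis=0)

def radial_profile(nps, pixel_size_mm, n_bins=30):
    """Condense a 2-D NPS to a radially averaged 1-D profile."""
    ny, nx = nps.shape
    fy = np.fft.fftfreq(ny, d=pixel_size_mm)
    fx = np.fft.fftfreq(nx, d=pixel_size_mm)
    fr = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    edges = np.linspace(0.0, fr.max(), n_bins + 1)
    idx = np.digitize(fr.ravel(), edges[1:-1])                 # bins 0..n_bins-1
    flat = nps.ravel()
    return edges, np.array([flat[idx == i].mean() if np.any(idx == i) else 0.0
                            for i in range(n_bins)])
```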

  17. Highly sensitive and quantitative detection of rare pathogens through agarose droplet microfluidic emulsion PCR at the single-cell level.

    PubMed

    Zhu, Zhi; Zhang, Wenhua; Leng, Xuefei; Zhang, Mingxia; Guan, Zhichao; Lu, Jiangquan; Yang, Chaoyong James

    2012-10-21

    Genetic alterations can serve as highly specific biomarkers to distinguish fatal bacteria or cancer cells from their normal counterparts. However, these mutations normally exist in very small amounts in the presence of a large excess of non-mutated analogs. Taking the notorious pathogen E. coli O157:H7 as the target analyte, we have developed an agarose droplet-based microfluidic ePCR method for highly sensitive, specific and quantitative detection of rare pathogens in a high background of normal bacteria. Massively parallel singleplex and multiplex PCR at the single-cell level in agarose droplets have been successfully established. Moreover, we challenged the system with rare pathogen detection and realized the sensitive and quantitative analysis of a single E. coli O157:H7 cell in a high background of 100,000 excess normal K12 cells. For the first time, we demonstrated rare pathogen detection through agarose droplet microfluidic ePCR. Such a multiplex single-cell agarose droplet amplification method enables ultra-high-throughput and multi-parameter genetic analysis of large populations of cells at the single-cell level to uncover the stochastic variations in biological systems.

  18. High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Human Plasma.

    PubMed

    Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan

    2016-07-01

    We present a high-throughput, nontargeted lipidomics approach using liquid chromatography coupled to high-resolution mass spectrometry for quantitative analysis of nonesterified fatty acids. We applied this method to screen a wide range of fatty acids from medium-chain to very long-chain (8 to 24 carbon atoms) in human plasma samples. The method enables us to chromatographically separate branched-chain species from their straight-chain isomers as well as separate biologically important ω-3 and ω-6 polyunsaturated fatty acids. We used 51 fatty acid species to demonstrate the quantitative capability of this method with quantification limits in the nanomolar range; however, this method is not limited only to these fatty acid species. High-throughput sample preparation was developed and carried out on a robotic platform that allows extraction of 96 samples simultaneously within 3 h. This high-throughput platform was used to assess the influence of different types of human plasma collection and preparation on the nonesterified fatty acid profile of healthy donors. Use of the anticoagulants EDTA and heparin has been compared with simple clotting, and only limited changes have been detected in most nonesterified fatty acid concentrations.

  19. Major advances in testing of dairy products: milk component and dairy product attribute testing.

    PubMed

    Barbano, D M; Lynch, J M

    2006-04-01

    Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.

  20. Quantitative analysis of phenolic metabolites from different parts of Angelica keiskei by HPLC-ESI MS/MS and their xanthine oxidase inhibition.

    PubMed

    Kim, Dae Wook; Curtis-Long, Marcus J; Yuk, Heung Joo; Wang, Yan; Song, Yeong Hun; Jeong, Seong Hun; Park, Ki Hun

    2014-06-15

    Angelica keiskei is used as a popular functional foodstuff. However, quantitative analysis of this plant's metabolites has not yet been reported. The principal phenolic compounds (1-16) within A. keiskei were isolated, enabling us to quantify the metabolites within different parts of the plant. The specific quantification of metabolites (1-16) was accomplished by multiple reaction monitoring (MRM) using a quadrupole tandem mass spectrometer. The limit of detection and limit of quantitation were calculated as 0.4-44 μg/kg and 1.5-148 μg/kg, respectively. The abundance and composition of these metabolites varied significantly across different parts of the plant. For example, the abundance of chalcones (12-16) decreased as follows: root bark (10.51 mg/g)>stems (8.52 mg/g)>leaves (2.63 mg/g)>root cores (1.44 mg/g). The chalcones were found to be responsible for the xanthine oxidase (XO) inhibition shown by this plant. The most potent inhibitor, xanthoangelol, inhibited XO with an IC50 of 8.5 μM. Chalcones (12-16) exhibited mixed-type inhibition characteristics. Copyright © 2013 Elsevier Ltd. All rights reserved.
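
    For reference, a common calibration-based convention (e.g. ICH-style) for such detection and quantitation limits is shown below; the abstract does not state which specific convention the authors applied:

```latex
\mathrm{LOD} = \frac{3.3\,\sigma}{S},
\qquad
\mathrm{LOQ} = \frac{10\,\sigma}{S}
```

    where σ is the standard deviation of the response (of the blank or the calibration intercept) and S is the slope of the calibration curve.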

  1. Analysis of specific RNA in cultured cells through quantitative integration of q-PCR and N-SIM single cell FISH images: Application to hormonal stimulation of StAR transcription.

    PubMed

    Lee, Jinwoo; Foong, Yee Hoon; Musaitif, Ibrahim; Tong, Tiegang; Jefcoate, Colin

    2016-07-05

    The steroidogenic acute regulatory protein (StAR) has been proposed to serve as the switch that can turn on/off steroidogenesis. We investigated the events that facilitate dynamic StAR transcription in response to cAMP stimulation in MA-10 Leydig cells, focusing on splicing anomalies at StAR gene loci. We used 3' reverse primers in a single reaction to respectively quantify StAR primary (p-RNA), spliced (sp-RNA/mRNA), and extended 3' untranslated region (UTR) transcripts, which were quantitatively imaged by high-resolution fluorescence in situ hybridization (FISH). This approach delivers spatio-temporal resolution of initiation and splicing at single StAR loci, and transfers individual mRNA molecules to cytoplasmic sites. Gene expression was biphasic, initially showing slow splicing, transitioning to concerted splicing. The alternative 3.5-kb mRNAs were distinguished through the use of extended 3'UTR probes, which exhibited distinctive mitochondrial distribution. Combining quantitative PCR and FISH enables imaging of localization of RNA expression and analysis of RNA processing rates. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Analysis of transport eco-efficiency scenarios to support sustainability assessment: a study on Dhaka City, Bangladesh.

    PubMed

    Iqbal, Asif; Allan, Andrew; Afroze, Shirina

    2017-08-01

    The study assessed the level of efficiency (of both emissions and service quality) that can be achieved for the transport system in Dhaka City, Bangladesh. The assessment technique attempted to quantify the extent of eco-efficiency achievable through system modifications arising from planning or strategy. The eco-efficiency analysis was supported by detailed survey data on the Dhaka City transport system, collected over 9 months in 2012-2013. Line source modelling (CALINE4) was incorporated to estimate on-road emission concentrations. The eco-efficiency of the transport systems was assessed with the 'multi-criteria analysis' (MCA) technique, which enabled valuation of the systems' qualitative and quantitative parameters. According to the analysis, addressing driving indiscipline on the road can alone yield about a 47% reduction in emissions; driving indiscipline and the number of private vehicles were the important stressors restricting eco-efficiency in Dhaka City. Detailed analysis of the transport system, together with potential transport system scenarios, can offer policy makers a checklist for identifying the actions needed to provide greater service to city dwellers with lower emissions, which in turn can improve the sustainability of the system.

  3. Microwave-assisted deuterium exchange: the convenient preparation of isotopically labelled analogues for stable isotope dilution analysis of volatile wine phenols.

    PubMed

    Crump, Anna M; Sefton, Mark A; Wilkinson, Kerry L

    2014-11-01

    This study reports the convenient, low cost, one-step synthesis of labelled analogues of six volatile phenols, guaiacol, 4-methylguaiacol, 4-ethylguaiacol, 4-ethylphenol, eugenol and vanillin, using microwave-assisted deuterium exchange, for use as internal standards for stable isotope dilution analysis. The current method improves on previous strategies in that it enables incorporation of deuterium atoms on the aromatic ring, thereby ensuring retention of the isotope label during mass spectrometry fragmentation. When used as standards for SIDA, these labelled volatile phenols will improve the accuracy and reproducibility of quantitative food and beverage analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Conducting On-orbit Gene Expression Analysis on ISS: WetLab-2

    NASA Technical Reports Server (NTRS)

    Parra, Macarena; Almeida, Eduardo; Boone, Travis; Jung, Jimmy; Lera, Matthew P.; Ricco, Antonio; Souza, Kenneth; Wu, Diana; Richey, C. Scott

    2013-01-01

    WetLab-2 will enable expanded genomic research on orbit by developing tools that support in situ sample collection, processing, and analysis on ISS. This capability will reduce the time-to-results for investigators and define new pathways for discovery on the ISS National Lab. The primary objective is to develop a research platform on ISS that will facilitate real-time quantitative gene expression analysis of biological samples collected on orbit. WetLab-2 will be capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on orbit. WetLab-2 will significantly expand the analytical capabilities onboard ISS and enhance science return from ISS.

  5. A century of enzyme kinetic analysis, 1913 to 2013.

    PubMed

    Johnson, Kenneth A

    2013-09-02

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
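
    For reference, the Michaelis-Menten rate law and the corresponding substrate-depletion equation whose numerical integration underlies full-progress-curve fitting are:

```latex
v = \frac{V_{\max}\,[S]}{K_m + [S]},
\qquad
\frac{\mathrm{d}[S]}{\mathrm{d}t} = -\frac{V_{\max}\,[S]}{K_m + [S]}
```

    Fitting a progress curve then amounts to integrating the right-hand equation numerically and adjusting V_max and K_m (and, where relevant, additional rate constants) to match the observed [S](t) or [P](t).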

  6. Using Fault Trees to Advance Understanding of Diagnostic Errors.

    PubMed

    Rogith, Deevakar; Iyengar, M Sriram; Singh, Hardeep

    2017-11-01

    Diagnostic errors annually affect at least 5% of adults in the outpatient setting in the United States. Formal analytic techniques are only infrequently used to understand them, in part because of the complexity of diagnostic processes and clinical work flows involved. In this article, diagnostic errors were modeled using fault tree analysis (FTA), a form of root cause analysis that has been successfully used in other high-complexity, high-risk contexts. How factors contributing to diagnostic errors can be systematically modeled by FTA to inform error understanding and error prevention is demonstrated. A team of three experts reviewed 10 published cases of diagnostic error and constructed fault trees. The fault trees were modeled according to currently available conceptual frameworks characterizing diagnostic error. The 10 trees were then synthesized into a single fault tree to identify common contributing factors and pathways leading to diagnostic error. FTA is a visual, structured, deductive approach that depicts the temporal sequence of events and their interactions in a formal logical hierarchy. The visual FTA enables easier understanding of causative processes and cognitive and system factors, as well as rapid identification of common pathways and interactions in a unified fashion. In addition, it enables calculation of empirical estimates for causative pathways. Thus, fault trees might provide a useful framework for both quantitative and qualitative analysis of diagnostic errors. Future directions include establishing validity and reliability by modeling a wider range of error cases, conducting quantitative evaluations, and undertaking deeper exploration of other FTA capabilities. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
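
    The quantitative side of FTA mentioned above can be sketched with the usual gate algebra for independent basic events; the event names and probabilities below are hypothetical, not taken from the ten published cases.

```python
from functools import reduce

def p_and(*probs):
    """AND gate: all inputs must occur (independent events)."""
    return reduce(lambda a, b: a * b, probs, 1.0)

def p_or(*probs):
    """OR gate: at least one input occurs (independent events)."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Hypothetical basic-event probabilities.
p_history_gap = 0.10          # key history element not elicited
p_test_not_ordered = 0.05     # indicated test not ordered
p_result_not_followed = 0.02  # abnormal result not followed up

# Example top event: history gap AND (either testing failure).
p_missed_diagnosis = p_and(p_history_gap,
                           p_or(p_test_not_ordered, p_result_not_followed))
print(round(p_missed_diagnosis, 4))
```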

  7. An integrated enhancement and reconstruction strategy for the quantitative extraction of actin stress fibers from fluorescence micrographs.

    PubMed

    Zhang, Zhen; Xia, Shumin; Kanchanawong, Pakorn

    2017-05-22

    Stress fibers are prominent organizations of actin filaments that perform important functions in cellular processes such as migration, polarization, and traction force generation, and whose collective organization reflects the physiological and mechanical activities of the cells. Easily visualized by fluorescence microscopy, stress fibers are widely used as qualitative descriptors of cell phenotypes. However, due to the complexity of the stress fibers and the presence of other actin-containing cellular features, images of stress fibers are relatively challenging to analyze quantitatively using previously developed approaches, which require significant user intervention. This poses a challenge for the automation of their detection, segmentation, and quantitative analysis. Here we describe an open-source software package, SFEX (Stress Fiber Extractor), which is geared toward efficient enhancement, segmentation, and analysis of actin stress fibers in adherent tissue culture cells. Our method made use of a carefully chosen image filtering technique to enhance filamentous structures, effectively facilitating the detection and segmentation of stress fibers by binary thresholding. We subdivided the skeletons of stress fiber traces into piecewise-linear fragments, and used a set of geometric criteria to reconstruct the stress fiber networks by pairing appropriate fiber fragments. Our strategy enables the trajectories of a majority of stress fibers within the cells to be comprehensively extracted. We also present a method for quantifying the dimensions of the stress fibers using an image gradient-based approach. We determine the optimal parameter space using sensitivity analysis, and demonstrate the utility of our approach by analyzing actin stress fibers in cells cultured on various micropattern substrates. We present an open-source, graphically interfaced computational tool for the extraction and quantification of stress fibers in adherent cells with minimal user input. This facilitates the automated extraction of actin stress fibers from fluorescence images. We highlight its potential uses by analyzing images of cells with shapes constrained by fibronectin micropatterns. The method reported here could serve as the first step in the detection and characterization of the spatial properties of actin stress fibers to enable further detailed morphological analysis.
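
    The enhance-threshold-skeletonize idea can be sketched as below; the Frangi ridge filter is used here as a generic stand-in for the filtering step chosen in SFEX, and the parameters are illustrative.

```python
from skimage import filters, morphology

def fiber_skeleton(actin_image, min_size=50):
    """Enhance filamentous structures and return a binary skeleton (sketch)."""
    enhanced = filters.frangi(actin_image, sigmas=range(1, 4), black_ridges=False)
    binary = enhanced > filters.threshold_otsu(enhanced)
    binary = morphology.remove_small_objects(binary, min_size=min_size)
    return morphology.skeletonize(binary)
```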

  8. Software For Design Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1991-01-01

    Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.

  9. Systems cell biology.

    PubMed

    Mast, Fred D; Ratushny, Alexander V; Aitchison, John D

    2014-09-15

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.

  10. Methods and new approaches to the calculation of physiological parameters by videodensitometry

    NASA Technical Reports Server (NTRS)

    Kedem, D.; Londstrom, D. P.; Rhea, T. C., Jr.; Nelson, J. H.; Price, R. R.; Smith, C. W.; Graham, T. P., Jr.; Brill, A. B.; Kedem, D.

    1976-01-01

    A complex system featuring a video camera connected to a video disk, a cine (medical motion picture) camera and a PDP-9 computer with various input/output facilities has been developed. This system enables quantitative analysis of various functions recorded in clinical studies. Several studies are described, such as heart chamber volume calculations, left ventricular ejection fraction, blood flow through the lungs, and the possibility of obtaining information about blood flow and constrictions in small cross-section vessels.
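
    For reference, the left-ventricular ejection fraction derived from such volume curves follows the standard definition (the abstract does not detail the exact computation):

```latex
\mathrm{EF} = \frac{\mathrm{EDV} - \mathrm{ESV}}{\mathrm{EDV}} \times 100\%
```

    with EDV and ESV the end-diastolic and end-systolic volumes obtained from the densitometric volume curve.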

  11. Diagnostics based on nucleic acid sequence variant profiling: PCR, hybridization, and NGS approaches.

    PubMed

    Khodakov, Dmitriy; Wang, Chunyan; Zhang, David Yu

    2016-10-01

    Nucleic acid sequence variations have been implicated in many diseases, and reliable detection and quantitation of DNA/RNA biomarkers can inform effective therapeutic action, enabling precision medicine. Nucleic acid analysis technologies being translated into the clinic can broadly be classified into hybridization, PCR, and sequencing, as well as their combinations. Here we review the molecular mechanisms of popular commercial assays, and their progress in translation into in vitro diagnostics. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Psycho-Motor and Error Enabled Simulations Modeling Vulnerable Skills in the Pre-Mastery Phase - Medical Practice Initiative Procedural Skill Decay and Maintenance (MPI-PSD)

    DTIC Science & Technology

    2015-04-01

    Report excerpt (annual report, University of Wisconsin System, Madison, WI, prepared for the U.S. Army Medical Research and Materiel Command, Fort Detrick, MD): ...and execution of Performance Review Tool; organization, coding, and transcribing of collected data; analysis of qualitative survey and quantitative...

  13. Corrosion Control through a Better Understanding of the Metallic Substrate/Organic Coating/Interface.

    DTIC Science & Technology

    1982-12-01

    run to run. A Karl Fischer automatic titrimeter has been ordered to enable routine analysis of water in both the inlet and exit streams to determine...Block-Styrene)," M.S. Thesis, Chemical Engineering, June 1982, by D. E. Zurawski. "Electron Optical Methods and the Study of Corrosion," M.S. Thesis...interface as viewed through a thin transparent metal deposited onto glass. The latter method will permit quantitative studies of the corrosion and

  14. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  15. Surface plasmon resonance microscopy: achieving a quantitative optical response

    PubMed Central

    Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.

    2016-01-01

    Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542

  16. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    PubMed

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
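
    One possible reading of steps 4-6 is a weighted roll-up of the 0-4 study scores into LoE-level and overall strengths of evidence, sketched below; the LoE names, weights and scores are hypothetical and the averaging scheme is illustrative rather than the framework's prescribed arithmetic.

```python
# name: (LoE weight, [(study quality 0-4, data relevance 0-4), ...])
lines_of_evidence = {
    "in vitro 'omics": (0.3, [(3, 4), (2, 3)]),
    "in vivo apical":  (0.5, [(4, 4), (3, 2), (4, 3)]),
    "epidemiological": (0.2, [(2, 2)]),
}

def loe_strength(studies):
    # Strength of one LoE: mean of quality x relevance, rescaled to 0-1.
    return sum(q * r for q, r in studies) / (16.0 * len(studies))

overall = sum(weight * loe_strength(studies)
              for weight, studies in lines_of_evidence.values())
print(f"overall strength of evidence: {overall:.2f}")
```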

  17. Robust biological parametric mapping: an improved technique for multimodal brain image analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.

    2011-03-01

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
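
    The robust-regression idea can be illustrated with an ordinary Huber M-estimator on toy data, as below; this uses statsmodels' generic RLM rather than the authors' neuroimaging implementation, and all variable names and values are placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
structural = rng.normal(size=200)                       # e.g. a structural metric
functional = 0.5 * structural + rng.normal(scale=0.3, size=200)
functional[:5] += 5.0                                   # a few gross outliers

X = sm.add_constant(structural)
ols = sm.OLS(functional, X).fit()
rlm = sm.RLM(functional, X, M=sm.robust.norms.HuberT()).fit()

print("OLS slope:  ", round(ols.params[1], 3))          # pulled by the outliers
print("Huber slope:", round(rlm.params[1], 3))          # close to the true 0.5
```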

  18. Rayleigh imaging in spectral mammography

    NASA Astrophysics Data System (ADS)

    Berggren, Karl; Danielsson, Mats; Fredenberg, Erik

    2016-03-01

    Spectral imaging is the acquisition of multiple images of an object at different energy spectra. In mammography, dual-energy imaging (spectral imaging with two energy levels) has been investigated for several applications, in particular material decomposition, which allows for quantitative analysis of breast composition and quantitative contrast-enhanced imaging. Material decomposition with dual-energy imaging is based on the assumption that there are two dominant photon interaction effects that determine linear attenuation: the photoelectric effect and Compton scattering. This assumption limits the number of basis materials, i.e. the number of materials that are possible to differentiate between, to two. However, Rayleigh scattering may account for more than 10% of the linear attenuation in the mammography energy range. In this work, we show that a modified version of a scanning multi-slit spectral photon-counting mammography system is able to acquire three images at different spectra and can be used for triple-energy imaging. We further show that triple-energy imaging in combination with the efficient scatter rejection of the system enables measurement of Rayleigh scattering, which adds an additional energy dependency to the linear attenuation and enables material decomposition with three basis materials. Three available basis materials have the potential to improve virtually all applications of spectral imaging.
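
    Ignoring beam hardening and spectral averaging, three-material decomposition from three spectral measurements reduces to a 3x3 linear system, as sketched below; the attenuation values are placeholders, not calibrated mammographic data.

```python
import numpy as np

# Effective linear attenuation (1/cm) of three basis materials at three spectra.
mu = np.array([[0.80, 0.50, 0.95],   # low-energy bin
               [0.55, 0.40, 0.60],   # middle bin
               [0.45, 0.35, 0.40]])  # high-energy bin

t_true = np.array([2.0, 1.5, 0.3])   # basis-material thicknesses (cm)

# Measured log attenuation per spectrum: -ln(I/I0) = sum_m mu[k, m] * t[m]
log_att = mu @ t_true

# Decomposition: solve for the basis thicknesses.
t_est = np.linalg.solve(mu, log_att)
print(t_est)                          # recovers [2.0, 1.5, 0.3]
```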

  19. Data article on the effectiveness of entrepreneurship curriculum contents on entrepreneurial interest and knowledge of Nigerian university students.

    PubMed

    Olokundun, Maxwell; Iyiola, Oluwole; Ibidunni, Stephen; Ogbari, Mercy; Falola, Hezekiah; Salau, Odunayo; Peter, Fred; Borishade, Taiye

    2018-06-01

    The article presented data on the effectiveness of entrepreneurship curriculum contents on university students' entrepreneurial interest and knowledge. The study focused on the perceptions of Nigerian university students. Emphasis was laid on the first four universities in Nigeria to offer a degree programme in entrepreneurship. The study adopted a quantitative approach with a descriptive research design to establish trends related to the objective of the study. A survey was used as the quantitative research method. The population of this study included all students in the selected universities. Data were analyzed with the Statistical Package for the Social Sciences (SPSS). The mean score was used as the statistical tool of analysis. The field data set is made widely accessible to enable critical or more comprehensive investigation.

  20. Applications of mass spectrometry for quantitative protein analysis in formalin-fixed paraffin-embedded tissues

    PubMed Central

    Steiner, Carine; Ducret, Axel; Tille, Jean-Christophe; Thomas, Marlene; McKee, Thomas A; Rubbia-Brandt, Laura A; Scherl, Alexander; Lescuyer, Pierre; Cutler, Paul

    2014-01-01

    Proteomic analysis of tissues has advanced in recent years as instruments and methodologies have evolved. The ability to retrieve peptides from formalin-fixed paraffin-embedded tissues followed by shotgun or targeted proteomic analysis is offering new opportunities in biomedical research. In particular, access to large collections of clinically annotated samples should enable the detailed analysis of pathologically relevant tissues in a manner previously considered unfeasible. In this paper, we review the current status of proteomic analysis of formalin-fixed paraffin-embedded tissues with a particular focus on targeted approaches and the potential for this technique to be used in clinical research and clinical diagnosis. We also discuss the limitations and perspectives of the technique, particularly with regard to application in clinical diagnosis and drug discovery. PMID:24339433

  1. Semi-quantitative analysis of solid waste flows from nano-enabled consumer products in Europe, Denmark and the United Kingdom - Abundance, distribution and management.

    PubMed

    Heggelund, Laura; Hansen, Steffen Foss; Astrup, Thomas Fruergaard; Boldrin, Alessio

    2016-10-01

    Many nano-enabled consumer products are known to be in the global market. At the same time, little is known about the quantity, type, location, etc. of the engineered nanomaterials (ENMs) inside the products. This limits scientific investigation of the potential environmental effects of these materials, and knowledge of ENM behaviour and potential effects at the end-of-life stage of the products is especially scarce. To gain a better understanding of the end-of-life waste treatment of nano-enabled consumer products, we provide an overview of the ENMs flowing into and throughout waste systems in Europe, Denmark and the United Kingdom. Using a nanoproduct inventory (nanodb.dk), we performed a four-step analysis to estimate the most abundant ENMs and the waste fractions in which they are present. We found that, in terms of number of products: (i) nano-silver is the most used ENM in consumer products, and (ii) plastic from used product containers is the largest waste fraction, also comprising a large variety of ENMs, though possibly in very small masses. We also showed that the local waste management system can influence the distribution of ENMs. It is recommended that future research focus on the recycling and landfilling of nano-enabled products, since these compartments represent hot spots for end-of-life nanoproducts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Enabling comparative effectiveness research with informatics: show me the data!

    PubMed

    Safdar, Nabile M; Siegel, Eliot; Erickson, Bradley J; Nagy, Paul

    2011-09-01

    Both outcomes researchers and informaticians are concerned with information and data. As such, some of the central challenges to conducting successful comparative effectiveness research can be addressed with informatics solutions. Specific informatics solutions which address how data in comparative effectiveness research are enriched, stored, shared, and analyzed are reviewed. Imaging data can be made more quantitative, uniform, and structured for researchers through the use of lexicons and structured reporting. Secure and scalable storage of research data is enabled through data warehouses and cloud services. There are a number of national efforts to help researchers share research data and analysis tools. There is a diverse arsenal of informatics tools designed to meet the needs of comparative effective researchers. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.

  3. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  4. An analysis toolbox to explore mesenchymal migration heterogeneity reveals adaptive switching between distinct modes

    PubMed Central

    Shafqat-Abbasi, Hamdah; Kowalewski, Jacob M; Kiss, Alexa; Gong, Xiaowei; Hernandez-Varas, Pablo; Berge, Ulrich; Jafari-Mamaghani, Mehrdad; Lock, John G; Strömblad, Staffan

    2016-01-01

    Mesenchymal (lamellipodial) migration is heterogeneous, although whether this reflects progressive variability or discrete, 'switchable' migration modalities, remains unclear. We present an analytical toolbox, based on quantitative single-cell imaging data, to interrogate this heterogeneity. Integrating supervised behavioral classification with multivariate analyses of cell motion, membrane dynamics, cell-matrix adhesion status and F-actin organization, this toolbox here enables the detection and characterization of two quantitatively distinct mesenchymal migration modes, termed 'Continuous' and 'Discontinuous'. Quantitative mode comparisons reveal differences in cell motion, spatiotemporal coordination of membrane protrusion/retraction, and how cells within each mode reorganize with changed cell speed. These modes thus represent distinctive migratory strategies. Additional analyses illuminate the macromolecular- and cellular-scale effects of molecular targeting (fibronectin, talin, ROCK), including 'adaptive switching' between Continuous (favored at high adhesion/full contraction) and Discontinuous (low adhesion/inhibited contraction) modes. Overall, this analytical toolbox now facilitates the exploration of both spontaneous and adaptive heterogeneity in mesenchymal migration. DOI: http://dx.doi.org/10.7554/eLife.11384.001 PMID:26821527

  5. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  6. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
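
    The core of such a benchmark is straightforward: each species in a hybrid proteome sample is spiked at a known ratio, so a tool's accuracy and precision can be read off the observed protein ratios. The minimal Python sketch below illustrates only that calculation; it is not the LFQbench R package, and the intensities, species and expected ratios are hypothetical.

```python
# Minimal sketch (not the LFQbench R package) of the idea behind benchmarking
# label-free quantification with a hybrid proteome sample: each species is
# spiked at a known ratio, so accuracy and precision follow from the observed
# protein intensity ratios. All values below are illustrative assumptions.
import numpy as np

# Hypothetical observed intensities per protein in two conditions (A, B),
# grouped by the organism the protein comes from.
observed = {
    "human": (np.array([1e6, 2e5, 5e4]), np.array([1.0e6, 2.1e5, 4.8e4])),  # expected 1:1
    "yeast": (np.array([4e5, 8e4, 3e4]), np.array([2.1e5, 3.9e4, 1.6e4])),  # expected 2:1
    "ecoli": (np.array([9e4, 5e4, 2e4]), np.array([3.5e5, 2.1e5, 7.9e4])),  # expected 1:4
}
expected_log2 = {"human": 0.0, "yeast": 1.0, "ecoli": -2.0}

for species, (a, b) in observed.items():
    log2_ratios = np.log2(a / b)
    accuracy = np.median(log2_ratios) - expected_log2[species]   # systematic deviation
    precision = np.std(log2_ratios, ddof=1)                      # spread of the ratios
    print(f"{species}: median log2 error = {accuracy:+.2f}, SD = {precision:.2f}")
```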

  7. Global, quantitative and dynamic mapping of protein subcellular localization

    PubMed Central

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH

    2016-01-01

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775

  8. Large field of view quantitative phase imaging of induced pluripotent stem cells and optical pathlength reference materials

    NASA Astrophysics Data System (ADS)

    Kwee, Edward; Peterson, Alexander; Stinson, Jeffrey; Halter, Michael; Yu, Liya; Majurski, Michael; Chalfoun, Joe; Bajcsy, Peter; Elliott, John

    2018-02-01

    Induced pluripotent stem cells (iPSCs) are reprogrammed cells that can have heterogeneous biological potential. Quality assurance metrics of reprogrammed iPSCs will be critical to ensure reliable use in cell therapies and personalized diagnostic tests. We present a quantitative phase imaging (QPI) workflow which includes acquisition, processing, and stitching multiple adjacent image tiles across a large field of view (LFOV) of a culture vessel. Low magnification image tiles (10x) were acquired with a Phasics SID4BIO camera on a Zeiss microscope. iPSC cultures were maintained using a custom stage incubator on an automated stage. We implement an image acquisition strategy that compensates for non-flat illumination wavefronts to enable imaging of an entire well plate, including the meniscus region normally obscured in Zernike phase contrast imaging. Polynomial fitting and background mode correction were implemented to enable comparability and stitching between multiple tiles. LFOV imaging of reference materials indicated that image acquisition and processing strategies did not affect quantitative phase measurements across the LFOV. Analysis of iPSC colony images demonstrated that mass doubling time was significantly different from area doubling time. These measurements were benchmarked with prototype microsphere beads and etched-glass gratings with specified spatial dimensions designed to be QPI reference materials with optical pathlength shifts suitable for cell microscopy. This QPI workflow and the use of reference materials can provide a non-destructive, traceable imaging method for novel characterization of iPSC heterogeneity.
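
    The polynomial background correction mentioned above can be illustrated with a short sketch: a low-order polynomial surface is fit to each phase tile and subtracted before stitching. This is an illustrative reconstruction, not the authors' implementation; the tile, order and function names are assumptions.

```python
# Illustrative sketch (not the authors' code) of low-order polynomial background
# removal from a single quantitative-phase image tile prior to stitching.
import numpy as np

def remove_polynomial_background(phase_tile, order=2):
    """Fit a 2D polynomial surface to the tile and subtract it."""
    h, w = phase_tile.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x = xx.ravel() / w          # normalize coordinates for numerical stability
    y = yy.ravel() / h
    z = phase_tile.ravel()

    # Design matrix with all monomials x**i * y**j for i + j <= order.
    terms = [x**i * y**j for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.vstack(terms).T
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    background = (A @ coeffs).reshape(h, w)
    return phase_tile - background

# Example on a synthetic tile: a tilted background plus a small "cell".
tile = np.fromfunction(lambda i, j: 0.01 * i + 0.02 * j, (256, 256))
tile[100:120, 100:120] += 0.5
flattened = remove_polynomial_background(tile, order=2)
print(flattened[110, 110] - flattened[10, 10])  # cell signal preserved, tilt removed
```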

  9. Reliable gene expression analysis by reverse transcription-quantitative PCR: reporting and minimizing the uncertainty in data accuracy.

    PubMed

    Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann

    2014-10-01

    Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
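
    The reference-gene normalization discussed above is commonly computed by converting Cq values to relative quantities and dividing the target gene by the geometric mean of several reference genes. The sketch below shows that calculation only; the Cq values and gene choices are hypothetical, and it does not reproduce the uncertainty-reporting method proposed in the commentary.

```python
# Minimal sketch of reference-gene normalization in RT-qPCR: relative quantities
# are computed as 2**(-Cq) and the target is divided by the geometric mean of
# several reference genes. The Cq values below are hypothetical.
import numpy as np

def relative_expression(cq_target, cq_refs):
    """Normalize a target gene to the geometric mean of reference genes."""
    rq_target = 2.0 ** (-np.asarray(cq_target, dtype=float))
    rq_refs = 2.0 ** (-np.asarray(cq_refs, dtype=float))        # shape: (n_refs, n_samples)
    norm_factor = np.exp(np.mean(np.log(rq_refs), axis=0))      # geometric mean per sample
    return rq_target / norm_factor

cq_target = [24.1, 23.2]                 # control, treated
cq_refs = [[18.0, 18.4],                 # reference gene 1
           [20.5, 20.9]]                 # reference gene 2
expr = relative_expression(cq_target, cq_refs)
print(f"fold change (treated / control): {expr[1] / expr[0]:.2f}")
```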

  10. Large-Scale Interlaboratory Study to Develop, Analytically Validate and Apply Highly Multiplexed, Quantitative Peptide Assays to Measure Cancer-Relevant Proteins in Plasma*

    PubMed Central

    Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.

    2015-01-01

    There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible, and quantitative measurements of proteins and peptides in complex biological matrices such as plasma. PMID:25693799

  11. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  12. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using micro-particles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly are emerging. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537

  13. Improved hydrophilic interaction chromatography LC/MS of heparinoids using a chip with postcolumn makeup flow.

    PubMed

    Staples, Gregory O; Naimy, Hicham; Yin, Hongfeng; Kileen, Kevin; Kraiczek, Karsten; Costello, Catherine E; Zaia, Joseph

    2010-01-15

    Heparan sulfate (HS) and heparin are linear, heterogeneous carbohydrates of the glycosaminoglycan (GAG) family that are modified by N-acetylation, N-sulfation, O-sulfation, and uronic acid epimerization. HS interacts with growth factors in the extracellular matrix, thereby modulating signaling pathways that govern cell growth, development, differentiation, proliferation, and adhesion. High-performance liquid chromatography (HPLC)-chip-based hydrophilic interaction liquid chromatography/mass spectrometry has emerged as a method for analyzing the domain structure of GAGs. However, analysis of highly sulfated GAG structures of decasaccharide size or larger has been limited by spray instability in the negative-ion mode. This report demonstrates that addition of postcolumn makeup flow to the amide-HPLC-chip configuration permits robust and reproducible analysis of extended GAG domains (up to degree of polymerization 18) from HS and heparin. This platform provides quantitative information regarding the oligosaccharide profile, degree of sulfation, and nonreducing chain termini. It is expected that this technology will enable quantitative, comparative glycomics profiling of extended GAG oligosaccharide domains of functional interest.

  14. MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.

    PubMed

    Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J

    2015-10-15

    Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. Contact: shallam@mail.ubc.ca. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
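
    A normalized read-mapping measure like the one mentioned above typically scales raw read counts by gene length and sequencing depth so that annotations become comparable across samples. The sketch below uses an RPKM-style formula for illustration; it is not necessarily the exact measure implemented in MetaPathways, and the gene names, counts and lengths are hypothetical.

```python
# Illustrative sketch of a normalized read-mapping measure in the spirit of RPKM,
# used to make functional annotations comparable across samples. This is not
# claimed to be the exact formula in MetaPathways; all values are hypothetical.
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    """Reads per kilobase of gene per million mapped reads."""
    return read_count / (gene_length_bp / 1e3) / (total_mapped_reads / 1e6)

annotations = {                      # gene -> (mapped reads, length in bp)
    "amoA": (1500, 850),
    "nirS": (300, 1700),
}
total_reads = 2_500_000
for gene, (count, length) in annotations.items():
    print(f"{gene}: {rpkm(count, length, total_reads):.1f} RPKM")
```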

  15. Note: An automated image analysis method for high-throughput classification of surface-bound bacterial cell motions.

    PubMed

    Shen, Simon; Syal, Karan; Tao, Nongjian; Wang, Shaopeng

    2015-12-01

    We present a Single-Cell Motion Characterization System (SiCMoCS) to automatically extract bacterial cell morphological features from microscope images and use those features to automatically classify cell motion for rod-shaped motile bacterial cells. In some imaging-based studies, bacterial cells need to be attached to the surface for time-lapse observation of cellular processes such as cell membrane-protein interactions and membrane elasticity. These studies often generate large volumes of images. Extracting accurate bacterial cell morphology features from these images is critical for quantitative assessment. Using SiCMoCS, we demonstrated simultaneous and automated motion tracking and classification of hundreds of individual cells in an image sequence of several hundred frames. This is a significant improvement over traditional manual and semi-automated approaches to segmenting bacterial cells based on empirical thresholds, and a first attempt to automatically classify bacterial motion types for motile rod-shaped bacterial cells, which enables rapid and quantitative analysis of various types of bacterial motion.

  16. Systems analysis of the single photon response in invertebrate photoreceptors.

    PubMed

    Pumir, Alain; Graves, Jennifer; Ranganathan, Rama; Shraiman, Boris I

    2008-07-29

    Photoreceptors of the Drosophila compound eye employ a G protein-mediated signaling pathway that transduces single photons into transient electrical responses called "quantum bumps" (QB). Although most of the molecular components of this pathway are already known, a system-level understanding of the mechanism of QB generation has remained elusive. Here, we present a quantitative model explaining how QBs emerge from the stochastic nonlinear dynamics of the signaling cascade. The model shows that the cascade acts as an "integrate and fire" device and explains how photoreceptors achieve reliable responses to light while keeping background low in the dark. The model predicts the nontrivial behavior of mutants that enhance or suppress signaling and explains the dependence on external calcium, which controls feedback regulation. The results provide insight into physiological questions such as single-photon response efficiency and the adaptation of the response to high incident-light levels. The system-level analysis enabled by modeling phototransduction provides a foundation for understanding G protein signaling pathways less amenable to quantitative approaches.
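
    The "integrate and fire" character of the cascade can be caricatured with a toy stochastic accumulator: activity driven by a photon integrates toward a threshold and produces an all-or-none "bump", while dark noise rarely crosses it. The sketch below is a deliberately simplified illustration under assumed parameters, not the published cascade model.

```python
# Toy stochastic "integrate and fire" accumulator: a single-photon input drives
# activity across a threshold (a "quantum bump"), while spontaneous noise alone
# rarely does. A caricature under assumed parameters, not the authors' model.
import random

def simulate(photon_step=None, n_steps=2000, dt=1e-3, threshold=1.0,
             drive=6.0, drive_duration=400, leak=2.0, noise=0.3, seed=0):
    """Integrate noisy, leaky activity; return the time of the first threshold crossing."""
    random.seed(seed)
    x = 0.0
    for step in range(n_steps):
        on = photon_step is not None and photon_step <= step < photon_step + drive_duration
        dx = (drive if on else 0.0) - leak * x
        x += dt * dx + noise * (dt ** 0.5) * random.gauss(0.0, 1.0)
        if x >= threshold:
            return step * dt
    return None                      # no bump within the simulated window

print("bump latency with a photon at t = 0.1 s:", simulate(photon_step=100))
print("threshold crossing in the dark:         ", simulate(photon_step=None))
```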

  17. Wetlab-2 - Quantitative PCR Tools for Spaceflight Studies of Gene Expression Aboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Schonfeld, Julie E.

    2015-01-01

    Wetlab-2 is a research platform for conducting real-time quantitative gene expression analysis aboard the International Space Station. The system enables spaceflight genomic studies involving a wide variety of biospecimen types in the unique microgravity environment of space. Currently, gene expression analyses of space-flown biospecimens must be conducted post flight, after living cultures or frozen or chemically fixed samples are returned to Earth from the space station. Post-flight analysis is limited for several reasons. First, changes in gene expression can be transient, changing over a timescale of minutes. The delay between sampling in orbit and analysis on Earth can range from days to months, and RNA may degrade during this period of time, even in fixed or frozen samples. Second, living organisms that return to Earth may quickly re-adapt to terrestrial conditions. Third, forces exerted on samples during reentry and return to Earth may affect results. Lastly, follow-up experiments designed in response to post-flight results must wait for a new flight opportunity to be tested.

  18. Astrobiological complexity with probabilistic cellular automata.

    PubMed

    Vukotić, Branislav; Ćirković, Milan M

    2012-08-01

    The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, yet it has been quantitatively modeled only rarely and in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of the large and ambiguous space of input parameters. We perform a simple clustering analysis of typical astrobiological histories with a "Copernican" choice of input parameters and discuss the relevant boundary conditions of practical importance for planning and guiding empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for the continuation of practical SETI searches.
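
    A probabilistic cellular automaton of this kind updates each site by sampling its next state from a row of an input probability matrix, optionally modulated by the states of neighboring sites. The toy sketch below shows the mechanics only; the three states, transition probabilities, grid size and neighbor rule are illustrative assumptions, not the paper's parameters.

```python
# Toy probabilistic cellular automaton: each site of a 1-D ring ("Galactic
# Habitable Zone") is in one of three states and transitions each step according
# to an input probability matrix, with a crude neighbour-coupling rule.
# All states, probabilities and sizes below are illustrative assumptions.
import random

STATES = ("lifeless", "simple life", "complex life")
P = [  # P[i][j]: probability of a site moving from state i to state j in one step
    [0.97, 0.03, 0.00],
    [0.02, 0.93, 0.05],
    [0.01, 0.04, 0.95],
]

def step(grid):
    n = len(grid)
    new = []
    for i, s in enumerate(grid):
        probs = list(P[s])
        # Crude neighbour coupling: life next door boosts seeding of a lifeless site.
        if s == 0 and (grid[i - 1] > 0 or grid[(i + 1) % n] > 0):
            probs = [0.90, 0.10, 0.00]
        r, acc, nxt = random.random(), 0.0, s
        for j, p in enumerate(probs):
            acc += p
            if r < acc:
                nxt = j
                break
        new.append(nxt)
    return new

random.seed(1)
grid = [0] * 500            # start with an entirely lifeless ring of sites
grid[250] = 1               # seed one site with simple life
for _ in range(1000):
    grid = step(grid)
print({name: grid.count(i) for i, name in enumerate(STATES)})
```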

  19. Birefringence measurement of retinal nerve fiber layer using polarization-sensitive spectral domain optical coherence tomography with Jones matrix based analysis

    NASA Astrophysics Data System (ADS)

    Yamanari, Masahiro; Miura, Masahiro; Makita, Shuichi; Yatagai, Toyohiko; Yasuno, Yoshiaki

    2007-02-01

    Birefringence of the retinal nerve fiber layer is measured by polarization-sensitive spectral domain optical coherence tomography using the B-scan-oriented polarization modulation method. Birefringence of the optical fiber and the cornea is compensated by Jones matrix based analysis. A three-dimensional phase retardation map around the optic nerve head and an en-face phase retardation map of the retinal nerve fiber layer are shown. Unlike scanning laser polarimetry, our system can measure the phase retardation quantitatively without using the bow-tie pattern of the birefringence in the macular region, which enables diagnosis of glaucoma even if the patients have macular disease.
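
    As a small illustration of the Jones-matrix formalism underlying such measurements: for a linear retarder, the phase retardation is the phase difference between the two eigenvalues of its (unitary) Jones matrix. The sketch below demonstrates that relationship on a synthetic matrix; it is not the authors' fiber/cornea compensation pipeline.

```python
# Toy example (not the authors' compensation pipeline) of recovering phase
# retardation from a Jones matrix: for a linear retarder, the retardance is the
# phase difference between the two eigenvalues of the unitary Jones matrix.
import numpy as np

def linear_retarder(delta, theta):
    """Jones matrix of a linear retarder with retardance delta, fast axis at theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])
    D = np.diag([np.exp(1j * delta / 2), np.exp(-1j * delta / 2)])
    return R.T @ D @ R                      # similarity transform: eigenvalues unchanged

def retardance(J):
    """Phase retardation extracted from the eigenvalue phases of J."""
    eigvals = np.linalg.eigvals(J)
    d = np.angle(eigvals[0]) - np.angle(eigvals[1])
    return np.abs((d + np.pi) % (2 * np.pi) - np.pi)   # wrap to [0, pi]

J = linear_retarder(delta=0.6, theta=np.deg2rad(30))
print(f"recovered retardance: {retardance(J):.3f} rad")  # ~0.600
```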

  20. 3-D interactive visualisation tools for HI spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.

  1. Photocleavable DNA barcode-antibody conjugates allow sensitive and multiplexed protein analysis in single cells.

    PubMed

    Agasti, Sarit S; Liong, Monty; Peterson, Vanessa M; Lee, Hakho; Weissleder, Ralph

    2012-11-14

    DNA barcoding is an attractive technology, as it allows sensitive and multiplexed target analysis. However, DNA barcoding of cellular proteins remains challenging, primarily because barcode amplification and readout techniques are often incompatible with the cellular microenvironment. Here we describe the development and validation of a photocleavable DNA barcode-antibody conjugate method for rapid, quantitative, and multiplexed detection of proteins in single live cells. Following target binding, this method allows DNA barcodes to be photoreleased in solution, enabling easy isolation, amplification, and readout. As a proof of principle, we demonstrate sensitive and multiplexed detection of protein biomarkers in a variety of cancer cells.

  2. Defining and Enabling Resiliency of Electric Distribution Systems With Multiple Microgrids

    DOE PAGES

    Chanda, Sayonsom; Srivastava, Anurag K.

    2016-05-02

    This paper presents a method for quantifying and enabling the resiliency of a power distribution system (PDS) using the analytical hierarchical process and percolation theory. Using this metric, quantitative analysis can be done to analyze the impact of possible control decisions and pro-actively enable the resilient operation of a distribution system with multiple microgrids and other resources. The developed resiliency metric can also be used in short-term distribution system planning. The ability to quantify resiliency can help distribution system planning engineers and operators justify control actions, compare different reconfiguration algorithms, and develop proactive control actions to avert power system outages due to impending catastrophic weather situations or other adverse events. Validation of the proposed method is done using modified CERTS microgrids and a modified industrial distribution system. Furthermore, simulation results show a topological metric and a composite metric that considers power system characteristics to quantify the resiliency of a distribution system with the proposed methodology, and demonstrate improvements in resiliency using a two-stage reconfiguration algorithm and multiple microgrids.
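
    The analytical hierarchical process step of such a metric is typically computed by taking criterion weights from the principal eigenvector of a pairwise-comparison matrix. The sketch below shows that standard AHP calculation only; the resiliency criteria and judgment values are hypothetical, not the paper's.

```python
# Illustrative sketch of the analytical hierarchy process (AHP) weighting step:
# criterion weights come from the principal eigenvector of a pairwise-comparison
# matrix. The criteria and judgment values below are hypothetical, not the
# paper's actual resiliency criteria.
import numpy as np

criteria = ["redundancy", "microgrid availability", "restoration time"]
# A[i][j] = how much more important criterion i is than criterion j (Saaty scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")

# Consistency index (random-index normalization to a full consistency ratio omitted).
n = len(criteria)
ci = (eigvals.real.max() - n) / (n - 1)
print(f"consistency index: {ci:.3f}")
```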

  3. Compartmentalized microchannel array for high-throughput analysis of single cell polarized growth and dynamics

    DOE PAGES

    Geng, Tao; Bredeweg, Erin L.; Szymanski, Craig J.; ...

    2015-11-04

    Interrogating polarized growth is technologically challenging due to extensive cellular branching and uncontrollable environmental conditions in conventional assays. Here we present a robust and high-performance microfluidic system that enables observations of polarized growth with enhanced temporal and spatial control over prolonged periods. The system has built-in tunability and versatility to accommodate a variety of science applications requiring precisely controlled environments. Using the model filamentous fungus, Neurospora crassa, this microfluidic system enabled direct visualization and analysis of cellular heterogeneity in a clonal fungal cell population, nuclear distribution and dynamics at the subhyphal level, and quantitative dynamics of gene expression with single hyphal compartment resolution in response to carbon source starvation and exchange experiments. Although the microfluidic device is demonstrated on filamentous fungi, our technology is immediately extensible to a wide array of other biosystems that exhibit similar polarized cell growth, with applications ranging from bioenergy production to human health.

  4. Microfluidic PDMS on paper (POP) devices.

    PubMed

    Shangguan, Jin-Wen; Liu, Yu; Pan, Jian-Bin; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2016-12-20

    In this paper, we propose a generalized concept of microfluidic polydimethylsiloxane (PDMS) on paper (POP) devices, which combines well the merits of paper chips and PDMS chips. First, we optimized the conditions for accurate PDMS spatial patterning on paper, based on screen printing and a high-temperature-enabled superfast curing technique, which enables PDMS patterning to an accuracy of tens of microns in less than ten seconds. This, in turn, makes the resulting paper layer available for seamless, reversible and reliable integration with other PDMS channel structures. The integrated POP devices allow for both porous paper and smooth channels to be spatially defined on the devices, greatly extending the flexibility for designers to be able to construct powerful functional structures. To demonstrate the versatility of this design, a prototype POP device for the colorimetric analysis of liver function markers, serum protein, alkaline phosphatase (ALP) and aspartate aminotransferase (AST), was constructed. On this POP device, quantitative sample loading, mixing and multiplex analysis have all been realized.

  5. Multiplexed mass cytometry profiling of cellular states perturbed by small-molecule regulators

    PubMed Central

    Bodenmiller, Bernd; Zunder, Eli R.; Finck, Rachel; Chen, Tiffany J.; Savig, Erica S.; Bruggner, Robert V.; Simonds, Erin F.; Bendall, Sean C.; Sachs, Karen; Krutzik, Peter O.; Nolan, Garry P.

    2013-01-01

    The ability to comprehensively explore the impact of bio-active molecules on human samples at the single-cell level can provide great insight for biomedical research. Mass cytometry enables quantitative single-cell analysis with deep dimensionality, but currently lacks high-throughput capability. Here we report a method termed mass-tag cellular barcoding (MCB) that increases mass cytometry throughput by sample multiplexing. 96-well format MCB was used to characterize human peripheral blood mononuclear cell (PBMC) signaling dynamics, cell-to-cell communication, the signaling variability between 8 donors, and to define the impact of 27 inhibitors on this system. For each compound, 14 phosphorylation sites were measured in 14 PBMC types, resulting in 18,816 quantified phosphorylation levels from each multiplexed sample. This high-dimensional systems-level inquiry allowed analysis across cell-type and signaling space, reclassified inhibitors, and revealed off-target effects. MCB enables high-content, high-throughput screening, with potential applications for drug discovery, pre-clinical testing, and mechanistic investigation of human disease. PMID:22902532

  6. Subject-specific longitudinal shape analysis by coupling spatiotemporal shape modeling with medial analysis

    NASA Astrophysics Data System (ADS)

    Hong, Sungmin; Fishbaugh, James; Rezanejad, Morteza; Siddiqi, Kaleem; Johnson, Hans; Paulsen, Jane; Kim, Eun Young; Gerig, Guido

    2017-02-01

    Modeling subject-specific shape change is one of the most important challenges in longitudinal shape analysis of disease progression. Whereas anatomical change over time can be a function of normal aging, anatomy can also be impacted by disease related degeneration. Anatomical shape change may also be affected by structural changes from neighboring shapes, which may cause non-linear variations in pose. In this paper, we propose a framework to analyze disease related shape changes by coupling extrinsic modeling of the ambient anatomical space via spatiotemporal deformations with intrinsic shape properties from medial surface analysis. We compare intrinsic shape properties of a subject-specific shape trajectory to a normative 4D shape atlas representing normal aging to isolate shape changes related to disease. The spatiotemporal shape modeling establishes inter/intra subject anatomical correspondence, which in turn enables comparisons between subjects and the 4D shape atlas, and also quantitative analysis of disease related shape change. The medial surface analysis captures intrinsic shape properties related to local patterns of deformation. The proposed framework jointly models extrinsic longitudinal shape changes in the ambient anatomical space, as well as intrinsic shape properties to give localized measurements of degeneration. Six high risk subjects and six controls are randomly sampled from a Huntington's disease image database for qualitative and quantitative comparison.

  7. Intramolecular carbon and nitrogen isotope analysis by quantitative dry fragmentation of the phenylurea herbicide isoproturon in a combined injector/capillary reactor prior to GC separation.

    PubMed

    Penning, Holger; Elsner, Martin

    2007-11-01

    Compound-specific isotope analysis can potentially provide unique information on the source and fate of pesticides in natural systems. Yet LC-based methods that rely on organic solvents often cannot be used for isotope analysis, and GC-based analysis is frequently not possible due to thermolability of the analyte. A typical example of a compound with such properties is isoproturon (3-(4-isopropylphenyl)-1,1-dimethylurea), belonging to the worldwide extensively used phenylurea herbicides. To make isoproturon accessible to carbon and nitrogen isotope analysis, we developed a GC-based method in which isoproturon was quantitatively fragmented to dimethylamine and 4-isopropylphenylisocyanate. Fragmentation occurred only partially in the injector but was mainly achieved on a heated capillary column. The fragments were then chromatographically separated and individually measured by isotope ratio mass spectrometry. The reliability of the method was tested in hydrolysis experiments with three isotopically different batches of isoproturon. For all three products, the same isotope fractionation factors were observed during conversion, and the difference in isotope composition between the batches was preserved. This study demonstrates that fragmentation of phenylurea herbicides not only makes them accessible to isotope analysis but even enables determination of intramolecular isotope fractionation.

  8. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermofisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing a HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.

  9. Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.

    PubMed

    Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y

    2015-06-01

    Quantitative susceptibility mapping overcomes several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping for discriminating between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping values and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in other brain regions did not differ significantly between patients and controls. For the discrimination of patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pair-wise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping showed higher diagnostic performance than R2* mapping for discriminating between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
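
    An optimal cutoff "based on the Youden index" is simply the ROC operating point that maximizes sensitivity + specificity - 1. The sketch below shows that selection on synthetic susceptibility values; the distributions are assumptions, not the study's patient data.

```python
# Sketch of selecting an ROC cutoff by the Youden index (J = sensitivity +
# specificity - 1). The susceptibility values below are synthetic, not the
# study's patient measurements. Requires scikit-learn.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
controls = rng.normal(0.15, 0.05, 21)    # hypothetical QSM values, controls
patients = rng.normal(0.25, 0.06, 21)    # hypothetical QSM values, Parkinson disease
y_true = np.r_[np.zeros(21), np.ones(21)]
scores = np.r_[controls, patients]

fpr, tpr, thresholds = roc_curve(y_true, scores)
youden = tpr - fpr                        # equals sensitivity + specificity - 1
best = np.argmax(youden)
print(f"AUC = {roc_auc_score(y_true, scores):.2f}")
print(f"optimal cutoff > {thresholds[best]:.3f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```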

  10. Visualization and quantitative analysis of extrachromosomal telomere-repeat DNA in individual human cells by Halo-FISH

    PubMed Central

    Komosa, Martin; Root, Heather; Meyn, M. Stephen

    2015-01-01

    Current methods for characterizing extrachromosomal nuclear DNA in mammalian cells do not permit single-cell analysis, are often semi-quantitative and are frequently biased toward the detection of circular species. To overcome these limitations, we developed Halo-FISH to visualize and quantitatively analyze extrachromosomal DNA in single cells. We demonstrate Halo-FISH by using it to analyze extrachromosomal telomere-repeat (ECTR) DNA in human cells that use the Alternative Lengthening of Telomeres (ALT) pathway(s) to maintain telomere lengths. We find that GM847 and VA13 ALT cells average ∼80 detectable G/C-strand ECTR DNA molecules/nucleus, while U2OS ALT cells average ∼18 molecules/nucleus. In comparison, human primary and telomerase-positive cells contain <5 ECTR DNA molecules/nucleus. ECTR DNA molecules in ALT cells exhibit striking cell-to-cell variations in number (<20 to >300), range widely in length (<1 to >200 kb) and are composed primarily of G- or C-strand telomere-repeat DNA. Halo-FISH enables, for the first time, the simultaneous analysis of ECTR DNA and chromosomal telomeres in a single cell. We find that ECTR DNA comprises ∼15% of telomere-repeat DNA in GM847 and VA13 cells, but <4% in U2OS cells. In addition to its use in ALT cell analysis, Halo-FISH can facilitate the study of a wide variety of extrachromosomal DNA in mammalian cells. PMID:25662602

  11. Advanced forensic validation for human spermatozoa identification using SPERM HY-LITER™ Express with quantitative image analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko

    2017-07-01

    Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful for detecting human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information, previously unobtainable, for practical applications of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.

  12. Unlocking the potential of publicly available microarray data using inSilicoDb and inSilicoMerging R/Bioconductor packages.

    PubMed

    Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann

    2012-12-24

    With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package, a set of five visual and six quantitative validation measures is available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].
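
    To make the batch-effect issue concrete, the sketch below applies a deliberately simplified per-gene, per-batch standardization before merging two synthetic studies. It is only a stand-in for illustration: the inSilicoMerging package wraps established batch-adjustment methods, and the data, sizes and function name here are assumptions.

```python
# Highly simplified stand-in for cross-study batch-effect removal (the
# inSilicoMerging package provides established methods; this sketch only
# standardizes each gene within each batch). All data are synthetic.
import numpy as np

def merge_with_batch_centering(batches):
    """Per-gene z-scoring within each batch, then concatenation of samples."""
    adjusted = []
    for expr in batches:                      # expr: genes x samples
        mean = expr.mean(axis=1, keepdims=True)
        std = expr.std(axis=1, keepdims=True) + 1e-9
        adjusted.append((expr - mean) / std)
    return np.concatenate(adjusted, axis=1)

rng = np.random.default_rng(42)
study_a = rng.normal(8.0, 1.0, size=(5, 10))   # 5 genes x 10 samples
study_b = rng.normal(11.0, 2.0, size=(5, 12))  # same genes, shifted platform
merged = merge_with_batch_centering([study_a, study_b])
print(merged.shape, merged.mean(axis=1).round(2))  # per-gene means ~0 after merging
```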

  13. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design made it possible to finely assess their performance in terms of sensitivity and false discovery rate, by counting true and false positives (UPS1 and yeast background proteins, respectively, found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performances of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performances of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, and also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. High-Resolution Enabled 12-Plex DiLeu Isobaric Tags for Quantitative Proteomics

    PubMed Central

    2015-01-01

    Multiplex isobaric tags (e.g., tandem mass tags (TMT) and isobaric tags for relative and absolute quantification (iTRAQ)) are a valuable tool for high-throughput mass spectrometry based quantitative proteomics. We have developed our own multiplex isobaric tags, DiLeu, that feature quantitative performance on par with commercial offerings but can be readily synthesized in-house as a cost-effective alternative. In this work, we achieve a 3-fold increase in the multiplexing capacity of the DiLeu reagent without increasing structural complexity by exploiting mass defects that arise from selective incorporation of 13C, 15N, and 2H stable isotopes in the reporter group. The inclusion of eight new reporter isotopologues that differ in mass from the existing four reporters by intervals of 6 mDa yields a 12-plex isobaric set that preserves the synthetic simplicity and quantitative performance of the original implementation. We show that the new reporter variants can be baseline-resolved in high-resolution higher-energy C-trap dissociation (HCD) spectra, and we demonstrate accurate 12-plex quantitation of a DiLeu-labeled Saccharomyces cerevisiae lysate digest via high-resolution nano liquid chromatography–tandem mass spectrometry (nanoLC–MS2) analysis on an Orbitrap Elite mass spectrometer. PMID:25405479

  15. Quantitative imaging biomarker ontology (QIBO) for knowledge representation of biomedical imaging biomarkers.

    PubMed

    Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David

    2013-08-01

    A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.

  16. Dynamic whole body PET parametric imaging: II. Task-oriented statistical estimation

    PubMed Central

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-01-01

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15–20cm) of a single bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study, was employed along with extensive Monte Carlo simulations and an initial clinical FDG patient dataset to validate and demonstrate the potential of the proposed statistical estimation methods. Both simulated and clinical results suggest that hybrid regression in the context of whole-body Patlak Ki imaging considerably reduces MSE without compromising high CNR. Alternatively, for a given CNR, hybrid regression enables larger reductions than OLS in the number of dynamic frames per bed, allowing for even shorter acquisitions of ~30min, thus further contributing to the clinical adoption of the proposed framework. Compared to the SUV approach, whole body parametric imaging can provide better tumor quantification, and can act as a complement to SUV, for the task of tumor detection. PMID:24080994

  17. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation.

    PubMed

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15-20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study, was employed along with extensive Monte Carlo simulations and an initial clinical (18)F-deoxyglucose patient dataset to validate and demonstrate the potential of the proposed statistical estimation methods. Both simulated and clinical results suggest that hybrid regression in the context of whole-body Patlak Ki imaging considerably reduces MSE without compromising high CNR. Alternatively, for a given CNR, hybrid regression enables larger reductions than OLS in the number of dynamic frames per bed, allowing for even shorter acquisitions of ~30 min, thus further contributing to the clinical adoption of the proposed framework. Compared to the SUV approach, whole-body parametric imaging can provide better tumor quantification, and can act as a complement to SUV, for the task of tumor detection.
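
    For orientation, the Patlak graphical analysis that underlies both the OLS and hybrid estimators regresses the tissue-to-plasma ratio against the running integral of the plasma input divided by its instantaneous value; the slope is the uptake rate Ki and the intercept the distribution volume V. The per-voxel sketch below demonstrates the plain OLS version on synthetic time-activity data; it is not the authors' hybrid, correlation-driven estimator.

```python
# Per-voxel Patlak ordinary-least-squares sketch: the tissue-to-plasma ratio is
# regressed against the "normalized time" integral of the plasma input, giving
# slope Ki (uptake rate) and intercept V (distribution volume). The plasma input
# and tissue curve below are synthetic, not clinical data.
import numpy as np

t = np.linspace(1, 60, 12)                          # frame mid-times (min)
cp = 100.0 * np.exp(-0.1 * t) + 5.0                 # synthetic plasma input function
int_cp = np.concatenate([[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))]) \
         + cp[0] * t[0]                             # crude running integral of Cp from 0

ki_true, v_true = 0.02, 0.6
ct = ki_true * int_cp + v_true * cp                 # tissue curve obeying the Patlak model

x = int_cp / cp                                     # Patlak "normalized time"
y = ct / cp
slope, intercept = np.polyfit(x, y, 1)              # OLS line fit
print(f"Ki = {slope:.4f} (true {ki_true}), V = {intercept:.3f} (true {v_true})")
```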

  18. Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D

    NASA Astrophysics Data System (ADS)

    Bales, Ben; Pollock, Tresa; Petzold, Linda

    2017-06-01

    Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and the subsequent analysis process. The downside is that computing micrograph segmentations from data on morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
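
    The histogram-of-oriented-gradients descriptor itself is readily computed with scikit-image, which makes it possible to compare micrograph patches without ever segmenting the precipitates. The sketch below illustrates that step on synthetic patches; it is not the authors' full analysis pipeline, and the patch contents and parameter choices are assumptions.

```python
# Minimal sketch (not the authors' pipeline) of computing histogram-of-oriented-
# gradients (HOG) features for micrograph patches, which can then be compared
# without segmentation. Requires scikit-image; the synthetic patches below stand
# in for real micrographs.
import numpy as np
from skimage.feature import hog

rng = np.random.default_rng(0)
patch_a = rng.random((128, 128))
patch_b = patch_a.T                       # transposed copy of the same texture

features_a = hog(patch_a, orientations=9, pixels_per_cell=(16, 16),
                 cells_per_block=(2, 2), feature_vector=True)
features_b = hog(patch_b, orientations=9, pixels_per_cell=(16, 16),
                 cells_per_block=(2, 2), feature_vector=True)

# Simple similarity between the two descriptors (cosine similarity).
cos = np.dot(features_a, features_b) / (np.linalg.norm(features_a) * np.linalg.norm(features_b))
print(f"descriptor length: {features_a.size}, cosine similarity: {cos:.3f}")
```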

  19. Chromatographic, Spectroscopic and Mass Spectrometric Approaches for Exploring the Habitability of Mars in 2012 and Beyond with the Curiosity Rover

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul

    2012-01-01

    The Sample Analysis at Mars (SAM) suite of instruments on the Curiosity Rover of the Mars Science Laboratory (MSL) Mission is designed to provide chemical and isotopic analysis of organic and inorganic volatiles for both atmospheric and solid samples. The goal of the science investigation enabled by the gas chromatograph mass spectrometer and tunable laser spectrometer instruments of SAM is to work together with the other MSL investigations to quantitatively assess habitability through a series of chemical and geological measurements. We describe the multi-column gas chromatograph system employed on SAM and the approach to extraction and analysis of organic compounds that might be preserved in ancient martian rocks.

  20. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.

  1. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient’s microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  2. [Skin biopsy in diagnosis of chronic graft-versus-host disease in patients after allogeneic haematopoietic stem cell transplantation: pathologist's point of view on quantitative scoring system].

    PubMed

    Grzanka, Dariusz; Styczyński, Jan; Debski, Robert; Krenska, Anna; Pacholska, Małgorzata; Prokurat, Andrzej I; Wysocki, Mariusz; Marszałek, Andrzej

    2008-01-01

    Pathology diagnosis of chronic graft-versus-host disease (GVHD) after allogeneic haematopoietic stem cell transplantation (allo-HSCT) is an important issue in clinical follow-up, in spite of frequent difficulties in interpretation related to dynamic changes occurring in the skin during the disease, as well as to sequelae of the underlying disease and immunosuppressive therapy. The recently presented NIH Consensus (National Institutes of Health, Bethesda, USA) on histopathologic (HP) analysis is still complex and internally divergent, and thus difficult to implement clinically. The aim was to analyse the clinical value of histological evaluation of skin biopsies in children after allo-HSCT and its correlation with clinical status. Ten skin biopsies were taken from 7 patients (4 boys, 3 girls, age 3-15 years) after allo-HSCT (6 MFD, 1 MMUD) and analyzed after hematoxylin/eosin and immunohistochemical (CD3, CD45T, CD20) staining. Pathology analysis was based on commonly accepted criteria enabling simple and unambiguous interpretation. Results were compared with clinical data and indications for immunosuppressive therapy. It was found that reliable and coherent interpretation can be made when the following parameters are taken into account: 1. in the epithelium: the presence of apoptosis, archetypical changes and vacuolar degeneration in the basal layer, and the presence of CD3/CD45 cells in the epidermis; 2. in the dermis: the extent of collagenization, the presence of melanophages and lymphocyte infiltrations; 3. in the eccrine gland epithelium: eccrine gland atrophy and the presence of lymphocytes. A new scoring system for skin biopsy analysis in patients with chronic GVHD, based on the modified NIH Consensus, is proposed, and the preliminary clinical value of the histological results was assessed. Skin biopsy evaluation based on limited qualitative and quantitative analysis of lymphocyte infiltrates, together with studies on the intensity of apoptosis, collagenization and archetypical changes, is a valuable diagnostic method complementary to clinical records, enabling easier therapeutic decisions.

  3. [Prognostic value of chosen parameters of mechanical and bioelectrical uterine activity in prediction of threatening preterm labour].

    PubMed

    Zietek, Jerzy; Sikora, Jerzy; Horoba, Krzysztof; Matonia, Adam; Jezewski, Janusz; Magnucki, Jacek; Kobielska, Lucyna

    2009-03-01

    To record and analyse the bioelectrical activity of the uterine muscle in the course of physiological pregnancy, labour and threatening premature labour, and to define which parameters from the analysis of both the electrohysterogram and the mechanical activity signal allow prediction of threatening premature labour. The material comprised 62 pregnant women: Group I, 27 patients in their first physiological pregnancy; Group II, 21 patients in their first pregnancy with symptoms of threatening premature labour; and Group III, 14 patients in the first stage of labour. The on-line analysis of the mechanical (TOCO) and electrical (EHG) contraction activity relied on determination of quantitative parameters of detected uterine contractions. The statistical results demonstrated that Groups I and II could be differentiated by the amplitude and contraction area of the EHG signal, but only by the contraction amplitude of the TOCO signal. Additional significant differentiating parameters for the electrohysterogram are contraction power and its median frequency. Comparing Groups I and III, significant differences were noted for contraction amplitude and area obtained from both the EHG and TOCO signals. Similarly, the contraction power (from EHG) enables contractions to be assigned either to Group I records or to labour records. No significant difference was noted between Groups II and III. Identification of pregnant women at risk of premature labour should lead to their inclusion in rigorous perinatal surveillance. This requires novel, more sensitive methods that are able to detect early symptoms of increased uterine contraction activity. Electrohysterography provides complete information on the bioelectrical activity of the uterus. Quantitative parameters of EHG analysis enable the detection of records (contractions) with symptoms of premature uterine contraction activity.

  4. Spatially-Resolved Proteomics: Rapid Quantitative Analysis of Laser Capture Microdissected Alveolar Tissue Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clair, Geremy; Piehowski, Paul D.; Nicola, Teodora

    Global proteomics approaches allow characterization of whole tissue lysates to an impressive depth. However, it is now increasingly recognized that to better understand the complexity of multicellular organisms, global protein profiling of specific spatially defined regions/substructures of tissues (i.e. spatially-resolved proteomics) is essential. Laser capture microdissection (LCM) enables microscopic isolation of defined regions of tissues, preserving crucial spatial information. However, current proteomics workflows entail several manual sample preparation steps and are challenged by the microscopic, mass-limited samples generated by LCM, which impacts measurement robustness, quantification, and throughput. Here, we coupled LCM with a fully automated sample preparation workflow that, with a single manual step, allows protein extraction, tryptic digestion, peptide cleanup and LC-MS/MS analysis of proteomes from microdissected tissues. Benchmarking against the current state of the art in ultrasensitive global proteomic analysis, our approach demonstrated significant improvements in quantification and throughput. Using our LCM-SNaPP proteomics approach, we characterized, to a depth of more than 3,400 proteins, the ontogeny of protein changes during normal lung development in laser capture microdissected alveolar tissue containing ~4,000 cells per sample. Importantly, the data revealed quantitative changes for 350 low-abundance transcription factors and signaling molecules, confirming earlier transcript-level observations and defining seven modules of coordinated transcription factor/signaling molecule expression patterns, suggesting that a complex network of temporal regulatory control directs normal lung development, with epigenetic regulation fine-tuning pre-natal developmental processes. Our LCM-proteomics approach facilitates efficient, spatially-resolved, ultrasensitive global proteomics analyses in high throughput that will be enabling for several clinical and biological applications.

  5. Quantitative Description of Crystal Nucleation and Growth from in Situ Liquid Scanning Transmission Electron Microscopy.

    PubMed

    Ievlev, Anton V; Jesse, Stephen; Cochell, Thomas J; Unocic, Raymond R; Protopopescu, Vladimir A; Kalinin, Sergei V

    2015-12-22

    Recent advances in liquid cell (scanning) transmission electron microscopy (S)TEM have enabled in situ nanoscale investigations of controlled nanocrystal growth mechanisms. Here, we experimentally and quantitatively investigated the nucleation and growth mechanisms of Pt nanostructures from an aqueous solution of K2PtCl6. Averaged statistical, network, and local approaches have been used for the data analysis and the description of both collective particle dynamics and local growth features. In particular, interaction between neighboring particles has been revealed and attributed to reduction of the platinum concentration in the vicinity of the particle boundary. The local approach for solving the inverse problem showed that particle dynamics can be simulated by a stationary diffusional model. The obtained results are important for understanding nanocrystal formation and growth processes and for optimization of synthesis conditions.

  6. [Acoustic and aerodynamic characteristics of the oesophageal voice].

    PubMed

    Vázquez de la Iglesia, F; Fernández González, S

    2005-12-01

    The aim of the study is to determine the physiology and pathophysiology of the esophageal voice according to objective aerodynamic and acoustic parameters (quantitative and qualitative). Our subjects comprised 33 laryngectomized patients (all male) who underwent an aerodynamic, acoustic and perceptual protocol. There is a statistical association between qualitative acoustic and aerodynamic parameters (phonation flow chart type, sound spectrum, perceptual analysis) and quantitative parameters (neoglottic pressure, phonation flow, phonation time, fundamental frequency, maximum sound intensity level, speech rate). Nevertheless, such observations do not always translate into practical resources for clinical practice. We consider that the findings may enable us to add, pragmatically, new resources for more effective vocal rehabilitation of these patients. The physiology of the esophageal voice is well characterized by the method we have applied, which also supports rehabilitation and the improvement of oral communication skills in the laryngectomee population.

  7. Camera, Hand Lens, and Microscope Probe (CHAMP): An Instrument Proposed for the 2009 MSL Rover Mission

    NASA Technical Reports Server (NTRS)

    Mungas, Greg S.; Beegle, Luther W.; Boynton, John E.; Lee, Pascal; Shidemantle, Ritch; Fisher, Ted

    2004-01-01

    The Camera, Hand Lens, and Microscope Probe (CHAMP) will allow examination of martian surface features and materials (terrain, rocks, soils, samples) on spatial scales ranging from kilometers to micrometers, thus enabling both microscopy and context imaging with high operational flexibility. CHAMP is designed to allow the detailed and quantitative investigation of a wide range of geologic features and processes on Mars, leading to a better quantitative understanding of the evolution of the martian surface environment through time. In particular, CHAMP will provide key data that will help understand the local region explored by the Mars Science Laboratory (MSL) as a potential habitat for life. CHAMP will also support other anticipated MSL investigations, in particular by helping identify and select the highest priority targets for sample collection and analysis by the MSL's analytical suite.

  8. In Situ Quantification of [Re(CO) 3] + by Fluorescence Spectroscopy in Simulated Hanford Tank Waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branch, Shirmir D.; French, Amanda D.; Lines, Amanda M.

    A pretreatment protocol is presented that allows for the quantitative conversion and subsequent in situ spectroscopic analysis of [Re(CO)3]+ species in simulated Hanford tank waste. The protocol encompasses adding a simulated waste sample containing the non-emissive [Re(CO)3]+ species to a developer solution that enables the rapid, quantitative conversion of the non-emissive species to a luminescent species which can then be detected spectroscopically. The [Re(CO)3]+ species concentration in an alkaline, simulated Hanford tank waste supernatant can be quantified by the standard addition method. In a test case, the [Re(CO)3]+ species was measured to be at a concentration of 38.9 µM, which was a difference of 2.01% from the actual concentration of 39.7 µM.
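
    For reference, the standard addition calculation mentioned in the record amounts to fitting the measured signal against the added analyte concentration and reading the unknown concentration from the magnitude of the x-intercept. A toy sketch with made-up numbers (not data from the study) follows.

      # Minimal sketch of quantification by the method of standard additions.
      # The spike levels and signals below are illustrative placeholders.
      import numpy as np

      added_uM = np.array([0.0, 10.0, 20.0, 30.0, 40.0])      # spiked [Re(CO)3]+ (uM)
      signal   = np.array([0.78, 0.98, 1.18, 1.38, 1.58])      # luminescence (a.u.)

      slope, intercept = np.polyfit(added_uM, signal, 1)
      # The x-intercept is -intercept/slope; the unknown concentration is its magnitude.
      unknown_uM = intercept / slope
      print(f"estimated concentration: {unknown_uM:.1f} uM")   # ~39 uM in this toy example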

  9. Microanalysis of plant cell wall polysaccharides.

    PubMed

    Obel, Nicolai; Erben, Veronika; Schwarz, Tatjana; Kühnel, Stefan; Fodor, Andrea; Pauly, Markus

    2009-09-01

    Oligosaccharide Mass Profiling (OLIMP) allows a fast and sensitive assessment of cell wall polymer structure when coupled with Matrix Assisted Laser Desorption Ionisation Time Of Flight Mass Spectrometry (MALDI-TOF MS). The short time required for sample preparation and analysis makes possible the study of a wide range of plant organs, revealing a high degree of heterogeneity in the substitution pattern of wall polymers such as the cross-linking glycan xyloglucan and the pectic polysaccharide homogalacturonan. The high sensitivity of MALDI-TOF allows the use of small amounts of samples, thus making it possible to investigate the wall structure of single cell types when material is collected by such methods as laser micro-dissection. As an example, the xyloglucan structure was analyzed in the following leaf cell types: the outer epidermis layer, the entire epidermis cell layer, palisade mesophyll cells, and vascular bundles. OLIMP is amenable to in situ wall analysis, where wall polymers are analyzed on unprepared plant tissue itself without first isolating cell walls. In addition, OLIMP enables analysis of wall polymers in Golgi-enriched fractions, the location of nascent matrix polysaccharide biosynthesis, enabling separation of the processes of wall biosynthesis versus post-deposition apoplastic metabolism. These new tools will make possible a semi-quantitative analysis of the cell wall at an unprecedented level.

  10. Miniaturized Battery-Free Wireless Systems for Wearable Pulse Oximetry.

    PubMed

    Kim, Jeonghyun; Gutruf, Philipp; Chiarelli, Antonio M; Heo, Seung Yun; Cho, Kyoungyeon; Xie, Zhaoqian; Banks, Anthony; Han, Seungyoung; Jang, Kyung-In; Lee, Jung Woo; Lee, Kyu-Tae; Feng, Xue; Huang, Yonggang; Fabiani, Monica; Gratton, Gabriele; Paik, Ungyu; Rogers, John A

    2017-01-05

    Development of unconventional technologies for wireless collection, storage and analysis of quantitative, clinically relevant information on physiological status is of growing interest. Soft, biocompatible systems are widely regarded as important because they facilitate mounting on external (e.g. skin) and internal (e.g. heart, brain) surfaces of the body. Ultra-miniaturized, lightweight and battery-free devices have the potential to establish complementary options in bio-integration, where chronic interfaces (i.e. months) are possible on hard surfaces such as the fingernails and the teeth, with negligible risk for irritation or discomfort. Here we report materials and device concepts for flexible platforms that incorporate advanced optoelectronic functionality for applications in wireless capture and transmission of photoplethysmograms, including quantitative information on blood oxygenation, heart rate and heart rate variability. Specifically, reflectance pulse oximetry in conjunction with near-field communication (NFC) capabilities enables operation in thin, miniaturized flexible devices. Studies of the material aspects associated with the body interface, together with investigations of the radio frequency characteristics, the optoelectronic data acquisition approaches and the analysis methods capture all of the relevant engineering considerations. Demonstrations of operation on various locations of the body and quantitative comparisons to clinical gold standards establish the versatility and the measurement accuracy of these systems, respectively.

  11. A soft, wearable microfluidic device for the capture, storage, and colorimetric sensing of sweat.

    PubMed

    Koh, Ahyeon; Kang, Daeshik; Xue, Yeguang; Lee, Seungmin; Pielak, Rafal M; Kim, Jeonghyun; Hwang, Taehwan; Min, Seunghwan; Banks, Anthony; Bastien, Philippe; Manco, Megan C; Wang, Liang; Ammann, Kaitlyn R; Jang, Kyung-In; Won, Phillip; Han, Seungyong; Ghaffari, Roozbeh; Paik, Ungyu; Slepian, Marvin J; Balooch, Guive; Huang, Yonggang; Rogers, John A

    2016-11-23

    Capabilities in health monitoring enabled by capture and quantitative chemical analysis of sweat could complement, or potentially obviate the need for, approaches based on sporadic assessment of blood samples. Established sweat monitoring technologies use simple fabric swatches and are limited to basic analysis in controlled laboratory or hospital settings. We present a collection of materials and device designs for soft, flexible, and stretchable microfluidic systems, including embodiments that integrate wireless communication electronics, which can intimately and robustly bond to the surface of the skin without chemical and mechanical irritation. This integration defines access points for a small set of sweat glands such that perspiration spontaneously initiates routing of sweat through a microfluidic network and set of reservoirs. Embedded chemical analyses respond in colorimetric fashion to markers such as chloride and hydronium ions, glucose, and lactate. Wireless interfaces to digital image capture hardware serve as a means for quantitation. Human studies demonstrated the functionality of this microfluidic device during fitness cycling in a controlled environment and during long-distance bicycle racing in arid, outdoor conditions. The results include quantitative values for sweat rate, total sweat loss, pH, and concentration of chloride and lactate. Copyright © 2016, American Association for the Advancement of Science.

  12. Electron paramagnetic resonance oximetry as a quantitative method to measure cellular respiration: a consideration of oxygen diffusion interference.

    PubMed

    Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L; Ilangovan, Govindasamy

    2006-12-15

    Electron paramagnetic resonance (EPR) oximetry is being widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO(2)-independent respiration at higher pO(2) ranges, pO(2)-dependent respiration at low pO(2) ranges, and a static equilibrium with no change in pO(2) at very low pO(2) values. The approach presented here enables one to analyze all three zones together, taking into account the progression of O(2) diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO(2) data. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is also demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters.

  13. Exploiting induced variation to dissect quantitative traits in barley.

    PubMed

    Druka, Arnis; Franckowiak, Jerome; Lundqvist, Udda; Bonar, Nicola; Alexander, Jill; Guzy-Wrobelska, Justyna; Ramsay, Luke; Druka, Ilze; Grant, Iain; Macaulay, Malcolm; Vendramin, Vera; Shahinnia, Fahimeh; Radovic, Slobodanka; Houston, Kelly; Harrap, David; Cardle, Linda; Marshall, David; Morgante, Michele; Stein, Nils; Waugh, Robbie

    2010-04-01

    The identification of genes underlying complex quantitative traits such as grain yield by means of conventional genetic analysis (positional cloning) requires the development of several large mapping populations. However, it is possible that phenotypically related, but more extreme, allelic variants generated by mutational studies could provide a means for more efficient cloning of QTLs (quantitative trait loci). In barley (Hordeum vulgare), with the development of high-throughput genome analysis tools, efficient genome-wide identification of genetic loci harbouring mutant alleles has recently become possible. Genotypic data from NILs (near-isogenic lines) that carry induced or natural variants of genes that control aspects of plant development can be compared with the location of QTLs to potentially identify candidate genes for development-related traits such as grain yield. As yield itself can be divided into a number of allometric component traits such as tillers per plant, kernels per spike and kernel size, mutant alleles that both affect these traits and are located within the confidence intervals for major yield QTLs may represent extreme variants of the underlying genes. In addition, the development of detailed comparative genomic models based on the alignment of a high-density barley gene map with the rice and sorghum physical maps, has enabled an informed prioritization of 'known function' genes as candidates for both QTLs and induced mutant genes.

  14. Flexible opto-electronics enabled microfluidics systems with cloud connectivity for point-of-care micronutrient analysis.

    PubMed

    Lee, Stephen; Aranyosi, A J; Wong, Michelle D; Hong, Ji Hyung; Lowe, Jared; Chan, Carol; Garlock, David; Shaw, Scott; Beattie, Patrick D; Kratochvil, Zachary; Kubasti, Nick; Seagers, Kirsten; Ghaffari, Roozbeh; Swanson, Christina D

    2016-04-15

    In developing countries, the deployment of medical diagnostic technologies remains a challenge because of infrastructural limitations (e.g. refrigeration, electricity), and paucity of health professionals, distribution centers and transportation systems. Here we demonstrate the technical development and clinical testing of a novel electronics enabled microfluidic paper-based analytical device (EE-μPAD) for quantitative measurement of micronutrient concentrations in decentralized, resource-limited settings. The system performs immune-detection using paper-based microfluidics, instrumented with flexible electronics and optoelectronic sensors in a mechanically robust, ultrathin format comparable in size to a credit card. Autonomous self-calibration, plasma separation, flow monitoring, timing and data storage enable multiple devices to be run simultaneously. Measurements are wirelessly transferred to a mobile phone application that geo-tags the data and transmits it to a remote server for real time tracking of micronutrient deficiencies. Clinical tests of micronutrient levels from whole blood samples (n=95) show comparable sensitivity and specificity to ELISA-based tests. These results demonstrate instantaneous acquisition and global aggregation of diagnostics data using a fully integrated point of care system that will enable rapid and distributed surveillance of disease prevalence and geographical progression. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Corra: Computational framework and tools for LC-MS discovery and targeted mass spectrometry-based proteomics

    PubMed Central

    Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D

    2008-01-01

    Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, and information input/output, and they do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, as well as statistical algorithms originally developed for microarray data analyses, making them appropriate for LC-MS data analysis. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. Conclusion The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345

  16. Combined Population Dynamics and Entropy Modelling Supports Patient Stratification in Chronic Myeloid Leukemia

    NASA Astrophysics Data System (ADS)

    Brehme, Marc; Koschmieder, Steffen; Montazeri, Maryam; Copland, Mhairi; Oehler, Vivian G.; Radich, Jerald P.; Brümmendorf, Tim H.; Schuppert, Andreas

    2016-04-01

    Modelling the parameters of multistep carcinogenesis is key for a better understanding of cancer progression, biomarker identification and the design of individualized therapies. Using chronic myeloid leukemia (CML) as a paradigm for hierarchical disease evolution we show that combined population dynamic modelling and CML patient biopsy genomic analysis enables patient stratification at unprecedented resolution. Linking CD34+ similarity as a disease progression marker to patient-derived gene expression entropy separated established CML progression stages and uncovered additional heterogeneity within disease stages. Importantly, our patient data informed model enables quantitative approximation of individual patients’ disease history within chronic phase (CP) and significantly separates “early” from “late” CP. Our findings provide a novel rationale for personalized and genome-informed disease progression risk assessment that is independent and complementary to conventional measures of CML disease burden and prognosis.
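
    Gene expression entropy of the kind used above as a progression marker is, at its simplest, the Shannon entropy of a normalized expression profile. The sketch below illustrates that calculation under the assumption of a simple sum-to-one normalization, which is not necessarily the authors' exact definition.

      # Illustrative sketch: Shannon entropy (bits) of a gene-expression profile.
      # The normalization to a probability distribution is an assumed convention.
      import numpy as np

      def expression_entropy(expression):
          """Shannon entropy (in bits) of a non-negative expression vector."""
          p = np.asarray(expression, dtype=float)
          p = p / p.sum()                      # normalize to a probability distribution
          p = p[p > 0]                         # 0*log(0) is taken as 0
          return float(-(p * np.log2(p)).sum())

      print(expression_entropy([5.0, 5.0, 5.0, 5.0]))   # maximal: 2.0 bits for 4 genes
      print(expression_entropy([20.0, 0.0, 0.0, 0.0]))  # minimal: 0.0 bits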

  17. Real medical benefit assessed by indirect comparison.

    PubMed

    Falissard, Bruno; Zylberman, Myriam; Cucherat, Michel; Izard, Valérie; Meyer, François

    2009-01-01

    Frequently, in data packages submitted for Marketing Approval to the CHMP, there is a lack of relevant head-to-head comparisons of medicinal products that could enable national authorities responsible for the approval of reimbursement to assess the Added Therapeutic Value (ASMR) of new clinical entities or line extensions of existing therapies. Indirect or mixed treatment comparisons (MTC) are methods stemming from the field of meta-analysis that have been designed to tackle this problem. Adjusted indirect comparisons, meta-regressions, mixed models, Bayesian network analyses pool results of randomised controlled trials (RCTs), enabling a quantitative synthesis. The REAL procedure, recently developed by the HAS (French National Authority for Health), is a mixture of an MTC and effect model based on expert opinions. It is intended to translate the efficacy observed in the trials into effectiveness expected in day-to-day clinical practice in France.

  18. Quantitative FE-EPMA measurement of formation and inhibition of carbon contamination on Fe for trace carbon analysis.

    PubMed

    Tanaka, Yuji; Yamashita, Takako; Nagoshi, Masayasu

    2017-04-01

    Hydrocarbon contamination introduced during point, line and map analyses in a field emission electron probe microanalysis (FE-EPMA) was investigated to enable reliable quantitative analysis of trace amounts of carbon in steels. The increment of contamination on pure iron in point analysis is proportional to the number of iterations of beam irradiation, but not to the accumulated irradiation time. A combination of a longer dwell time and single measurement with a liquid nitrogen (LN2) trap as an anti-contamination device (ACD) is sufficient for a quantitative point analysis. However, in line and map analyses, contamination increases with irradiation time in addition to the number of iterations, even though the LN2 trap and a plasma cleaner are used as ACDs. Thus, a shorter dwell time and single measurement are preferred for line and map analyses, although it is difficult to eliminate the influence of contamination. While ring-like contamination around the irradiation point grows during electron-beam irradiation, contamination at the irradiation point increases during blanking time after irradiation. This can explain the increment of contamination in iterative point analysis as well as in line and map analyses. Among the ACDs, which are tested in this study, specimen heating at 373 K has a significant contamination inhibition effect. This technique makes it possible to obtain line and map analysis data with minimum influence of contamination. The above-mentioned FE-EPMA data are presented and discussed in terms of the contamination-formation mechanisms and the preferable experimental conditions for the quantification of trace carbon in steels. © The Author 2016. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Elemental analysis of occupational and environmental lung diseases by electron probe microanalyzer with wavelength dispersive spectrometer.

    PubMed

    Takada, Toshinori; Moriyama, Hiroshi; Suzuki, Eiichi

    2014-01-01

    Occupational and environmental lung diseases are a group of pulmonary disorders caused by inhalation of harmful particles, mists, vapors or gases. Mineralogical analysis is not generally required in the diagnosis of most cases of these diseases. Apart from minerals that are encountered rarely or only in specific occupations, small quantities of mineral dusts are present in the healthy lung. As such when mineralogical analysis is required, quantitative or semi-quantitative methods must be employed. An electron probe microanalyzer with wavelength dispersive spectrometer (EPMA-WDS) enables analysis of human lung tissue for deposits of elements by both qualitative and semi-quantitative methods. Since 1993, we have analyzed 162 cases of suspected occupational and environmental lung diseases using an EPMA-WDS. Our institute has been accepting online requests for elemental analysis of lung tissue samples by EPMA-WDS since January 2011. Hard metal lung disease is an occupational interstitial lung disease that primarily affects workers exposed to the dust of tungsten carbide. The characteristic pathological findings of the disease are giant cell interstitial pneumonia (GIP) with centrilobular fibrosis, surrounded by mild alveolitis with giant cells within the alveolar space. EPMA-WDS analysis of biopsied lung tissue from patients with GIP has demonstrated that tungsten and/or cobalt is distributed in the giant cells and centrilobular fibrosing lesion in GIP. Pneumoconiosis, caused by amorphous silica, and acute interstitial pneumonia, associated with the giant tsunami, were also elementally analyzed by EPMA-WDS. The results suggest that commonly found elements, such as silicon, aluminum, and iron, may cause occupational and environmental lung diseases. Copyright © 2013 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.

  20. A Quantitative Analysis of Pulsed Signals Emitted by Wild Bottlenose Dolphins.

    PubMed

    Luís, Ana Rita; Couchinho, Miguel N; Dos Santos, Manuel E

    2016-01-01

    Common bottlenose dolphins (Tursiops truncatus) produce a wide variety of vocal emissions for communication and echolocation, of which the pulsed repertoire has been the most difficult to categorize. Packets of high-repetition, broadband pulses are still largely reported under a general designation of burst-pulses, and traditional attempts to classify these emissions rely mainly on their aural characteristics and on graphical aspects of spectrograms. Here, we present a quantitative analysis of pulsed signals emitted by wild bottlenose dolphins, in the Sado estuary, Portugal (2011-2014), and test the reliability of a traditional classification approach. Acoustic parameters (minimum frequency, maximum frequency, peak frequency, duration, repetition rate and inter-click-interval) were extracted from 930 pulsed signals, previously categorized using a traditional approach. Discriminant function analysis revealed a high reliability of the traditional classification approach (93.5% of pulsed signals were consistently assigned to their aurally based categories). According to the discriminant function analysis (Wilks' Λ = 0.11, F3, 2.41 = 282.75, P < 0.001), repetition rate is the feature that best enables the discrimination of different pulsed signals (structure coefficient = 0.98). Classification using hierarchical cluster analysis led to a similar categorization pattern: two main signal types with distinct magnitudes of repetition rate were clustered into five groups. The pulsed signals described here present significant differences in their time-frequency features, especially repetition rate (P < 0.001), inter-click-interval (P < 0.001) and duration (P < 0.001). We document the occurrence of a distinct signal type, short burst-pulses, and highlight the existence of a diverse repertoire of pulsed vocalizations emitted in graded sequences. The use of quantitative analysis of pulsed signals is essential to improve classifications and to better assess the contexts of emission, geographic variation and the functional significance of pulsed signals.
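
    As an illustration of the discriminant function analysis described above, the sketch below fits a linear discriminant to the six acoustic parameters named in the record; the data are synthetic and the two-class structure is invented for demonstration only, not a reproduction of the 930 measured signals.

      # Minimal sketch of a linear discriminant analysis of pulsed-signal parameters.
      # The simulated distributions and class labels are hypothetical.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)

      # Columns: min freq (kHz), max freq (kHz), peak freq (kHz), duration (s),
      #          repetition rate (pulses/s), inter-click interval (ms)
      def simulate(n, rep_rate_mean):
          return np.column_stack([
              rng.normal(5, 1, n), rng.normal(120, 10, n), rng.normal(60, 8, n),
              rng.normal(0.4, 0.1, n), rng.normal(rep_rate_mean, 30, n),
              rng.normal(1000.0 / rep_rate_mean, 2, n),
          ])

      X = np.vstack([simulate(100, 300), simulate(100, 700)])   # two signal classes
      y = np.array([0] * 100 + [1] * 100)

      lda = LinearDiscriminantAnalysis()
      lda.fit(X, y)
      print("classification accuracy:", lda.score(X, y))
      print("discriminant weights per feature:", lda.scalings_.ravel())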

  1. Quantitative detection method for Roundup Ready soybean in food using duplex real-time PCR MGB chemistry.

    PubMed

    Samson, Maria Cristina; Gullì, Mariolina; Marmiroli, Nelson

    2010-07-01

    Methodologies that enable the detection of genetically modified organisms (GMOs) (authorized and non-authorized) in food and feed strongly influence the potential for adequate updating and implementation of legislation together with labeling requirements. Quantitative polymerase chain reaction (qPCR) systems were designed to boost the sensitivity and specificity of the identification of GMOs in highly degraded DNA samples; however, such testing will become economically difficult to cope with due to increasing numbers of approved genetically modified (GM) lines. Multiplexing approaches are therefore in development to provide a cost-efficient solution. Construct-specific primers and a probe were developed for quantitative analysis of the Roundup Ready soybean (RRS) event glyphosate-tolerant soybean (GTS) 40-3-2. The lectin gene (Le1) was used as a reference gene, and its specificity was verified. RRS- and Le1-specific quantitative real-time PCR (qRTPCR) assays were optimized in a duplex platform that has been validated with respect to limit of detection (LOD) and limit of quantification (LOQ), as well as accuracy. The analysis of model processed food samples showed that DNA degradation has little or no adverse effect on the performance of the quantification assay. In this study, a duplex qRTPCR using TaqMan minor groove binder-non-fluorescent quencher (MGB-NFQ) chemistry was developed for specific detection and quantification of RRS event GTS 40-3-2 that can be used for practical monitoring in processed food products.
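
    A common way to turn duplex qPCR measurements like these into a GM content estimate is to convert each Ct value to a copy number via its target-specific standard curve and take the event/reference ratio. The sketch below illustrates that arithmetic with hypothetical standard-curve parameters and Ct values, not data from the study.

      # Illustrative sketch of duplex qPCR quantification: Ct -> copies via standard
      # curves for the RRS event and the Le1 reference gene, then a percentage ratio.
      # Slopes, intercepts and Ct values are made up for illustration.
      import numpy as np

      def copies_from_ct(ct, slope, intercept):
          """Standard curve of the form Ct = slope * log10(copies) + intercept."""
          return 10 ** ((ct - intercept) / slope)

      # Hypothetical curve parameters (ideal PCR efficiency gives slope ~ -3.32).
      rrs_slope, rrs_intercept = -3.35, 38.0
      le1_slope, le1_intercept = -3.30, 37.5

      ct_rrs, ct_le1 = 29.5, 24.1            # example Cts measured in one food sample

      rrs_copies = copies_from_ct(ct_rrs, rrs_slope, rrs_intercept)
      le1_copies = copies_from_ct(ct_le1, le1_slope, le1_intercept)
      print(f"GM content: {100.0 * rrs_copies / le1_copies:.2f} %")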

  2. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. The analysis comprised division of the surgical procedure, task progress analysis, and task efficiency analysis. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated in an experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
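
    The task-efficiency metrics described above (mean velocity and acceleration of the instrument, and an approximate ellipse of the location log) can be computed directly from navigation tracking data. The sketch below assumes a uniformly sampled (x, y, z) position log and a 1-sigma covariance ellipse, which are illustrative choices rather than the authors' exact definitions.

      # Sketch of task-efficiency metrics from a navigation log of instrument tip positions.
      # The log format (uniformly sampled x, y, z in millimetres) is an assumption.
      import numpy as np

      def motion_metrics(positions, dt):
          """positions: (N, 3) array of tip coordinates [mm]; dt: sample period [s]."""
          vel = np.diff(positions, axis=0) / dt
          acc = np.diff(vel, axis=0) / dt
          mean_speed = np.linalg.norm(vel, axis=1).mean()
          mean_accel = np.linalg.norm(acc, axis=1).mean()
          # Approximate ellipse of the working region from the covariance of x and y:
          # the semi-axes are the square roots of the eigenvalues (1-sigma ellipse).
          cov = np.cov(positions[:, :2].T)
          eigvals = np.linalg.eigvalsh(cov)
          ellipse_area = np.pi * np.sqrt(eigvals[0]) * np.sqrt(eigvals[1])
          return mean_speed, mean_accel, ellipse_area

      rng = np.random.default_rng(1)
      track = np.cumsum(rng.normal(0, 0.5, size=(600, 3)), axis=0)   # synthetic 3D track
      print(motion_metrics(track, dt=0.05))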

  3. Full toroidal imaging of non-axisymmetric plasma material interaction in the National Spherical Torus Experiment divertor.

    PubMed

    Scotti, Filippo; Roquemore, A L; Soukhanovskii, V A

    2012-10-01

    A pair of two dimensional fast cameras with a wide angle view (allowing a full radial and toroidal coverage of the lower divertor) was installed in the National Spherical Torus Experiment in order to monitor non-axisymmetric effects. A custom polar remapping procedure and an absolute photometric calibration enabled the easier visualization and quantitative analysis of non-axisymmetric plasma material interaction (e.g., strike point splitting due to application of 3D fields and effects of toroidally asymmetric plasma facing components).

  4. Absolute Quantification of Selected Proteins in the Human Osteoarthritic Secretome

    PubMed Central

    Peffers, Mandy J.; Beynon, Robert J.; Clegg, Peter D.

    2013-01-01

    Osteoarthritis (OA) is characterized by a loss of extracellular matrix which is driven by catabolic cytokines. Proteomic analysis of the OA cartilage secretome enables the global study of secreted proteins. These are an important class of molecules with roles in numerous pathological mechanisms. Although cartilage studies have identified profiles of secreted proteins, quantitative proteomics techniques that would enable further biological questions to be addressed have yet to be widely implemented. To overcome this limitation, we used the secretome from human OA cartilage explants stimulated with IL-1β and compared proteins released into the media using a label-free LC-MS/MS-based strategy. We employed QconCAT technology to quantify specific proteins using selected reaction monitoring. A total of 252 proteins were identified, nine of which were differentially expressed upon IL-1β stimulation. Selected protein candidates were quantified in absolute amounts using QconCAT. These findings confirmed a significant reduction in TIMP-1 in the secretome following IL-1β stimulation. Label-free and QconCAT analysis produced equivocal results indicating no effect of cytokine stimulation on aggrecan, cartilage oligomeric matrix protein, fibromodulin, matrix metalloproteinases 1 and 3 or plasminogen release. This study enabled comparative protein profiling and absolute quantification of proteins involved in molecular pathways pertinent to understanding the pathogenesis of OA. PMID:24132152
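
    The absolute quantification step with an isotope-labelled QconCAT standard reduces, per peptide, to scaling the light/heavy selected-reaction-monitoring peak-area ratio by the known amount of spiked standard. The sketch below shows that calculation with placeholder numbers, not data from the study.

      # Illustrative calculation behind absolute quantification with a labelled standard:
      # endogenous ("light") amount = (light/heavy peak-area ratio) * spiked heavy amount.
      light_area = 8.4e5          # SRM peak area of the endogenous peptide (placeholder)
      heavy_area = 2.1e5          # SRM peak area of the labelled QconCAT peptide (placeholder)
      spiked_fmol = 50.0          # known amount of heavy standard added to the sample

      endogenous_fmol = (light_area / heavy_area) * spiked_fmol
      print(f"endogenous peptide: {endogenous_fmol:.0f} fmol")   # 200 fmol in this example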

  5. Adduct ion-targeted qualitative and quantitative analysis of polyoxypregnanes by ultra-high pressure liquid chromatography coupled with triple quadrupole mass spectrometry.

    PubMed

    Wu, Xu; Zhu, Lin; Ma, Jiang; Ye, Yang; Lin, Ge

    2017-10-25

    Polyoxypregnane and its glycosides (POPs) are frequently present in plants of the Asclepiadaceae family, and have a variety of biological activities. There is a great need to comprehensively profile these phytochemicals and to quantify them for monitoring their contents in the herbs and the biological samples. However, POPs undergo extensive adduct ion formation in ESI-MS, which has posed a challenge for qualitative and quantitative analysis of POPs. In the present study, we took advantage of such extensive adduct ion formation to investigate the suitability of adduct ion-targeted analysis of POPs. For the qualitative analysis, we first demonstrated that the sodium and ammonium adduct ion-targeted product ion scans (PIS) provided adequate MS/MS fragmentations for structural characterization of POPs. Aided with precursor ion (PI) scans, which showed high selectivity and sensitivity and improved peak assignment confidence in conjunction with full scan (FS), the informative adduct ion-targeted PIS enabled rapid POPs profiling. For the quantification, we used formic acid rather than ammonium acetate as an additive in the mobile phase to avoid simultaneous formation of sodium and ammonium adduct ions, and greatly improved the reproducibility of the MS response of POPs. By monitoring the solely formed sodium adduct ions [M+Na]+, a method for simultaneous quantification of 25 POPs in the dynamic multiple reaction monitoring mode was then developed and validated. Finally, the aforementioned methods were applied to qualitative and quantitative analysis of POPs in the extract of a traditional Chinese medicinal herb, Marsdenia tenacissima (Roxb.) Wight et Arn., and in the plasma obtained from the rats treated with this herb. The results demonstrated that adduct ion formation could be optimized for the qualitative and quantitative analysis of POPs, and our developed PI/FS-PIS scanning and sole [M+Na]+ ion monitoring significantly improved the analysis of POPs in both herbal and biological samples. This study also provides implications for the analysis of other compounds which undergo extensive adduct ion formation in ESI-MS. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. On the Distinction Between Quantitative and Qualitative Research.

    ERIC Educational Resources Information Center

    Smith, P. L.

    Quantitative and qualitative research are differing modes of measurement, one using numbers and the other not. The assignment of numerals to represent properties enables a researcher to distinguish minutely between different properties. The major issue dividing these approaches to empirical research represents a philosophical dispute which has…

  7. Quantitative single-molecule imaging by confocal laser scanning microscopy.

    PubMed

    Vukojevic, Vladana; Heidkamp, Marcus; Ming, Yu; Johansson, Björn; Terenius, Lars; Rigler, Rudolf

    2008-11-25

    A new approach to quantitative single-molecule imaging by confocal laser scanning microscopy (CLSM) is presented. It relies on fluorescence intensity distribution to analyze the molecular occurrence statistics captured by digital imaging and enables direct determination of the number of fluorescent molecules and their diffusion rates without resorting to temporal or spatial autocorrelation analyses. Digital images of fluorescent molecules were recorded by using fast scanning and avalanche photodiode detectors. In this way the signal-to-background ratio was significantly improved, enabling direct quantitative imaging by CLSM. The potential of the proposed approach is demonstrated by using standard solutions of fluorescent dyes, fluorescently labeled DNA molecules, quantum dots, and the Enhanced Green Fluorescent Protein in solution and in live cells. The method was verified by using fluorescence correlation spectroscopy. The relevance for biological applications, in particular, for live cell imaging, is discussed.

  8. Human genomic DNA quantitation system, H-Quant: development and validation for use in forensic casework.

    PubMed

    Shewale, Jaiprakash G; Schneida, Elaine; Wilson, Jonathan; Walker, Jerilyn A; Batzer, Mark A; Sinha, Sudhir K

    2007-03-01

    The human DNA quantification (H-Quant) system, developed for use in human identification, enables quantitation of human genomic DNA in biological samples. The assay is based on real-time amplification of AluYb8 insertions in hominoid primates. The relatively high copy number of subfamily-specific Alu repeats in the human genome enables quantification of very small amounts of human DNA. The oligonucleotide primers present in H-Quant are specific for human DNA and closely related great apes. During the real-time PCR, the SYBR Green I dye binds to the DNA that is synthesized by the human-specific AluYb8 oligonucleotide primers. The fluorescence of the bound SYBR Green I dye is measured at the end of each PCR cycle. The cycle at which the fluorescence crosses the chosen threshold correlates to the quantity of amplifiable DNA in that sample. The minimal sensitivity of the H-Quant system is 7.6 pg/microL of human DNA. The amplicon generated in the H-Quant assay is 216 bp, which is within the same range of the common amplifiable short tandem repeat (STR) amplicons. This size amplicon enables quantitation of amplifiable DNA as opposed to a quantitation of degraded or nonamplifiable DNA of smaller sizes. Development and validation studies were performed on the 7500 real-time PCR system following the Quality Assurance Standards for Forensic DNA Testing Laboratories.

  9. Catheter Insertion Reference Trajectory Construction Method Using Photoelastic Stress Analysis for Quantification of Respect for Tissue During Endovascular Surgery Simulation

    NASA Astrophysics Data System (ADS)

    Tercero, Carlos; Ikeda, Seiichi; Fukuda, Toshio; Arai, Fumihito; Negoro, Makoto; Takahashi, Ikuo

    2011-10-01

    There is a need to develop quantitative evaluation for simulator-based training in medicine. Photoelastic stress analysis can be used in human tissue modeling materials; this enables the development of simulators that measure respect for tissue. To apply this to endovascular surgery, we first present a model of a saccular aneurysm in which the stress variation during micro-coil deployment is measured; then, relying on a bi-planar vision system, we measure a catheter trajectory and compare it to a reference trajectory that accounts for respect for tissue. New photoelastic tissue modeling materials will expand the applications of this technology to other medical training domains.

  10. Wide-Field Imaging of Single-Nanoparticle Extinction with Sub-nm2 Sensitivity

    NASA Astrophysics Data System (ADS)

    Payne, Lukas M.; Langbein, Wolfgang; Borri, Paola

    2018-03-01

    We report on a highly sensitive wide-field imaging technique for quantitative measurement of the optical extinction cross section σext of single nanoparticles. The technique is simple and high speed, and it enables the simultaneous acquisition of hundreds of nanoparticles for statistical analysis. Using rapid referencing, fast acquisition, and a deconvolution analysis, a shot-noise-limited sensitivity down to 0.4 nm2 is achieved. Measurements on a set of individual gold nanoparticles of 5 nm diameter using this method yield σext = (10.0 ± 3.1) nm2, which is consistent with theoretical expectations and well above the background fluctuations of 0.9 nm2.
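
    For orientation, a basic estimator behind referenced wide-field extinction imaging is the pixel area multiplied by the summed extinction (1 - transmission) over a region of interest around the particle. The sketch below illustrates that estimator on synthetic images; it omits the rapid referencing and deconvolution steps that give the sensitivity reported in the record, and the image values and pixel size are assumptions.

      # Minimal sketch of an extinction cross-section estimate from referenced images:
      # sigma_ext ~ pixel_area * sum(1 - I_sample / I_reference) over a particle ROI.
      import numpy as np

      def extinction_cross_section(i_sample, i_reference, pixel_area_nm2, roi):
          """roi: (row_slice, col_slice) selecting the region around one particle."""
          transmission = i_sample[roi] / i_reference[roi]
          return pixel_area_nm2 * np.sum(1.0 - transmission)

      # Synthetic example: a particle removing ~10 nm^2 from a 24 nm/pixel image.
      pixel_area = 24.0 ** 2
      i_ref = np.full((64, 64), 1000.0)
      i_smp = i_ref.copy()
      i_smp[31:33, 31:33] -= 1000.0 * (10.0 / (4 * pixel_area))   # dip spread over 4 pixels
      roi = (slice(24, 40), slice(24, 40))
      print(extinction_cross_section(i_smp, i_ref, pixel_area, roi))   # ~10 nm^2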

  11. Optical computed tomography for spatially isotropic four-dimensional imaging of live single cells

    PubMed Central

    Kelbauskas, Laimonas; Shetty, Rishabh; Cao, Bin; Wang, Kuo-Chen; Smith, Dean; Wang, Hong; Chao, Shi-Hui; Gangaraju, Sandhya; Ashcroft, Brian; Kritzer, Margaret; Glenn, Honor; Johnson, Roger H.; Meldrum, Deirdre R.

    2017-01-01

    Quantitative three-dimensional (3D) computed tomography (CT) imaging of living single cells enables orientation-independent morphometric analysis of the intricacies of cellular physiology. Since its invention, x-ray CT has become indispensable in the clinic for diagnostic and prognostic purposes due to its quantitative absorption-based imaging in true 3D that allows objects of interest to be viewed and measured from any orientation. However, x-ray CT has not been useful at the level of single cells because there is insufficient contrast to form an image. Recently, optical CT has been developed successfully for fixed cells, but this technology called Cell-CT is incompatible with live-cell imaging due to the use of stains, such as hematoxylin, that are not compatible with cell viability. We present a novel development of optical CT for quantitative, multispectral functional 4D (three spatial + one spectral dimension) imaging of living single cells. The method applied to immune system cells offers truly isotropic 3D spatial resolution and enables time-resolved imaging studies of cells suspended in aqueous medium. Using live-cell optical CT, we found a heterogeneous response to mitochondrial fission inhibition in mouse macrophages and differential basal remodeling of small (0.1 to 1 fl) and large (1 to 20 fl) nuclear and mitochondrial structures on a 20- to 30-s time scale in human myelogenous leukemia cells. Because of its robust 3D measurement capabilities, live-cell optical CT represents a powerful new tool in the biomedical research field. PMID:29226240

  12. Electrochemical determination of microRNAs based on isothermal strand-displacement polymerase reaction coupled with multienzyme functionalized magnetic micro-carriers.

    PubMed

    Ma, Wen; Situ, Bo; Lv, Weifeng; Li, Bo; Yin, Xiaomao; Vadgama, Pankaj; Zheng, Lei; Wang, Wen

    2016-06-15

    MicroRNAs (miRNAs) show great potential for disease diagnostics due to their specific molecular profiles. Detection of miRNAs remains challenging and often requires sophisticated platforms. Here we report a multienzyme-functionalized magnetic microcarrier-assisted isothermal strand-displacement polymerase reaction (ISDPR) for quantitative detection of miRNAs. Magnetic micro-carriers (MMCs) were functionalized with molecular beacons to enable miRNA recognition and magnetic separation. The target miRNAs triggered a phi29-mediated ISDPR, which can produce biotin-modified sequences on the MMCs. Streptavidin-alkaline phosphatase was then conjugated to the MMC surface through biotin-streptavidin interactions. In the presence of 2-phospho-L-ascorbic acid, miRNAs were quantitatively determined on a screen-printed carbon electrode from the anodic current of the enzymatic product. We show that this method enables detection of miRNAs at concentrations as low as 9 fM and allows the discrimination of a one-base mismatched sequence. The proposed method was also successfully applied to analyze miRNAs in clinical tumor samples. This paper reports a new strategy for miRNA analysis with high sensitivity, simplicity, and low cost. It would be particularly useful for rapid point-of-care testing of miRNAs in the clinical laboratory. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Unusual Stability of Messenger RNA in Snake Venom Reveals Gene Expression Dynamics of Venom Replenishment

    PubMed Central

    Currier, Rachel B.; Calvete, Juan J.; Sanz, Libia; Harrison, Robert A.; Rowley, Paul D.; Wagstaff, Simon C.

    2012-01-01

    Venom is a critical evolutionary innovation enabling venomous snakes to become successful limbless predators; it is therefore vital that venomous snakes possess a highly efficient venom production and delivery system to maintain their predatory arsenal. Here, we exploit the unusual stability of messenger RNA in venom to conduct, for the first time, quantitative PCR to characterise the dynamics of gene expression of newly synthesised venom proteins following venom depletion. Quantitative PCR directly from venom enables real-time dynamic studies of gene expression in the same animals because it circumvents the conventional requirement to sacrifice snakes to extract mRNA from dissected venom glands. Using qPCR and proteomic analysis, we show that gene expression and protein re-synthesis triggered by venom expulsion peaks between days 3–7 of the cycle of venom replenishment, with different protein families expressed in parallel. We demonstrate that venom re-synthesis occurs very rapidly following depletion of venom stores, presumably to ensure venomous snakes retain their ability to efficiently predate and remain defended from predators. The stability of mRNA in venom is biologically fascinating, and could significantly empower venom research by expanding opportunities to produce transcriptomes from historical venom stocks and rare or endangered venomous species, for new therapeutic, diagnostic and evolutionary studies. PMID:22879897

  14. Quantitative characterization of nanoparticle agglomeration within biological media

    NASA Astrophysics Data System (ADS)

    Hondow, Nicole; Brydson, Rik; Wang, Peiyi; Holton, Mark D.; Brown, M. Rowan; Rees, Paul; Summers, Huw D.; Brown, Andy

    2012-07-01

    Quantitative analysis of nanoparticle dispersion state within biological media is essential to understanding cellular uptake and the roles of diffusion, sedimentation, and endocytosis in determining nanoparticle dose. The dispersion of polymer-coated CdTe/ZnS quantum dots in water and cell growth medium with and without fetal bovine serum was analyzed by transmission electron microscopy (TEM) and dynamic light scattering (DLS) techniques. Characterization by TEM of samples prepared by plunge freezing the blotted solutions into liquid ethane was sensitive to the dispersion state of the quantum dots and enabled measurement of agglomerate size distributions even in the presence of serum proteins where DLS failed. In addition, TEM showed a reduced packing fraction of quantum dots per agglomerate when dispersed in biological media and serum compared to just water, highlighting the effect of interactions between the media, serum proteins, and the quantum dots. The identification of a heterogeneous distribution of quantum dots and quantum dot agglomerates in cell growth medium and serum by TEM will enable correlation with the previously reported optical metrology of in vitro cellular uptake of this quantum dot dispersion. In this paper, we present a comparative study of TEM and DLS and show that plunge-freeze TEM provides a robust assessment of nanoparticle agglomeration state.

  15. Multifactorial Optimization of Contrast-Enhanced Nanofocus Computed Tomography for Quantitative Analysis of Neo-Tissue Formation in Tissue Engineering Constructs.

    PubMed

    Sonnaert, Maarten; Kerckhofs, Greet; Papantoniou, Ioannis; Van Vlierberghe, Sandra; Boterberg, Veerle; Dubruel, Peter; Luyten, Frank P; Schrooten, Jan; Geris, Liesbet

    2015-01-01

    To progress the fields of tissue engineering (TE) and regenerative medicine, development of quantitative methods for non-invasive three dimensional characterization of engineered constructs (i.e. cells/tissue combined with scaffolds) becomes essential. In this study, we have defined the optimal staining conditions for contrast-enhanced nanofocus computed tomography for three dimensional visualization and quantitative analysis of in vitro engineered neo-tissue (i.e. extracellular matrix containing cells) in perfusion bioreactor-developed Ti6Al4V constructs. A fractional factorial 'design of experiments' approach was used to elucidate the influence of the staining time and concentration of two contrast agents (Hexabrix and phosphotungstic acid) and the neo-tissue volume on the image contrast and dataset quality. Additionally, the neo-tissue shrinkage that was induced by phosphotungstic acid staining was quantified to determine the operating window within which this contrast agent can be accurately applied. For Hexabrix the staining concentration was the main parameter influencing image contrast and dataset quality. Using phosphotungstic acid the staining concentration had a significant influence on the image contrast while both staining concentration and neo-tissue volume had an influence on the dataset quality. The use of high concentrations of phosphotungstic acid did however introduce significant shrinkage of the neo-tissue, indicating that, despite sub-optimal image contrast, low concentrations of this staining agent should be used to enable quantitative analysis. To conclude, design of experiments allowed us to define the optimal staining conditions for contrast-enhanced nanofocus computed tomography to be used as a routine screening tool of neo-tissue formation in Ti6Al4V constructs, transforming it into a robust three dimensional quality control methodology.

  16. Quantitative analysis of fragrance in selectable one dimensional or two dimensional gas chromatography-mass spectrometry with simultaneous detection of multiple detectors in single injection.

    PubMed

    Tan, Hui Peng; Wan, Tow Shi; Min, Christina Liew Shu; Osborne, Murray; Ng, Khim Hui

    2014-03-14

    A selectable one-dimensional (¹D) or two-dimensional (²D) gas chromatography-mass spectrometry (GC-MS) system coupled with a flame ionization detector (FID) and an olfactory detection port (ODP) was employed in this study to analyze perfume oil and fragrance in shower gel. A split/splitless (SSL) injector and a programmable temperature vaporization (PTV) injector are connected via a 2-way splitter of capillary flow technology (CFT) in this selectable ¹D/²D GC-MS/FID/ODP system to facilitate liquid sample injections and thermal desorption (TD) for the stir bar sorptive extraction (SBSE) technique, respectively. The dual-linked injector set-up enables the use of two different injector ports (one at a time) in a single sequence run without having to relocate the ¹D capillary column from one inlet to another. Target analytes were separated by ¹D GC-MS/FID/ODP, with co-eluting mixtures from ¹D further separated by ²D GC-MS/FID/ODP in a single injection without any instrumental reconfiguration. A ¹D/²D quantitative analysis method was developed and validated for its repeatability (retention time, tR), calculated linear retention indices (LRI), response ratio in both MS and FID signals, limit of detection (LOD), limit of quantitation (LOQ), and linearity over a concentration range. The method was successfully applied to quantitative analysis of perfume solutions at different concentration levels (RSD≤0.01%, n=5) and shower gel spiked with perfume at different dosages (RSD≤0.04%, n=5), with good recovery (96-103% for SSL injection; 94-107% for stir bar sorptive extraction-thermal desorption (SBSE-TD)). Copyright © 2014 Elsevier B.V. All rights reserved.
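
    As a side note on the validation figures quoted above, a minimal sketch of one common (ICH-style) way to estimate LOD and LOQ from a calibration line is given below. The concentrations and responses are invented, and the 3.3σ/S and 10σ/S factors are the generic convention rather than necessarily the authors' exact procedure.

```python
import numpy as np

# Hypothetical calibration data: spiked concentration (µg/mL) and detector response
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
resp = np.array([0.82, 4.1, 8.3, 16.2, 41.5, 82.9])

# Least-squares calibration line: response = slope * conc + intercept
slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # residual standard deviation (n - 2 dof)

# Generic ICH-style estimates from the calibration residuals
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r2 = np.corrcoef(conc, resp)[0, 1] ** 2  # linearity check

print(f"slope={slope:.3f}, LOD={lod:.3f}, LOQ={loq:.3f}, R^2={r2:.4f}")
```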

  17. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, the i-QTaG using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of ¹³C₆/¹²C₆-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R² > 0.99) with the amount of the bovine fetuin glycoproteins. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.

  18. Data-Driven Surface Traversability Analysis for Mars 2020 Landing Site Selection

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Rothrock, Brandon; Almeida, Eduardo; Ansar, Adnan; Otero, Richard; Huertas, Andres; Heverly, Matthew

    2015-01-01

    The objective of this paper is three-fold: 1) to describe the engineering challenges in the surface mobility of the Mars 2020 Rover mission that are considered in the landing site selection process, 2) to introduce new automated traversability analysis capabilities, and 3) to present the preliminary analysis results for top candidate landing sites. The analysis capabilities presented in this paper include automated terrain classification, automated rock detection, digital elevation model (DEM) generation, and multi-ROI (region of interest) route planning. These analysis capabilities make it possible to fully utilize the vast volume of high-resolution orbiter imagery, to quantitatively evaluate surface mobility requirements for each candidate site, and to reduce subjectivity in the comparison between sites in terms of engineering considerations. The analysis results supported the discussion in the Second Landing Site Workshop held in August 2015, which resulted in selecting eight candidate sites that will be considered in the third workshop.
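
    One ingredient of this kind of terrain screening is deriving a slope map from the DEM and flagging cells that exceed a rover slope limit. The sketch below is only an illustration of that step, assuming a hypothetical 1 m/pixel DEM tile and an invented 25° threshold; it is not the mission's actual analysis pipeline.

```python
import numpy as np

def slope_map(dem, cell_size_m):
    """Slope (degrees) of each DEM cell from central-difference gradients."""
    dzdy, dzdx = np.gradient(dem, cell_size_m)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Hypothetical 1 m/pixel DEM tile (elevation in metres), synthesized for illustration
rng = np.random.default_rng(0)
dem = np.cumsum(rng.normal(0.0, 0.05, (100, 100)), axis=0)

slopes = slope_map(dem, cell_size_m=1.0)
traversable = slopes < 25.0            # assumed rover slope limit, illustration only
print(f"{traversable.mean():.1%} of cells below the slope threshold")
```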

  19. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
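
    To give a flavor of how sparse precursor data from several sources can be pooled, the sketch below uses a crude empirical-Bayes Gamma-Poisson stand-in: per-source event rates shrink toward a pooled estimate while retaining source-to-source variability. This is a simplification for illustration, not the hierarchical model of the study, and all counts and exposures are invented.

```python
import numpy as np

# Hypothetical precursor records per data source: (event count, exposure in facility-years)
counts   = np.array([2, 0, 5, 1, 3])
exposure = np.array([10.0, 4.0, 22.0, 6.0, 15.0])

# Method-of-moments Gamma(alpha, beta) prior fitted to the raw per-source rates
rates = counts / exposure
mean, var = rates.mean(), rates.var(ddof=1)
alpha, beta = mean**2 / var, mean / var        # shape, rate

# Conjugate Gamma-Poisson update per source: posteriors shrink toward the pooled rate
post_alpha = alpha + counts
post_beta  = beta + exposure
post_mean  = post_alpha / post_beta
samples = np.random.default_rng(1).gamma(post_alpha, 1.0 / post_beta, size=(10_000, len(counts)))
post_p95 = np.percentile(samples, 95, axis=0)

for i, (m, p) in enumerate(zip(post_mean, post_p95)):
    print(f"source {i}: posterior mean rate {m:.3f}/yr, 95th percentile {p:.3f}/yr")
```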

  20. Stereoselective Degradation and Molecular Ecological Mechanism of Chiral Pesticides Beta-Cypermethrin in Soils with Different pH Values.

    PubMed

    Yang, Zhong-Hua; Ji, Guo-Dong

    2015-12-15

    For decades, pesticides have been widely used for agricultural activities around the world, and the environmental problems caused by these compounds have raised widespread concern. However, the different enantioselective behaviors of chiral pesticide enantiomers are often ignored. Here, the selective degradation patterns and mechanisms of chiral pesticide enantiomers were successfully investigated for the first time in the soils of three cultivation areas with different pH values. Beta-cypermethrin was chosen as the target analyte. We found that the degradation rates of the four isomers of beta-cypermethrin were different. We used stepwise regression equations between degradation rates and functional genes to quantitatively study their relationships. Quantitative response analysis revealed that different isomers have different equations even under identical conditions. The results of path analysis showed that a single functional gene can make different direct and indirect contributions to the degradation of different isomers. Finally, high-throughput technology was used to analyze the genomes of the three tested soils, and the main microbial communities in them were then compared. We have successfully devised a method to investigate the molecular biological mechanisms of the selective degradation behavior of chiral compounds, thus enabling us to better understand these mechanisms.

  1. PeptideDepot: flexible relational database for visual analysis of quantitative proteomic data and integration of existing protein information.

    PubMed

    Yu, Kebing; Salomon, Arthur R

    2009-12-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.

  2. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  3. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  4. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375
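
    The allelic-imbalance test that QuASAR builds on can be illustrated with a plain binomial test of reference versus alternate read counts against the 1:1 null, as sketched below. The read counts and SNP identifiers are hypothetical, and the snippet deliberately omits QuASAR's modelling of genotype uncertainty, base-call error, and allelic over-dispersion.

```python
from scipy.stats import binomtest

# Hypothetical RNA-seq read counts at heterozygous SNPs: (reference reads, alternate reads)
sites = {"rs0001": (58, 42), "rs0002": (95, 31), "rs0003": (12, 14)}

for snp, (ref, alt) in sites.items():
    n = ref + alt
    test = binomtest(ref, n, p=0.5)    # null hypothesis: 1:1 allelic ratio
    print(f"{snp}: ref fraction {ref / n:.2f}, p = {test.pvalue:.3g}")
```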

  5. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.

  6. Chemometric analysis of correlations between electronic absorption characteristics and structural and/or physicochemical parameters for ampholytic substances of biological and pharmaceutical relevance.

    PubMed

    Judycka-Proma, U; Bober, L; Gajewicz, A; Puzyn, T; Błażejowski, J

    2015-03-05

    Forty ampholytic compounds of biological and pharmaceutical relevance were subjected to chemometric analysis based on unsupervised and supervised learning algorithms. This enabled relations to be found between empirical spectral characteristics derived from electronic absorption data and structural and physicochemical parameters predicted by quantum chemistry methods or phenomenological relationships based on additivity rules. It was found that the energies of long wavelength absorption bands are correlated through multiparametric linear relationships with parameters reflecting the bulkiness features of the absorbing molecules as well as their nucleophilicity and electrophilicity. These dependences enable the quantitative analysis of spectral features of the compounds, as well as a comparison of their similarities and certain pharmaceutical and biological features. Three QSPR models to predict the energies of long-wavelength absorption in buffers with pH=2.5 and pH=7.0, as well as in methanol, were developed and validated in this study. These models can be further used to predict the long-wavelength absorption energies of untested substances (if they are structurally similar to the training compounds). Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Techniques in helical scanning, dynamic imaging and image segmentation for improved quantitative analysis with X-ray micro-CT

    NASA Astrophysics Data System (ADS)

    Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim

    2014-04-01

    This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.

  8. Quantitative ion beam analysis of M-C-O systems: application to an oxidized uranium carbide sample

    NASA Astrophysics Data System (ADS)

    Martin, G.; Raveu, G.; Garcia, P.; Carlot, G.; Khodja, H.; Vickridge, I.; Barthe, M. F.; Sauvage, T.

    2014-04-01

    A large variety of materials contain both carbon and oxygen atoms, in particular oxidized carbides, carbon alloys (such as ZrC, UC, steels, etc.), and oxycarbide compounds (SiCO glasses, TiCO, etc.). Here a new ion beam analysis methodology is described which enables quantification of the elemental composition and the oxygen concentration profile over a few microns. It is based on two procedures. The first, relating to the experimental configuration, relies on a specific detection setup which is original in that it enables the separation of the carbon and oxygen NRA signals. The second concerns the data analysis procedure, i.e. the method for deriving the elemental composition from the particle energy spectrum. It is a generic algorithm and is here successfully applied to characterize an oxidized uranium carbide sample, developed as a potential fuel for generation IV nuclear reactors. Furthermore, a micro-beam was used to simultaneously determine the local elemental composition and oxygen concentration profiles over the first microns below the sample surface. This method is suited to the determination of the composition of MxCyOz compounds with a sensitivity for elemental atomic concentrations of around 1000 ppm.

  9. Quantitative image analysis for evaluating the coating thickness and pore distribution in coated small particles.

    PubMed

    Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K

    2009-04-01

    This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB image processing toolbox to images of coated particles taken with confocal laser scanning microscopy (CSLM). The coating thicknesses have been determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of porosity, thickness and pore size distribution of a coating. These parameters are considered important coating properties, which are critical to coating functionality. Additionally, the effect of coating process variations on coating quality can be straightforwardly assessed. By enabling a good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, which in turn leads to process tailoring.
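
    The original work is MATLAB-based and measures thickness along the particle perimeter; as a rough Python equivalent of the idea, the sketch below estimates local coating thickness from a segmented binary mask via the medial-axis distance transform. The synthetic annulus, the 0.2 µm pixel size, and the medial-axis approach itself are assumptions made for illustration only.

```python
import numpy as np
from skimage.morphology import medial_axis

# Hypothetical binary mask of a coating layer (synthetic annulus; 1 px assumed = 0.2 µm)
yy, xx = np.mgrid[:200, :200]
r = np.hypot(yy - 100, xx - 100)
coating = (r > 60) & (r < 72)

# Local half-thickness along the medial axis of the coating, doubled to full thickness
skeleton, dist = medial_axis(coating, return_distance=True)
thickness_um = 2.0 * dist[skeleton] * 0.2

print(f"min {thickness_um.min():.2f} µm, median {np.median(thickness_um):.2f} µm, "
      f"span {thickness_um.max() - thickness_um.min():.2f} µm")
```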

  10. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg.

  11. Quantitative Protein Localization Signatures Reveal an Association between Spatial and Functional Divergences of Proteins

    PubMed Central

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-01-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg. PMID:24603469

  12. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
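
    For readers unfamiliar with standard addition, the sketch below shows the basic calculation: regress the response on the added (spiked) concentration and read the endogenous level from the magnitude of the x-intercept, i.e. intercept/slope. All numbers are invented, and the snippet leaves out the parallelism diagnostics that the paper is actually concerned with.

```python
import numpy as np

# Hypothetical standard-addition series for one amino acid in plasma:
# added concentration (µM) and LC-MS/MS peak-area ratio (analyte / internal standard)
added = np.array([0.0, 25.0, 50.0, 100.0, 200.0])
response = np.array([0.41, 0.62, 0.83, 1.24, 2.07])

slope, intercept = np.polyfit(added, response, 1)

# The regression line crosses zero response at -C_endogenous
endogenous = intercept / slope
print(f"estimated endogenous concentration ≈ {endogenous:.1f} µM")
```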

  13. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  14. Quantitative characterization of surface topography using spectral analysis

    NASA Astrophysics Data System (ADS)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
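
    As a concrete reference for the quantity discussed above, the sketch below computes a one-sided 1D PSD of a topography line scan using one common normalization, C(q) = (Δx/N)|FFT(h)|²; normalization conventions differ between papers, and the test profile and pixel size here are fabricated.

```python
import numpy as np

def psd_1d(heights, dx):
    """One-sided 1D power spectral density of a topography line scan.

    heights : surface height samples (same length unit as dx)
    dx      : sample spacing
    Returns wavevectors q and C(q) = (dx / N) * |FFT|^2 (one common convention).
    """
    n = len(heights)
    h = heights - heights.mean()              # remove the mean height
    spectrum = np.fft.rfft(h)
    q = 2 * np.pi * np.fft.rfftfreq(n, d=dx)  # angular wavevector
    c = (dx / n) * np.abs(spectrum) ** 2
    return q[1:], c[1:]                       # drop the zero-frequency bin

# Hypothetical random-walk test profile (metres), 10 nm pixel size
rng = np.random.default_rng(2)
profile = np.cumsum(rng.normal(0, 1e-9, 4096))
q, c = psd_1d(profile, dx=10e-9)
print(q[:3], c[:3])
```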

  15. FreeWalker: a smart insole for longitudinal gait analysis.

    PubMed

    Wang, Baitong; Rajput, Kuldeep Singh; Tam, Wing-Kin; Tung, Anthony K H; Yang, Zhi

    2015-08-01

    Gait analysis is an important diagnostic measure to investigate the pattern of walking. Traditional gait analysis is generally carried out in a gait lab equipped with force and body-tracking sensors, and needs a trained medical professional to interpret the results. This procedure is tedious, expensive, and unreliable, and makes it difficult to track progress across multiple visits. In this paper, we present a smart insole called FreeWalker, which provides quantitative gait analysis outside the confines of a traditional lab, at low cost. The insole consists of eight pressure sensors and two motion tracking sensors, i.e. a 3-axis accelerometer and a 3-axis gyroscope. This enables measurement of under-foot pressure distribution and motion sequences in real time. The insole has an onboard SD card as well as wireless data transmission, which support continuous gait-cycle analysis. The data are then sent to a gateway for analysis and interpretation, using a user interface where gait features are graphically displayed. We also present validation results for the left foot of a subject who was asked to perform a specific task. Experimental results show that we could achieve a data-sampling rate of over 1 kHz, transmit data up to a distance of 20 meters, and maintain a battery life of around 24 hours. Taking advantage of these features, FreeWalker can be used in various applications, such as medical diagnosis, rehabilitation, sports, and entertainment.

  16. HoloMonitor M4: holographic imaging cytometer for real-time kinetic label-free live-cell analysis of adherent cells

    NASA Astrophysics Data System (ADS)

    Sebesta, Mikael; Egelberg, Peter J.; Langberg, Anders; Lindskov, Jens-Henrik; Alm, Kersti; Janicke, Birgit

    2016-03-01

    Live-cell imaging enables studying dynamic cellular processes that cannot be visualized in fixed-cell assays. An increasing number of scientists in academia and the pharmaceutical industry are choosing live-cell analysis over or in addition to traditional fixed-cell assays. We have developed a time-lapse, label-free imaging cytometer, HoloMonitor M4. HoloMonitor M4 assists researchers in overcoming inherent disadvantages of fluorescence analysis, specifically effects of chemical labels or genetic modifications which can alter cellular behavior. Additionally, label-free analysis is simple and eliminates the costs associated with staining procedures. The underlying technology principle is based on digital off-axis holography. While multiple alternatives exist for this type of analysis, we prioritized our developments to achieve the following: a) All-inclusive system - hardware and sophisticated cytometric analysis software; b) Ease of use enabling utilization of instrumentation by expert- and entry-level researchers alike; c) Validated quantitative assay end-points tracked over time such as optical path length shift, optical volume and multiple derived imaging parameters; d) Reliable digital autofocus; e) Robust long-term operation in the incubator environment; f) High throughput and walk-away capability; and finally g) Data management suitable for single- and multi-user networks. We provide examples of HoloMonitor applications in label-free cell viability measurements and monitoring of cell cycle phase distribution.

  17. Technical Note: Quantitative dynamic contrast-enhanced MRI of a 3-dimensional artificial capillary network.

    PubMed

    Gaass, Thomas; Schneider, Moritz Jörg; Dietrich, Olaf; Ingrisch, Michael; Dinkel, Julien

    2017-04-01

    Variability across devices, patients, and time still hinders widespread recognition of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) as a quantitative biomarker. The purpose of this work was to introduce and characterize a dedicated microchannel phantom as a model for quantitative DCE-MRI measurements. A perfusable, MR-compatible microchannel network was constructed on the basis of sacrificial melt-spun sugar fibers embedded in a block of epoxy resin. Structural analysis was performed on the basis of light microscopy images before DCE-MRI experiments. During dynamic acquisition the capillary network was perfused with a standard contrast agent injection system. Flow dependency, as well as inter- and intrascanner reproducibility of the computed DCE parameters, were evaluated using a 3.0 T whole-body MRI system. Semi-quantitative and quantitative flow-related parameters exhibited the expected proportionality to the set flow rate (mean Pearson correlation coefficient: 0.991, P < 2.5e-5). The volume fraction was approximately independent of changes in the applied flow rate through the phantom. Repeatability and reproducibility experiments yielded maximum intrascanner coefficients of variation (CV) of 4.6% for quantitative parameters. All evaluated parameters were well within the range of known in vivo results for the applied flow rates. The constructed phantom enables reproducible, flow-dependent, contrast-enhanced MR measurements with the potential to facilitate standardization and comparability of DCE-MRI examinations. © 2017 American Association of Physicists in Medicine.

  18. Development of quantitative security optimization approach for the picture archives and carrying system between a clinic and a rehabilitation center

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko

    2002-05-01

    The aim of our study is to analyze the level of necessary security requirements, to search for suitable security measures, and to optimize the distribution of security across every portion of the medical practice. Quantitative expression is introduced wherever possible, to enable simplified follow-up security procedures and easy evaluation of security outcomes. Using fault tree analysis (FTA), the system analysis showed that subdividing system elements into detailed groups results in a much more accurate analysis. Such subdivided composition factors depend greatly on staff behavior, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security and to propose security measures for each medical information system, as well as the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, the number of elements for each composition factor, and potential security measures were identified. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.
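
    To make the fault tree step concrete, the sketch below combines hypothetical basic-event probabilities through independent AND/OR gates to obtain a top-event probability. The events, numbers, and tree structure are invented for illustration and are not taken from the study.

```python
def and_gate(*p):
    """All child events must occur (independence assumed)."""
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):
    """At least one child event occurs (independence assumed)."""
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Hypothetical annual basic-event probabilities (illustration only)
p_weak_password   = 0.10
p_no_screen_lock  = 0.20
p_network_sniffed = 0.02
p_terminal_stolen = 0.01

# Top event: patient images disclosed, via a compromised terminal or the network path
p_terminal = and_gate(or_gate(p_weak_password, p_no_screen_lock), p_terminal_stolen)
p_top = or_gate(p_terminal, p_network_sniffed)
print(f"top-event probability ≈ {p_top:.4f} per year")
```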

  19. Transcriptional profiling of CD31(+) cells isolated from murine embryonic stem cells.

    PubMed

    Mariappan, Devi; Winkler, Johannes; Chen, Shuhua; Schulz, Herbert; Hescheler, Jürgen; Sachinidis, Agapios

    2009-02-01

    Identification of genes involved in endothelial differentiation is of great interest for the understanding of the cellular and molecular mechanisms involved in the development of new blood vessels. Mouse embryonic stem (mES) cells serve as a potential source of endothelial cells for transcriptomic analysis. We isolated endothelial cells from 8-day-old embryoid bodies by immuno-magnetic separation using platelet endothelial cell adhesion molecule-1 (also known as CD31) expressed on both early and mature endothelial cells. CD31(+) cells exhibit endothelial-like behavior by being able to incorporate DiI-labeled acetylated low-density lipoprotein as well as form tubular structures on matrigel. Quantitative and semi-quantitative PCR analysis further demonstrated the increased expression of endothelial transcripts. To ascertain the specific transcriptomic identity of the CD31(+) cells, large-scale microarray analysis was carried out. Comparative bioinformatic analysis reveals an enrichment of the gene ontology categories angiogenesis, blood vessel morphogenesis, vasculogenesis and blood coagulation in the CD31(+) cell population. Based on the transcriptomic signatures of the CD31(+) cells, we conclude that this ES cell-derived population contains endothelial-like cells expressing the mesodermal marker BMP2 and possessing angiogenic potential. The transcriptomic characterization of CD31(+) cells provides an in vitro functional genomic model for identifying genes required for angiogenesis.

  20. OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization

    NASA Astrophysics Data System (ADS)

    Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian

    2018-03-01

    OIPAV (Ophthalmic Images Processing, Analysis and Visualization) is a cross-platform software package which is specially oriented to ophthalmic images. It provides a wide range of functionalities including data I/O, image processing, interaction, ophthalmic disease detection, data analysis and visualization to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color fundus photographs. It enables users to easily access ophthalmic image data produced by different imaging devices, facilitates workflows for processing ophthalmic images, and improves quantitative evaluations. In this paper, we will present the system design and functional modules of the platform and demonstrate various applications. With satisfying functional scalability and expandability, we believe that the software can be widely applied in the ophthalmology field.

  1. Tracing the spatiotemporally resolved inactivation of optically arranged bacteria by photofunctional microparticles at the single-cell level (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Barroso Peña, Alvaro; Grüner, Malte; Forbes, Taylor; Denz, Cornelia; Strassert, Cristian A.

    2016-09-01

    Antimicrobial Photodynamic Inactivation (PDI) represents an attractive alternative in the treatment of infections by antibiotic-resistant pathogenic bacteria. In PDI a photosensitizer (PS) is administered to the site of the biological target in order to generate cytotoxic singlet oxygen, which reacts with the biological membrane upon application of harmless visible light. Established methods for testing the photoinduced cytotoxicity of PSs rely on the observation of the whole bacterial ensemble, providing only population-averaged information about the overall toxicity produced. However, for a deeper understanding of the processes that take place in PDI, new methods are required that provide simultaneous regulation of reactive oxygen species (ROS) production, monitoring of the subsequent damage induced in the bacterial cells, and full control of the distance between the bacteria and the center of singlet oxygen production. Herein we present a novel method that enables quantitative, spatiotemporally resolved analysis at the single-cell level of the photoinduced damage produced by transparent microspheres functionalized with PSs. For this purpose, a methodology was introduced to monitor phototriggered changes with spatiotemporal resolution employing holographic optical tweezers and functional fluorescence microscopy. The defined distance between the photoactive particles and individual bacteria can be fixed under the microscope before the photosensitization process, and the photoinduced damage is monitored by tracing the fluorescence turn-on of a suitable marker. Our methodology constitutes a new tool for the in vitro design and analysis of photosensitizers, as it enables quantitative evaluation of the response of living systems to oxidative stress.

  2. Quantitative Oxygenation Venography from MRI Phase

    PubMed Central

    Fan, Audrey P.; Bilgic, Berkin; Gagnon, Louis; Witzel, Thomas; Bhat, Himanshu; Rosen, Bruce R.; Adalsteinsson, Elfar

    2014-01-01

    Purpose To demonstrate acquisition and processing methods for quantitative oxygenation venograms that map in vivo oxygen saturation (SvO2) along cerebral venous vasculature. Methods Regularized quantitative susceptibility mapping (QSM) is used to reconstruct susceptibility values and estimate SvO2 in veins. QSM with ℓ1 and ℓ2 regularization are compared in numerical simulations of vessel structures with known magnetic susceptibility. Dual-echo, flow-compensated phase images are collected in three healthy volunteers to create QSM images. Bright veins in the susceptibility maps are vectorized and used to form a three-dimensional vascular mesh, or venogram, along which to display SvO2 values from QSM. Results Quantitative oxygenation venograms that map SvO2 along brain vessels of arbitrary orientation and geometry are shown in vivo. SvO2 values in major cerebral veins lie within the normal physiological range reported by 15O positron emission tomography. SvO2 from QSM is consistent with previous MR susceptometry methods for vessel segments oriented parallel to the main magnetic field. In vessel simulations, ℓ1 regularization results in less than 10% SvO2 absolute error across all vessel tilt orientations and provides more accurate SvO2 estimation than ℓ2 regularization. Conclusion The proposed analysis of susceptibility images enables reliable mapping of quantitative SvO2 along venograms and may facilitate clinical use of venous oxygenation imaging. PMID:24006229

  3. Portable, one-step, and rapid GMR biosensor platform with smartphone interface.

    PubMed

    Choi, Joohong; Gani, Adi Wijaya; Bechstein, Daniel J B; Lee, Jung-Rok; Utz, Paul J; Wang, Shan X

    2016-11-15

    Quantitative immunoassay tests in clinical laboratories require trained technicians, take hours to complete with multiple steps, and the instruments used are generally immobile; patient samples have to be sent in to the labs for analysis. This prevents quantitative immunoassay tests from being performed outside laboratory settings. A portable, quantitative immunoassay device will be valuable in rural and resource-limited areas, where access to healthcare is scarce or far away. We have invented the Eigen Diagnosis Platform (EDP), a portable quantitative immunoassay platform based on Giant Magnetoresistance (GMR) biosensor technology. The platform does not require a trained technician to operate, and only requires one-step user involvement. It displays quantitative results in less than 15 min after sample insertion, and each test costs less than US$4. The GMR biosensor employed in EDP is capable of detecting multiple biomarkers in one test, enabling a wide array of immune diagnostics to be performed simultaneously. In this paper, we describe the design of EDP, and demonstrate its capability. Multiplexed assay of human immunoglobulin G and M (IgG and IgM) antibodies with EDP achieves sensitivities down to 0.07 and 0.33 nanomolar, respectively. The platform will allow lab testing to be performed in remote areas, and open up applications of immunoassay testing in other non-clinical settings, such as home, school, and office. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    NASA Astrophysics Data System (ADS)

    Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
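
    The correction step described above, shifting a simulated basis signal so that its cross-correlation with the measured signal is maximal, can be sketched as follows. The spectra here are synthetic Lorentzian peaks, the normalization is a simplified global one, and the snippet illustrates the idea rather than the QM-QUEST implementation.

```python
import numpy as np

def best_shift(basis, measured):
    """Shift (in points) that maximizes a simplified normalized cross-correlation
    between a simulated basis spectrum and the measured spectrum."""
    b = (basis - basis.mean()) / (basis.std() * len(basis))
    m = (measured - measured.mean()) / measured.std()
    xcorr = np.correlate(m, b, mode="full")
    return np.argmax(xcorr) - (len(basis) - 1)

# Hypothetical magnitude spectra: a Lorentzian metabolite peak, measured 7 points away
axis = np.arange(1024)
peak = lambda c: 1.0 / (1.0 + ((axis - c) / 3.0) ** 2)
simulated = peak(500)
measured = peak(507) + np.random.default_rng(3).normal(0, 0.01, 1024)

shift = best_shift(simulated, measured)
print(f"apply a {shift}-point shift to the basis signal before quantitation")
```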

  5. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
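
    A minimal stand-in for the landscape-based cumulative effects model described here is an ordinary least-squares regression of a stream-condition metric on land-use gradients, which can then be used to score a hypothetical development scenario. The predictors, response values, and scenario below are all invented for illustration and do not come from the study.

```python
import numpy as np

# Hypothetical training data per site: % mined area, % residential area, road density (km/km²),
# and an observed index of biotic integrity (IBI)
X = np.array([
    [ 2.0, 1.0, 0.5],
    [10.0, 3.0, 1.2],
    [25.0, 2.0, 0.9],
    [ 5.0, 8.0, 2.1],
    [40.0, 1.0, 0.7],
    [15.0, 6.0, 1.8],
])
ibi = np.array([78.0, 65.0, 48.0, 60.0, 35.0, 52.0])

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, ibi, rcond=None)

# Score a candidate development scenario for a sub-watershed
scenario = np.array([1.0, 30.0, 5.0, 2.5])   # intercept, % mined, % residential, road density
print(f"predicted IBI under scenario: {scenario @ coef:.1f}")
```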

  6. Trends in fluorescence imaging and related techniques to unravel biological information.

    PubMed

    Haustein, Elke; Schwille, Petra

    2007-09-01

    Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work, as well as for cutting edge research, as the visual monitoring of life processes is still among the most compelling forms of evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snap-shot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics.

  7. Trends in fluorescence imaging and related techniques to unravel biological information

    PubMed Central

    Haustein, Elke; Schwille, Petra

    2007-01-01

    Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work, as well as for cutting edge research, as the visual monitoring of life processes is still among the most compelling forms of evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snap-shot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics. PMID:19404444

  8. Evaluation of a multiresidue method for measuring fourteen chemical groups of pesticides in water by use of LC-MS-MS.

    PubMed

    Carvalho, J J; Jerónimo, P C A; Gonçalves, C; Alpendurada, M F

    2008-11-01

    European Council Directive 98/83/EC on the quality of water intended for human consumption brought a new challenge for routine water-quality control laboratories, mainly regarding pesticide analysis. Under the guidelines of ISO/IEC 17025:2005, a multiresidue method was developed, validated, implemented in routine use, and studied with real samples during a one-year period. The proposed method enables routine laboratories to handle a large number of samples, since 28 pesticides of 14 different chemical groups can be quantitated in a single procedure. The method comprises a solid-phase extraction step and subsequent analysis by liquid chromatography-mass spectrometry (LC-MS-MS). The accuracy was established on the basis of participation in interlaboratory proficiency tests, with encouraging results (majority of |z-scores| <2), and the precision was consistently analysed over one year. The limits of quantitation (below 0.050 µg L⁻¹) are in agreement with the enforced threshold value for pesticides of 0.10 µg L⁻¹. Overall method performance is suitable for routine use according to accreditation rules, taking into account the data collected over one year.

  9. qFlow Cytometry-Based Receptoromic Screening: A High-Throughput Quantification Approach Informing Biomarker Selection and Nanosensor Development.

    PubMed

    Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I

    2017-01-01

    Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.
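
    For illustration, the calibration idea behind qFlow cytometry (converting measured fluorescence into absolute receptor numbers via microsphere standards of known antibody-binding capacity) can be sketched in Python by fitting a log-log calibration curve and applying it to per-cell readouts. All bead and cell values below are hypothetical placeholders, not data from the study.

        # Hedged sketch: convert per-cell fluorescence (MFI) to estimated receptor
        # numbers using a bead calibration curve. Values below are invented.
        import numpy as np

        bead_mfi = np.array([120.0, 1.1e3, 1.2e4, 9.5e4])   # measured bead intensities (hypothetical)
        bead_abc = np.array([480.0, 5.2e3, 6.1e4, 4.8e5])   # fluorophores per bead (hypothetical)

        # Log-log linear calibration: log10(ABC) = m * log10(MFI) + b
        m, b = np.polyfit(np.log10(bead_mfi), np.log10(bead_abc), 1)

        def mfi_to_receptors(cell_mfi, background_mfi=0.0):
            """Map background-corrected cell MFI to an estimated receptor count."""
            corrected = np.clip(np.asarray(cell_mfi, dtype=float) - background_mfi, 1e-9, None)
            return 10 ** (m * np.log10(corrected) + b)

        print(mfi_to_receptors([850.0, 2300.0, 15000.0], background_mfi=100.0))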

  10. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, both computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental solution methodologies, in the form of computational modeling, noninvasive optical techniques, and fringe-prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.

  11. Precision of dehydroascorbic acid quantitation with the use of the subtraction method--validation of HPLC-DAD method for determination of total vitamin C in food.

    PubMed

    Mazurek, Artur; Jamroz, Jerzy

    2015-04-15

    In food analysis, a method for determination of vitamin C should enable measurement of the total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA), because both chemical forms exhibit biological activity. The aim of the work was to confirm the applicability of an HPLC-DAD method for analysis of the total content of vitamin C (TC) and ascorbic acid in various types of food by determination of validation parameters such as selectivity, precision, accuracy, linearity, and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear and precise. The precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise, which follows directly from the assumptions of this method and the principles of uncertainty propagation. The proposed chromatographic method can be recommended for routine determinations of total vitamin C in various foods. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Characterization of Colloidal Quantum Dot Ligand Exchange by X-ray Photoelectron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Atewologun, Ayomide; Ge, Wangyao; Stiff-Roberts, Adrienne D.

    2013-05-01

    Colloidal quantum dots (CQDs) are chemically synthesized semiconductor nanoparticles with size-dependent wavelength tunability. Chemical synthesis of CQDs involves the attachment of long organic surface ligands to prevent aggregation; however, these ligands also impede charge transport. Therefore, it is beneficial to exchange longer surface ligands for shorter ones for optoelectronic devices. Typical characterization techniques used to analyze surface ligand exchange include Fourier-transform infrared spectroscopy, x-ray diffraction, transmission electron microscopy, and nuclear magnetic resonance spectroscopy, yet these techniques do not provide a simultaneously direct, quantitative, and sensitive method for evaluating surface ligands on CQDs. In contrast, x-ray photoelectron spectroscopy (XPS) can provide nanoscale sensitivity for quantitative analysis of CQD surface ligand exchange. A unique aspect of this work is that a fingerprint is identified for shorter surface ligands by resolving the regional XPS spectrum corresponding to different types of carbon bonds. In addition, a deposition technique known as resonant infrared matrix-assisted pulsed laser evaporation is used to improve the CQD film uniformity such that stronger XPS signals are obtained, enabling more accurate analysis of the ligand exchange process.

  13. Self-contained microfluidic systems: a review.

    PubMed

    Boyd-Moss, Mitchell; Baratchi, Sara; Di Venere, Martina; Khoshmanesh, Khashayar

    2016-08-16

    Microfluidic systems enable rapid diagnosis, screening and monitoring of diseases and health conditions using small amounts of biological samples and reagents. Despite these remarkable features, conventional microfluidic systems rely on bulky expensive external equipment, which hinders their utility as powerful analysis tools outside of research laboratories. 'Self-contained' microfluidic systems, which contain all necessary components to facilitate a complete assay, have been developed to address this limitation. In this review, we provide an in-depth overview of self-contained microfluidic systems. We categorise these systems based on their operating mechanisms into three major groups: passive, hand-powered and active. Several examples are provided to discuss the structure, capabilities and shortcomings of each group. In particular, we discuss the self-contained microfluidic systems enabled by active mechanisms, due to their unique capability for running multi-step and highly controllable diagnostic assays. Integration of self-contained microfluidic systems with the image acquisition and processing capabilities of smartphones, especially those equipped with accessory optical components, enables highly sensitive and quantitative assays, which are discussed. Finally, the future trends and possible solutions to expand the versatility of self-contained, stand-alone microfluidic platforms are outlined.

  14. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  15. Systems modelling methodology for the analysis of apoptosis signal transduction and cell death decisions.

    PubMed

    Rehm, Markus; Prehn, Jochen H M

    2013-06-01

    Systems biology and systems medicine, i.e. the application of systems biology in a clinical context, is becoming of increasing importance in biology, drug discovery and health care. Systems biology incorporates knowledge and methods that are applied in mathematics, physics and engineering, but may not be part of classical training in biology. We here provide an introduction to basic concepts and methods relevant to the construction and application of systems models for apoptosis research. We present the key methods relevant to the representation of biochemical processes in signal transduction models, with a particular reference to apoptotic processes. We demonstrate how such models enable a quantitative and temporal analysis of changes in molecular entities in response to an apoptosis-inducing stimulus, and provide information on cell survival and cell death decisions. We introduce methods for analyzing the spatial propagation of cell death signals, and discuss the concepts of sensitivity analyses that enable a prediction of network responses to disturbances of single or multiple parameters. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Computer modeling of pulsed CO2 lasers for lidar applications

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.; Smithers, Martin E.; Murty, Rom

    1991-01-01

    The experimental results will enable a comparison of the numerical code output with experimental data, thereby verifying the validity of the code. The measurements were made on a modified commercial CO2 laser. The results are as follows. (1) The dependence of pulse shape and energy on gas pressure was measured. (2) The intrapulse frequency chirp due to plasma and laser-induced medium perturbation effects was determined. A simple numerical model showed quantitative agreement with these measurements. The pulse-to-pulse frequency stability was also determined. (3) The dependence of the laser transverse mode stability on cavity length was measured. A simple analysis of this dependence in terms of changes to the equivalent Fresnel number and the cavity magnification was performed. (4) An analysis was made of the discharge pulse shape, which enabled the low efficiency of the laser to be explained in terms of poor coupling of the electrical energy into the vibrational levels. (5) The existing laser resonator code was changed to allow it to run on the Cray XMP under the new operating system.

  17. Enantiomeric separation of the antiuremic drug colchicine by electrokinetic chromatography. Method development and quantitative analysis.

    PubMed

    Menéndez-López, Nuria; Valimaña-Traverso, Jesús; Castro-Puyana, María; Salgado, Antonio; García, María Ángeles; Marina, María Luisa

    2017-05-10

    Two analytical methodologies were developed by CE enabling the enantiomeric separation of colchicine, an antiuremic drug commercialized as the pure enantiomer. Succinyl-γ-CD and Sulfated-γ-CD were selected as chiral selectors after a screening of different anionic CDs. Under the optimized conditions, chiral resolutions of 5.6 in 12 min and 3.2 in 8 min were obtained for colchicine with Succinyl-γ-CD and Sulfated-γ-CD, respectively. An opposite enantiomeric migration order was observed with these two CDs, with S-colchicine being the first-migrating enantiomer with Succinyl-γ-CD and the second-migrating enantiomer with Sulfated-γ-CD. 1H NMR experiments showed a 1:1 stoichiometry for the enantiomer-CD complexes in both cases. However, the apparent and averaged equilibrium constants for the enantiomer-CD complexes could be calculated only for Succinyl-γ-CD. The developed methods were applied to the analysis of pharmaceutical formulations, but only the use of Succinyl-γ-CD enabled detection of a 0.1% enantiomeric impurity in colchicine formulations. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.

    PubMed

    Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly

    2015-01-01

    Qualitative and quantitative measurements of protein abundance and modification states are essential for understanding protein functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per blotting cycle up to tenfold; allows concurrent measurement of the expression of up to nine different total and/or posttranslationally modified proteins from the same sample loading; and substantially improves data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and is therefore advantageous for biomedical diagnostics, systems biology, and cell signaling research.

  19. Semi-automated identification of cones in the human retina using circle Hough transform

    PubMed Central

    Bukowska, Danuta M.; Chew, Avenell L.; Huynh, Emily; Kashani, Irwin; Wan, Sue Ling; Wan, Pak Ming; Chen, Fred K

    2015-01-01

    A large number of human retinal diseases are characterized by a progressive loss of cones, the photoreceptors critical for visual acuity and color perception. Adaptive Optics (AO) imaging presents a potential method to study these cells in vivo. However, AO imaging in ophthalmology is relatively new, and quantitative analysis of these images remains difficult and tedious using manual methods. This paper illustrates a novel semi-automated quantitative technique enabling registration of AO images to macular landmarks, cone counting, and quantification of cone radius at specified distances from the foveal center. The new cone counting approach employs the circle Hough transform (cHT) and is compared to automated counting methods as well as arbitrated manual cone identification. We explore the impact of varying the circle detection parameter on the validity of cHT cone counting and discuss the potential role of using this algorithm to detect cones and rods separately. PMID:26713186
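
    For illustration, a circle-Hough-transform cone search of the kind described can be sketched in a few lines of Python with scikit-image; the radius range, edge-detection settings and peak limits below are illustrative assumptions, not the parameters used in the study.

        # Hedged sketch: candidate cone detection with the circle Hough transform.
        import numpy as np
        from skimage.feature import canny
        from skimage.transform import hough_circle, hough_circle_peaks

        def detect_cones(image, radii=np.arange(3, 8), sigma=1.0, max_cones=500):
            """Return (x, y, r) arrays of detected circle centres and radii."""
            edges = canny(image, sigma=sigma)          # binary edge map
            hspaces = hough_circle(edges, radii)       # one accumulator per radius
            _, cx, cy, r = hough_circle_peaks(
                hspaces, radii, total_num_peaks=max_cones,
                min_xdistance=3, min_ydistance=3)
            return cx, cy, r

        # Synthetic test image containing a single bright disc
        img = np.zeros((64, 64))
        yy, xx = np.mgrid[:64, :64]
        img[(yy - 32) ** 2 + (xx - 32) ** 2 < 5 ** 2] = 1.0
        cx, cy, r = detect_cones(img)
        print(len(cx), "candidate cones")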

  20. Development of end-selective functionalized carbon nanotubes for biomedical applications

    NASA Astrophysics Data System (ADS)

    Lee, Seung Ho; Kim, Wan Sun; Lee, Ha Rim; Park, Kyu Chang; Lee, Chang Hoon; Park, Hun Kuk; Kim, Kyung Sook

    2015-12-01

    Carbon nanotubes (CNTs) are a carbon allotrope with excellent physical and electrical properties, including high thermal conductivity, mechanical strength, and thermal stability. Applications of CNTs have therefore been considered in a variety of fields, including biosensors, molecular electronics, X-ray, and fuel cells. However, the application of CNTs to biomedicine is limited because the material is cytotoxic and inhomogeneous. In particular, the irregularity in the structural properties of paste- or bundle-type CNTs causes uncontrolled modification of biomolecules, making it difficult to use CNTs as biosensors for quantitative analyses. In this study, we developed a new method for end-selective functionalization of CNTs in order to enable quantitative analysis for biomedical applications. The process was as follows: (1) etching the tips of vertically aligned CNTs under optimum conditions, (2) oxidation of the exposed CNTs, and (3) end-selective linkage of the functionalized CNTs with biomolecules (dsDNA).

  1. Quantitative Microscopic Analysis of Plasma Membrane Receptor Dynamics in Living Plant Cells.

    PubMed

    Luo, Yu; Russinova, Eugenia

    2017-01-01

    Plasma membrane-localized receptors are essential for cellular communication and signal transduction. In Arabidopsis thaliana, BRASSINOSTEROID INSENSITIVE1 (BRI1) is one of the receptors that is activated by binding to its ligand, the brassinosteroid (BR) hormone, at the cell surface to regulate diverse plant developmental processes. The availability of BRI1 in the plasma membrane is related to its signaling output and is known to be controlled by the dynamic endomembrane trafficking. Advances in fluorescence labeling and confocal microscopy techniques enabled us to gain a better understanding of plasma membrane receptor dynamics in living cells. Here we describe different quantitative microscopy methods to monitor the relative steady-state levels of the BRI1 protein in the plasma membrane of root epidermal cells and its relative exocytosis and recycling rates. The methods can be applied also to analyze similar dynamics of other plasma membrane-localized receptors.

  2. [A quantitative approach to sports training-adapted social determinants concerning sport].

    PubMed

    Alvis-Gómez, Martina K; Neira-Tolosa, Nury A

    2013-01-01

    Identifying and quantitatively analysing the social determinants affecting disabled teenagers' inclusion in or exclusion from high-performance sports. This was a descriptive cross-sectional study involving 19 athletes aged 12 to 19 years with physical or sensory disabilities and 17 staff members from the District Institute of Recreation and Sport. Likert-type rating scales were used, based on four analysis categories: social structure, socio-economic, educational and living-condition determinants. Social inequity pervades the national Paralympic sports system: 74% of individuals become recognised as sportspeople only once they have obtained meritorious results in formal competition, without the Paralympic sports institutions having previously provided appropriate conditions to help them overcome structural and intermediate barriers. The social structure imposed on district-level Paralympic sport stigmatises individuals on the basis of their individual abilities and undermines their empowerment and freedom, owing to the discrimination disabled teenagers experience with regard to their competitive achievements.

  3. Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism

    PubMed Central

    Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A

    2011-01-01

    Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229

  4. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    NASA Astrophysics Data System (ADS)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  5. Distributed Cognition and Process Management Enabling Individualized Translational Research: The NIH Undiagnosed Diseases Program Experience

    PubMed Central

    Links, Amanda E.; Draper, David; Lee, Elizabeth; Guzman, Jessica; Valivullah, Zaheer; Maduro, Valerie; Lebedev, Vlad; Didenko, Maxim; Tomlin, Garrick; Brudno, Michael; Girdea, Marta; Dumitriu, Sergiu; Haendel, Melissa A.; Mungall, Christopher J.; Smedley, Damian; Hochheiser, Harry; Arnold, Andrew M.; Coessens, Bert; Verhoeven, Steven; Bone, William; Adams, David; Boerkoel, Cornelius F.; Gahl, William A.; Sincan, Murat

    2016-01-01

    The National Institutes of Health Undiagnosed Diseases Program (NIH UDP) applies translational research systematically to diagnose patients with undiagnosed diseases. The challenge is to implement an information system enabling scalable translational research. The authors hypothesized that similar complex problems are resolvable through process management and the distributed cognition of communities. The team, therefore, built the NIH UDP integrated collaboration system (UDPICS) to form virtual collaborative multidisciplinary research networks or communities. UDPICS supports these communities through integrated process management, ontology-based phenotyping, biospecimen management, cloud-based genomic analysis, and an electronic laboratory notebook. UDPICS provided a mechanism for efficient, transparent, and scalable translational research and thereby addressed many of the complex and diverse research and logistical problems of the NIH UDP. Full definition of the strengths and deficiencies of UDPICS will require formal qualitative and quantitative usability and process improvement measurement. PMID:27785453

  6. General practitioners learning qualitative research: A case study of postgraduate education.

    PubMed

    Hepworth, Julie; Kay, Margaret

    2015-10-01

    Qualitative research is increasingly being recognised as a vital aspect of primary healthcare research. Teaching and learning how to conduct qualitative research is especially important for general practitioners and other clinicians in the professional educational setting. This article examines a case study of postgraduate professional education in qualitative research for clinicians, for the purpose of enabling a robust discussion around teaching and learning in medicine and the health sciences. A series of three workshops was delivered for primary healthcare academics. The workshops were evaluated using a quantitative survey and qualitative free-text responses to enable descriptive analyses. Participants found qualitative philosophy and theory the most difficult areas to engage with, and learning qualitative coding and analysis was considered the easiest to learn. Key elements for successful teaching were identified, including the use of adult learning principles, the value of an experienced facilitator and an awareness of the impact of clinical subcultures on learning.

  7. Transient dynamics of a nonlinear magneto-optical rotation

    NASA Astrophysics Data System (ADS)

    Grewal, Raghwinder Singh; Pustelny, S.; Rybak, A.; Florkowski, M.

    2018-04-01

    We analyze nonlinear magneto-optical rotation (NMOR) in rubidium vapor subjected to a continuously scanned magnetic field. By varying the magnetic-field sweep rate, a transition from traditionally observed dispersivelike NMOR signals (low sweep rate) to oscillating signals (higher sweep rates) is demonstrated. The transient oscillatory behavior is studied versus light and magnetic-field parameters, revealing a strong dependence of the signals on magnetic sweep rate and light intensity. The experimental results are supported with density-matrix calculations, which enable quantitative analysis of the effect. Fitting of the signals simulated versus different parameters with a theoretically motivated curve reveals the presence of oscillatory and static components in the signals. The components depend differently on the system parameters, which suggests their distinct nature. The investigations provide insight into the dynamics of ground-state coherence generation and enable application of NMOR in detection of transient spin couplings.
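
    For illustration, the fitting step mentioned above can be sketched in Python; the specific functional form below (a static offset plus an exponentially damped oscillation) is an assumption chosen for illustration and is not necessarily the authors' theoretically motivated curve.

        # Hedged sketch: fit a transient NMOR-like signal with static and
        # oscillatory components. Model form and all parameters are illustrative.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, a_static, a_osc, tau, omega, phi):
            return a_static + a_osc * np.exp(-t / tau) * np.cos(omega * t + phi)

        t = np.linspace(0.0, 0.05, 500)                        # time axis (s), synthetic
        rng = np.random.default_rng(0)
        signal = model(t, 0.2, 1.0, 0.01, 2 * np.pi * 200, 0.3)
        signal = signal + 0.05 * rng.standard_normal(t.size)   # add noise

        # A reasonable frequency guess matters for convergence of the fit.
        p0 = [0.1, 0.8, 0.02, 2 * np.pi * 195, 0.0]
        popt, _ = curve_fit(model, t, signal, p0=p0)
        print("fitted decay time (s):", popt[2])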

  8. CPTAC Assay Portal: a repository of targeted proteomic assays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.

    2014-06-27

    To address these issues, the Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as a public repository of well-characterized, quantitative, MS-based, targeted proteomic assays. The purpose of the CPTAC Assay Portal is to facilitate widespread adoption of targeted MS assays by disseminating SOPs, reagents, and assay characterization data for highly characterized assays. A primary aim of the NCI-supported portal is to bring together clinicians or biologists and analytical chemists to answer hypothesis-driven questions using targeted, MS-based assays. Assay content is easily accessed through queries and filters, enabling investigators to find assays for proteins relevant to their areas of interest. Detailed characterization data are available for each assay, enabling researchers to evaluate assay performance prior to launching the assay in their own laboratory.

  9. Combined Population Dynamics and Entropy Modelling Supports Patient Stratification in Chronic Myeloid Leukemia

    PubMed Central

    Brehme, Marc; Koschmieder, Steffen; Montazeri, Maryam; Copland, Mhairi; Oehler, Vivian G.; Radich, Jerald P.; Brümmendorf, Tim H.; Schuppert, Andreas

    2016-01-01

    Modelling the parameters of multistep carcinogenesis is key for a better understanding of cancer progression, biomarker identification and the design of individualized therapies. Using chronic myeloid leukemia (CML) as a paradigm for hierarchical disease evolution, we show that combining population dynamic modelling with genomic analysis of CML patient biopsies enables patient stratification at unprecedented resolution. Linking CD34+ similarity as a disease progression marker to patient-derived gene expression entropy separated established CML progression stages and uncovered additional heterogeneity within disease stages. Importantly, our patient-data-informed model enables quantitative approximation of an individual patient’s disease history within chronic phase (CP) and significantly separates “early” from “late” CP. Our findings provide a novel rationale for personalized and genome-informed disease progression risk assessment that is independent of and complementary to conventional measures of CML disease burden and prognosis. PMID:27048866

  10. Direct visualization reveals kinetics of meiotic chromosome synapsis

    DOE PAGES

    Rog, Ofer; Dernburg, Abby  F.

    2015-03-17

    The synaptonemal complex (SC) is a conserved protein complex that stabilizes interactions along homologous chromosomes (homologs) during meiosis. The SC regulates genetic exchanges between homologs, thereby enabling reductional division and the production of haploid gametes. Here, we directly observe SC assembly (synapsis) by optimizing methods for long-term fluorescence recording in C. elegans. We report that synapsis initiates independently on each chromosome pair at or near pairing centers—specialized regions required for homolog associations. Once initiated, the SC extends rapidly and mostly irreversibly to chromosome ends. Quantitation of SC initiation frequencies and extension rates reveals that initiation is a rate-limiting step in homolog interactions. Eliminating the dynein-driven chromosome movements that accompany synapsis severely retards SC extension, revealing a new role for these conserved motions. This work provides the first opportunity to directly observe and quantify key aspects of meiotic chromosome interactions and will enable future in vivo analysis of germline processes.

  11. Comparability analysis of protein therapeutics by bottom-up LC-MS with stable isotope-tagged reference standards

    PubMed Central

    Manuilov, Anton V; Radziejewski, Czeslaw H

    2011-01-01

    Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with 13C6-arginine and 13C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution. PMID:21654206
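
    For illustration, the ratio-of-ratios logic at the core of the SITRS comparison can be reduced to a few lines of Python; the peptide identifiers and ion intensities below are invented for illustration only.

        # Hedged sketch: SITRS-style comparison. For each peptide, take the
        # light/heavy intensity ratio in each sample, then compare samples by
        # the ratio of ratios. All numbers are invented.
        light_a = {"PEPTIDEK": 1.00e6, "SAMPLEPEPR": 4.2e5}   # analyte, sample A
        heavy_a = {"PEPTIDEK": 9.80e5, "SAMPLEPEPR": 4.0e5}   # SITRS spike, sample A
        light_b = {"PEPTIDEK": 1.02e6, "SAMPLEPEPR": 2.1e5}   # analyte, sample B
        heavy_b = {"PEPTIDEK": 9.90e5, "SAMPLEPEPR": 4.1e5}   # SITRS spike, sample B

        for pep in light_a:
            ratio_a = light_a[pep] / heavy_a[pep]
            ratio_b = light_b[pep] / heavy_b[pep]
            fold_change = ratio_a / ratio_b        # deviation from 1 flags a difference
            print(f"{pep}: A/B = {fold_change:.2f}")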

  12. Versatile quantitative phase imaging system applied to high-speed, low noise and multimodal imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Federici, Antoine; Aknoun, Sherazade; Savatier, Julien; Wattellier, Benoit F.

    2017-02-01

    Quadriwave lateral shearing interferometry (QWLSI) is a well-established quantitative phase imaging (QPI) technique based on the analysis of the interference pattern of four diffraction orders produced by an optical grating set in front of an array detector [1]. As a QPI modality, it is a non-invasive imaging technique that measures the optical path difference (OPD) of semi-transparent samples. We present a system enabling QWLSI with high-performance sCMOS cameras [2] and apply it to high-speed, low-noise and multimodal imaging. This modified QWLSI system contains a versatile optomechanical device that images the optical grating near the detector plane; the device can be coupled with any kind of camera by varying its magnification. In this paper, we study the use of an Andor Zyla 5.5 sCMOS camera with our modified QWLSI system. We present high-speed live-cell imaging, at frame rates up to 200 Hz, to follow fast intracellular motions while measuring the quantitative phase information. The structural and density information extracted from the OPD signal is complementary to the specific and localized fluorescence signal [2]. In addition, QPI detects cells even when a fluorophore is not expressed, which is very useful for following protein expression over time. By combining the 10 µm pixel resolution of our modified QWLSI with the high sensitivity of the Zyla 5.5, which enables high-quality fluorescence imaging, we carried out multimodal imaging that reveals fine cellular structures, such as actin filaments, merged with the morphological information of the phase. References: [1] P. Bon, G. Maucort, B. Wattellier, and S. Monneret, "Quadriwave lateral shearing interferometry for quantitative phase microscopy of living cells," Opt. Express, vol. 17, pp. 13080-13094, 2009. [2] P. Bon, S. Lécart, E. Fort and S. Lévêque-Fort, "Fast label-free cytoskeletal network imaging in living mammalian cells," Biophysical Journal, 106(8), pp. 1588-1595, 2014.

  13. Comparability analysis of protein therapeutics by bottom-up LC-MS with stable isotope-tagged reference standards.

    PubMed

    Manuilov, Anton V; Radziejewski, Czeslaw H; Lee, David H

    2011-01-01

    Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with (13)C6-arginine and (13)C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible, and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines, while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution.

  14. Measurements in quantitative research: how to select and report on research instruments.

    PubMed

    Hagan, Teresa L

    2014-07-01

    Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.

  15. Understanding Knowledge-Sharing Breakdowns: A Meeting of the Quantitative and Qualitative Minds

    ERIC Educational Resources Information Center

    Soller, Amy

    2004-01-01

    The rapid advance of distance learning and networking technology has enabled universities and corporations to reach out and educate students across time and space barriers. Although this technology enables structured collaborative learning activities, online groups often do not enjoy the same benefits as face-to-face learners, and their…

  16. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  17. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  18. The PAXgene® Tissue System Preserves Phosphoproteins in Human Tissue Specimens and Enables Comprehensive Protein Biomarker Research

    PubMed Central

    Gündisch, Sibylle; Schott, Christina; Wolff, Claudia; Tran, Kai; Beese, Christian; Viertler, Christian; Zatloukal, Kurt; Becker, Karl-Friedrich

    2013-01-01

    Precise quantitation of protein biomarkers in clinical tissue specimens is a prerequisite for accurate and effective diagnosis, prognosis, and personalized medicine. Although progress is being made, protein analysis from formalin-fixed and paraffin-embedded tissues is still challenging. In previous reports, we showed that the novel formalin-free tissue preservation technology, the PAXgene Tissue System, allows the extraction of intact and immunoreactive proteins from PAXgene-fixed and paraffin-embedded (PFPE) tissues. In the current study, we focused on the analysis of phosphoproteins and the applicability of two-dimensional gel electrophoresis (2D-PAGE) and enzyme-linked immunosorbent assay (ELISA) to the analysis of a variety of malignant and non-malignant human tissues. Using western blot analysis, we found that phosphoproteins are quantitatively preserved in PFPE tissues, and signal intensities are comparable to those in paired frozen tissues. Furthermore, proteins extracted from PFPE samples are suitable for 2D-PAGE and can be quantified by ELISA specific for denatured proteins. In summary, the PAXgene Tissue System reliably preserves phosphoproteins in human tissue samples, even after prolonged fixation or stabilization times, and is compatible with methods for protein analysis such as 2D-PAGE and ELISA. We conclude that the PAXgene Tissue System has the potential to serve as a versatile tissue fixative for modern pathology. PMID:23555997

  19. EasyFRAP-web: a web-based tool for the analysis of fluorescence recovery after photobleaching data.

    PubMed

    Koulouras, Grigorios; Panagopoulos, Andreas; Rapsomaniki, Maria A; Giakoumakis, Nickolaos N; Taraviras, Stavros; Lygerou, Zoi

    2018-06-13

    Understanding protein dynamics is crucial in order to elucidate protein function and interactions. Advances in modern microscopy facilitate the exploration of the mobility of fluorescently tagged proteins within living cells. Fluorescence recovery after photobleaching (FRAP) is an increasingly popular functional live-cell imaging technique which enables the study of the dynamic properties of proteins at a single-cell level. As an increasing number of labs generate FRAP datasets, there is a need for fast, interactive and user-friendly applications that analyze the resulting data. Here we present easyFRAP-web, a web application that simplifies the qualitative and quantitative analysis of FRAP datasets. EasyFRAP-web permits quick analysis of FRAP datasets through an intuitive web interface with interconnected analysis steps (experimental data assessment, different types of normalization and estimation of curve-derived quantitative parameters). In addition, easyFRAP-web provides dynamic and interactive data visualization and data and figure export for further analysis after every step. We test easyFRAP-web by analyzing FRAP datasets capturing the mobility of the cell cycle regulator Cdt2 in the presence and absence of DNA damage in cultured cells. We show that easyFRAP-web yields results consistent with previous studies and highlights cell-to-cell heterogeneity in the estimated kinetic parameters. EasyFRAP-web is platform-independent and is freely accessible at: https://easyfrap.vmnet.upatras.gr/.
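
    For context, a widely used double-normalization of FRAP recovery curves divides the bleached-ROI signal by the whole-cell signal, each scaled to its pre-bleach mean, so that acquisition photobleaching is corrected; the Python sketch below illustrates that general scheme and is not the easyFRAP-web source code.

        # Hedged sketch of FRAP double normalization (illustrative, with synthetic data).
        import numpy as np

        def double_normalize(roi, whole_cell, n_prebleach):
            """roi, whole_cell: 1D intensity traces; n_prebleach: frames before the bleach."""
            roi = np.asarray(roi, dtype=float)
            whole = np.asarray(whole_cell, dtype=float)
            roi_pre = roi[:n_prebleach].mean()
            whole_pre = whole[:n_prebleach].mean()
            return (roi / roi_pre) * (whole_pre / whole)

        roi = np.array([100, 101, 99, 100, 100, 30, 45, 55, 62, 66], dtype=float)
        cell = np.array([500, 500, 499, 501, 500, 470, 468, 466, 465, 463], dtype=float)
        print(double_normalize(roi, cell, n_prebleach=5).round(3))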

  20. Creating normograms of dural sinuses in healthy persons using computer-assisted detection for analysis and comparison of cross-section dural sinuses in the brain.

    PubMed

    Anconina, Reut; Zur, Dinah; Kesler, Anat; Lublinsky, Svetlana; Toledano, Ronen; Novack, Victor; Benkobich, Elya; Novoa, Rosa; Novic, Evelyne Farkash; Shelef, Ilan

    2017-06-01

    Dural sinuses vary in size and shape in many pathological conditions with abnormal intracranial pressure. Size and shape normograms of dural brain sinuses are not available. The creation of such normograms may enable computer-assisted comparison to pathologic exams and facilitate diagnoses. The purpose of this study was to quantitatively evaluate normal magnetic resonance venography (MRV) studies in order to create normograms of dural sinuses using a computerized algorithm for vessel cross-sectional analysis. This was a retrospective analysis of MRV studies of 30 healthy persons. Data were analyzed using a specially developed Matlab algorithm for vessel cross-sectional analysis. The cross-sectional area and shape measurements were evaluated to create normograms. Mean cross-sectional size was 53.27±13.31 for the right transverse sinus (TS), 46.87±12.57 for the left TS (p=0.089) and 36.65±12.38 for the superior sagittal sinus. Normograms were created. The distribution of cross-sectional areas along the vessels showed distinct patterns and a parallel course for the median, 25th, 50th and 75th percentiles. In conclusion, using a novel computerized method for vessel cross-sectional analysis we were able to quantitatively characterize dural sinuses of healthy persons and create normograms. Copyright © 2017 Elsevier Ltd. All rights reserved.
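
    Once per-subject cross-sectional area profiles are resampled to a common length along the vessel, percentile normograms of the kind described reduce to a percentile computation across subjects; the Python sketch below uses synthetic values and assumed array shapes for illustration.

        # Hedged sketch: percentile normograms of vessel cross-sectional area.
        # 'areas' is assumed to be (subjects x positions along the vessel); synthetic data.
        import numpy as np

        rng = np.random.default_rng(1)
        n_subjects, n_positions = 30, 100
        areas = 50 + 10 * rng.standard_normal((n_subjects, n_positions))

        p25, p50, p75 = np.percentile(areas, [25, 50, 75], axis=0)  # normogram curves
        print("median area at mid-vessel:", round(p50[n_positions // 2], 1))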

  1. Fast analysis of wood preservers using laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Uhl, A.; Loebe, K.; Kreuchwig, L.

    2001-06-01

    Laser-induced breakdown spectroscopy (LIBS) is used for the investigation of wood preservers in timber and in furniture. Both laboratory experiments and practical applications in recycling facilities and on a building site demonstrate the new possibilities for fast detection of harmful agents in wood. A commercial system was developed for mobile laser-plasma analysis as well as for industrial use in sorting plants. The universal measuring principle, in combination with an Echelle optics, permits truly simultaneous multi-element analysis in the range of 200-780 nm with a resolution of a few picometers. It enables the user to detect main and trace elements in wood within a few seconds, nearly independent of the matrix, since different kinds of wood show an equal elemental composition. Sample preparation is not required. The quantitative analysis of inorganic wood preservers (containing, e.g., Cu, Cr, B, As, Pb, Hg) was performed accurately using carbon as the reference element. It can be shown that the detection limits for heavy metals in wood are in the ppm range. Additional information is given concerning the quantitative analysis: statistical data such as the standard deviation (S.D.) were determined, and calibration curves were used for each particular element. A comparison between ICP-AES and LIBS is given using depth-profile correction factors to account for the different penetration depths, and thus the different volumes of wood analyzed, by the two analytical methods.
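
    For illustration, the internal-standard calibration mentioned above (normalizing each analyte line to a carbon line before building a calibration curve) can be sketched as follows; all line intensities and concentrations in this Python example are invented.

        # Hedged sketch: internal-standard LIBS calibration against a carbon line.
        import numpy as np

        cu_line = np.array([150.0, 310.0, 640.0, 1250.0])   # Cu line intensities (a.u.), invented
        c_line = np.array([1000.0, 980.0, 1020.0, 1010.0])  # carbon reference line (a.u.), invented
        cu_ppm = np.array([50.0, 100.0, 200.0, 400.0])      # certified Cu content of standards

        ratio = cu_line / c_line                            # matrix-corrected signal
        slope, intercept = np.polyfit(ratio, cu_ppm, 1)     # linear calibration curve

        unknown_ratio = 420.0 / 995.0
        print("estimated Cu (ppm):", round(slope * unknown_ratio + intercept, 1))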

  2. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major- and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
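
    A much-simplified version of turning classified element maps into mineral modes (area percentages) is sketched below in Python; the two-element threshold rules are deliberately crude illustrative assumptions and are not the classification used by the Quack software.

        # Hedged sketch: mineral modes from quantitative element maps (synthetic data).
        import numpy as np

        rng = np.random.default_rng(2)
        mg = rng.uniform(0, 30, size=(512, 512))    # Mg wt% map (synthetic)
        ca = rng.uniform(0, 25, size=(512, 512))    # Ca wt% map (synthetic)

        phase = np.full(mg.shape, "other", dtype=object)
        phase[(mg > 20) & (ca < 2)] = "olivine"     # illustrative threshold rules only
        phase[(ca > 10) & (mg > 8)] = "clinopyroxene"
        phase[(ca > 10) & (mg <= 8)] = "plagioclase"

        labels, counts = np.unique(phase, return_counts=True)
        modes = {lab: 100.0 * c / phase.size for lab, c in zip(labels, counts)}
        print(modes)                                 # modal percentages by area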

  3. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
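
    As an illustration of the image-analysis step, haze (mycelium-covered) area can be separated from non-haze area with a simple intensity threshold and expressed as a per-pellet haze rate; the Python sketch below uses Otsu thresholding on synthetic data and is not the authors' exact pipeline.

        # Hedged sketch: per-pellet haze rate from a grayscale cross-section image.
        import numpy as np
        from skimage.filters import threshold_otsu

        def haze_rate(gray_pellet, pellet_mask):
            """Fraction of the pellet area classified as haze (brighter mycelium)."""
            t = threshold_otsu(gray_pellet[pellet_mask])
            haze = (gray_pellet > t) & pellet_mask
            return haze.sum() / pellet_mask.sum()

        # Synthetic pellet whose upper half is brighter ("hazed")
        img = np.vstack([np.full((50, 100), 0.8), np.full((50, 100), 0.3)])
        img = img + 0.02 * np.random.default_rng(3).standard_normal(img.shape)
        mask = np.ones_like(img, dtype=bool)
        print(f"haze rate: {haze_rate(img, mask):.2f}")   # about 0.5 for this example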

  4. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
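
    The Poisson step follows standard limiting-dilution logic: if a fraction f of replicate reactions is negative, the mean number of integrated copies per reaction is estimated as lambda = -ln(f). The Python sketch below applies this with a Clopper-Pearson interval on the negative fraction; it illustrates the general idea and is not the authors' exact statistical procedure.

        # Hedged sketch: Poisson quantification from the count of negative replicates.
        import math
        from scipy.stats import beta

        def poisson_estimate(n_replicates, n_negative, alpha=0.05):
            f = n_negative / n_replicates
            lam = -math.log(f)                        # mean copies per reaction
            # Clopper-Pearson bounds on the negative fraction, transformed via -ln
            lo = beta.ppf(alpha / 2, n_negative, n_replicates - n_negative + 1)
            hi = beta.ppf(1 - alpha / 2, n_negative + 1, n_replicates - n_negative)
            return lam, -math.log(hi), -math.log(lo)  # bounds swap under the decreasing -ln

        lam, lam_lo, lam_hi = poisson_estimate(n_replicates=42, n_negative=30)
        print(f"copies/reaction: {lam:.3f} (95% CI {lam_lo:.3f}-{lam_hi:.3f})")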

  5. Electron Paramagnetic Resonance Oximetry as a Quantitative Method to Measure Cellular Respiration: A Consideration of Oxygen Diffusion Interference

    PubMed Central

    Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L.; Ilangovan, Govindasamy

    2006-01-01

    Electron paramagnetic resonance (EPR) oximetry is widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO2-independent respiration at higher pO2 ranges, pO2-dependent respiration at low pO2 ranges, and a static equilibrium with no change in pO2 at very low pO2 values. The approach presented here enables one to comprehensively analyze all three zones together, taking into account the progression of O2 diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO2 data. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is also demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and for quantifying the related parameters. PMID:17012319

  6. Quantitative and temporal proteome analysis of butyrate-treated colorectal cancer cells.

    PubMed

    Tan, Hwee Tong; Tan, Sandra; Lin, Qingsong; Lim, Teck Kwang; Hew, Choy Leong; Chung, Maxey C M

    2008-06-01

    Colorectal cancer is one of the most common cancers in developed countries, and its incidence is negatively associated with high dietary fiber intake. Butyrate, a short-chain fatty acid fermentation by-product of fiber induces cell maturation with the promotion of growth arrest, differentiation, and/or apoptosis of cancer cells. The stimulation of cell maturation by butyrate in colonic cancer cells follows a temporal progression from the early phase of growth arrest to the activation of apoptotic cascades. Previously we performed two-dimensional DIGE to identify differentially expressed proteins induced by 24-h butyrate treatment of HCT-116 colorectal cancer cells. Herein we used quantitative proteomics approaches using iTRAQ (isobaric tags for relative and absolute quantitation), a stable isotope labeling methodology that enables multiplexing of four samples, for a temporal study of HCT-116 cells treated with butyrate. In addition, cleavable ICAT, which selectively tags cysteine-containing proteins, was also used, and the results complemented those obtained from the iTRAQ strategy. Selected protein targets were validated by real time PCR and Western blotting. A model is proposed to illustrate our findings from this temporal analysis of the butyrate-responsive proteome that uncovered several integrated cellular processes and pathways involved in growth arrest, apoptosis, and metastasis. These signature clusters of butyrate-regulated pathways are potential targets for novel chemopreventive and therapeutic drugs for treatment of colorectal cancer.

  7. Quantitative proteomics and terminomics to elucidate the role of ubiquitination and proteolysis in adaptive immunity.

    PubMed

    Klein, Theo; Viner, Rosa I; Overall, Christopher M

    2016-10-28

    Adaptive immunity is the specialized defence mechanism in vertebrates that evolved to eliminate pathogens. Specialized lymphocytes recognize specific protein epitopes through antigen receptors to mount potent immune responses, many of which are initiated by nuclear factor-kappa B activation and gene transcription. Most, if not all, pathways in adaptive immunity are further regulated by post-translational modification (PTM) of signalling proteins, e.g. phosphorylation, citrullination, ubiquitination and proteolytic processing. The importance of PTMs is reflected by genetic or acquired defects in these pathways that lead to a dysfunctional immune response. Here we discuss the state of the art in targeted proteomics and systems biology approaches to dissect the PTM landscape specifically regarding ubiquitination and proteolysis in B- and T-cell activation. Recent advances have occurred in methods for specific enrichment and targeted quantitation. Together with improved instrument sensitivity, these advances enable the accurate analysis of often rare PTM events that are opaque to conventional proteomics approaches, now rendering in-depth analysis and pathway dissection possible. We discuss published approaches, including as a case study the profiling of the N-terminome of lymphocytes of a rare patient with a genetic defect in the paracaspase protease MALT1, a key regulator protease in antigen-driven signalling, which was manifested by elevated linear ubiquitination. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  8. Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue

    PubMed Central

    Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath

    2009-01-01

    Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697

  9. Extragalactic Supergiants

    NASA Astrophysics Data System (ADS)

    Urbaneja, Miguel A.; Kudritzki, Rolf P.

    2017-11-01

    Blue supergiant stars of B and A spectral types are amongst the visually brightest non-transient astronomical objects. Their intrinsic brightness makes it possible to obtain high quality optical spectra of these objects in distant galaxies, enabling not only the study of these stars in different environments, but also their use as tools to probe their host galaxies. Quantitative analysis of their optical spectra provides tight constraints on their evolution across a wide range of metallicities, as well as on the present-day chemical composition, extinction laws and distances of their host galaxies. We review in this contribution recent results in this field.

  10. Modelisation and distribution of neutron flux in radium-beryllium source (226Ra-Be)

    NASA Astrophysics Data System (ADS)

    Didi, Abdessamad; Dadouch, Ahmed; Jai, Otman

    2017-09-01

    The Monte Carlo N-Particle code (MCNP-6) was used to analyze the thermal, epithermal and fast neutron fluxes of a 3-millicurie radium-beryllium source, with the aim of determining many materials qualitatively and quantitatively by the method of neutron activation analysis. The radium-beryllium neutron source is established for practical work and research in the nuclear field. The main objective of this work was to characterize the flux profile of radium-beryllium irradiation; this theoretical study permits discussion of the optimal irradiation design and performance, in order to expand facilities for research and education in nuclear physics.

  11. Separation techniques: Chromatography

    PubMed Central

    Coskun, Ozlem

    2016-01-01

    Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406

  12. Methods for threshold determination in multiplexed assays

    DOEpatents

    Tammero, Lance F. Bentley; Dzenitis, John M; Hindson, Benjamin J

    2014-06-24

    Methods for determination of threshold values of signatures comprised in an assay are described. Each signature enables detection of a target. The methods determine a probability density function of negative samples and a corresponding false positive rate curve. A false positive criterion is established and a threshold for that signature is determined as a point at which the false positive rate curve intersects the false positive criterion. A method for quantitative analysis and interpretation of assay results together with a method for determination of a desired limit of detection of a signature in an assay are also described.
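    The record does not say how the probability density function of the negative samples is estimated; a kernel density estimate is one common stand-in. A minimal sketch of the threshold rule described above (signal scale and false-positive criterion are illustrative):

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def signature_threshold(negative_signals, fp_criterion=0.01, grid_size=2048):
        """Choose a threshold for one signature so that the expected
        false-positive rate among negative samples equals fp_criterion."""
        negative_signals = np.asarray(negative_signals, dtype=float)
        kde = gaussian_kde(negative_signals)          # stand-in density estimate
        grid = np.linspace(negative_signals.min(), negative_signals.max() * 1.5, grid_size)
        pdf = kde(grid)

        # False-positive rate curve: probability a negative sample exceeds threshold t
        dx = grid[1] - grid[0]
        cdf = np.cumsum(pdf) * dx
        fpr = 1.0 - cdf / cdf[-1]

        # Threshold = point where the FPR curve crosses the false-positive criterion
        return grid[np.searchsorted(-fpr, -fp_criterion)]

    # Example with synthetic negative-control signals
    rng = np.random.default_rng(0)
    print(signature_threshold(rng.normal(100.0, 15.0, size=500)))
    ```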

  13. Proteomic analysis of single mammalian cells enabled by microfluidic nanodroplet sample preparation and ultrasensitive nanoLC-MS.

    PubMed

    Zhu, Ying; Clair, Geremy; Chrisler, William; Shen, Yufeng; Zhao, Rui; Shukla, Anil; Moore, Ronald; Misra, Ravi; Pryhuber, Gloria; Smith, Richard; Ansong, Charles; Kelly, Ryan T

    2018-05-24

    We report on the quantitative proteomic analysis of single mammalian cells. Fluorescence-activated cell sorting was employed to deposit cells into a newly developed nanodroplet sample processing chip, after which samples were analysed by ultrasensitive nanoLC-MS. An average of ~670 protein groups were confidently identified from single HeLa cells, which is a far greater level of proteome coverage for single cells than has been previously reported. We demonstrate that the single cell proteomics platform can be used to differentiate cell types from enzyme-dissociated human lung primary cells and identify specific protein markers for epithelial and mesenchymal cells. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Spinal sensory circuits in motion.

    PubMed

    Böhm, Urs Lucas; Wyart, Claire

    2016-12-01

    The role of sensory feedback in shaping locomotion has been long debated. Recent advances in genetics and behavior analysis revealed the importance of proprioceptive pathways in spinal circuits. Insights into the mechanisms underlying peripheral mechanosensation have made it possible to unravel the networks that feed back to spinal circuits in order to modulate locomotion. Sensory inputs to the vertebrate spinal cord were long thought to originate from the periphery. Recent studies challenge this view: GABAergic sensory neurons located within the spinal cord have been shown to relay mechanical and chemical information from the cerebrospinal fluid to motor circuits. Innovative approaches combining genetics, quantitative analysis of behavior and optogenetics now allow probing of the contribution of these sensory feedback pathways to locomotion and recovery following spinal cord injury. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. ANALYTIC MODELING OF STARSHADES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cash, Webster

    2011-09-01

    External occulters, otherwise known as starshades, have been proposed as a solution to one of the highest priority yet technically vexing problems facing astrophysics: the direct imaging and characterization of terrestrial planets around other stars. New apodization functions, developed over the past few years, now enable starshades of just a few tens of meters diameter to occult central stars so efficiently that the orbiting exoplanets can be revealed and other high-contrast imaging challenges addressed. In this paper, an analytic approach to the analysis of these apodization functions is presented. It is used to develop a tolerance analysis suitable for use in designing practical starshades. The results provide a mathematical basis for understanding starshades and a quantitative approach to setting tolerances.
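    The specific apodization functions analyzed in the paper are not reproduced in this record; the offset-hypergaussian family commonly used in the starshade literature has the form

    $$A(\rho) = \begin{cases} 1, & \rho < a,\\ \exp\!\left[-\left(\dfrac{\rho - a}{b}\right)^{n}\right], & \rho \ge a, \end{cases}$$

    where $\rho$ is the radial coordinate, $a$ the radius of the opaque central disk, $b$ a scale length and $n$ a steepness exponent; a tolerance analysis then asks how sensitively the residual starlight depends on perturbations of such a profile.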

  16. OpenMS - A platform for reproducible analysis of mass spectrometry data.

    PubMed

    Pfeuffer, Julianus; Sachsenberg, Timo; Alka, Oliver; Walzer, Mathias; Fillbrunn, Alexander; Nilse, Lars; Schilling, Oliver; Reinert, Knut; Kohlbacher, Oliver

    2017-11-10

    In recent years, several mass spectrometry-based omics technologies emerged to investigate qualitative and quantitative changes within thousands of biologically active components such as proteins, lipids and metabolites. The research enabled through these methods potentially contributes to the diagnosis and pathophysiology of human diseases as well as to the clarification of structures and interactions between biomolecules. At the same time, technological advances in the field of mass spectrometry, which lead to an ever increasing amount of data, demand high standards of efficiency, accuracy and reproducibility from analysis software. This article presents the current state and ongoing developments in OpenMS, a versatile open-source framework aimed at enabling reproducible analyses of high-throughput mass spectrometry data. It provides implementations of frequently occurring processing operations on MS data through a clean application programming interface in C++ and Python. A collection of 185 tools and ready-made workflows for typical MS-based experiments enables convenient analyses for non-developers and facilitates reproducible research without losing flexibility. OpenMS will continue to increase its ease of use for developers as well as users with improved continuous integration/deployment strategies, regular trainings with updated training materials and multiple sources of support. The active developer community ensures the incorporation of new features to support state-of-the-art research. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
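    As a flavour of the programming interface mentioned above, a minimal pyopenms sketch for loading a spectrum file and iterating over its peaks might look like the following; the file name is a placeholder and the exact API should be checked against the OpenMS documentation for your version:

    ```python
    import pyopenms as oms

    # Load an mzML file into an in-memory experiment (the path is a placeholder)
    exp = oms.MSExperiment()
    oms.MzMLFile().load("sample.mzML", exp)

    # Print MS level, retention time and peak count for each spectrum
    for spectrum in exp:
        mz, intensity = spectrum.get_peaks()
        print(spectrum.getMSLevel(), spectrum.getRT(), len(mz))
    ```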

  17. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative reasoning over time was critical for participants' development of MKT.

  18. Advancing Precision Nuclear Medicine and Molecular Imaging for Lymphoma.

    PubMed

    Wright, Chadwick L; Maly, Joseph J; Zhang, Jun; Knopp, Michael V

    2017-01-01

    PET with fluorine-18 fluorodeoxyglucose (18F-FDG PET) is a meaningful biomarker for the detection, targeted biopsy, and treatment of lymphoma. This article reviews the evolution of 18F-FDG PET as a putative biomarker for lymphoma and addresses the current capabilities, challenges, and opportunities to enable precision medicine practices for lymphoma. Precision nuclear medicine is driven by new imaging technologies and methodologies to more accurately detect malignant disease. Although quantitative assessment of response is currently limited, these technologies will enable more precise metabolic mapping with much higher-definition image detail, and thus may make 18F-FDG PET a robust and valid quantitative response assessment methodology. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Accurate, Sensitive, and Precise Multiplexed Proteomics Using the Complement Reporter Ion Cluster

    DOE PAGES

    Sonnett, Matthew; Yeung, Eyan; Wuhr, Martin

    2018-03-09

    Quantitative analysis of proteomes across multiple time points, organelles, and perturbations is essential for understanding both fundamental biology and disease states. The development of isobaric tags (e.g. TMT) has enabled the simultaneous measurement of peptide abundances across several different conditions. These multiplexed approaches are promising in principle because of advantages in throughput and measurement quality. However, in practice existing multiplexing approaches suffer from key limitations. In its simple implementation (TMT-MS2), measurements are distorted by chemical noise, leading to poor measurement accuracy. The current state-of-the-art (TMT-MS3) addresses this, but requires specialized quadrupole-iontrap-Orbitrap instrumentation. The complement reporter ion approach (TMTc) produces high-accuracy measurements and is compatible with many more instruments, like quadrupole-Orbitraps. However, the required deconvolution of the TMTc cluster leads to poor measurement precision. Here, we introduce TMTc+, which adds modeling of the MS2-isolation step into the deconvolution algorithm. The resulting measurements are comparable in precision to TMT-MS3/MS2. The improved duty cycle and lower filtering requirements make TMTc+ more sensitive than TMT-MS3 and comparable with TMT-MS2. At the same time, unlike TMT-MS2, TMTc+ is exquisitely able to distinguish signal from chemical noise, even outperforming TMT-MS3. Lastly, we compare TMTc+ to quantitative label-free proteomics of total HeLa lysate and find that TMTc+ quantifies 7.8k versus 3.9k proteins in a 5-plex sample. At the same time the median coefficient of variation improves from 13% to 4%. Furthermore, TMTc+ advances quantitative proteomics by enabling accurate, sensitive, and precise multiplexed experiments on more commonly used instruments.
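    The deconvolution referred to above is not spelled out in this record; conceptually, recovering channel abundances from an observed complement-reporter-ion cluster can be posed as a non-negative least-squares problem, as in the toy sketch below. The mixing matrix is invented for illustration and is not the TMTc+ model, which additionally models the MS2-isolation step:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Toy convolution matrix: column j is the expected complement-ion cluster
    # shape contributed by channel j (values are invented for illustration).
    A = np.array([
        [0.90, 0.05, 0.00, 0.00],
        [0.08, 0.88, 0.06, 0.00],
        [0.02, 0.06, 0.87, 0.07],
        [0.00, 0.01, 0.07, 0.93],
    ])

    true_abundances = np.array([10.0, 0.0, 5.0, 2.0])
    observed_cluster = A @ true_abundances        # what the instrument would record

    # Recover per-channel abundances under a non-negativity constraint
    recovered, residual = nnls(A, observed_cluster)
    print(recovered)
    ```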

  20. Accurate, Sensitive, and Precise Multiplexed Proteomics Using the Complement Reporter Ion Cluster

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonnett, Matthew; Yeung, Eyan; Wuhr, Martin

    Quantitative analysis of proteomes across multiple time points, organelles, and perturbations is essential for understanding both fundamental biology and disease states. The development of isobaric tags (e.g. TMT) has enabled the simultaneous measurement of peptide abundances across several different conditions. These multiplexed approaches are promising in principle because of advantages in throughput and measurement quality. However, in practice existing multiplexing approaches suffer from key limitations. In its simple implementation (TMT-MS2), measurements are distorted by chemical noise, leading to poor measurement accuracy. The current state-of-the-art (TMT-MS3) addresses this, but requires specialized quadrupole-iontrap-Orbitrap instrumentation. The complement reporter ion approach (TMTc) produces high-accuracy measurements and is compatible with many more instruments, like quadrupole-Orbitraps. However, the required deconvolution of the TMTc cluster leads to poor measurement precision. Here, we introduce TMTc+, which adds modeling of the MS2-isolation step into the deconvolution algorithm. The resulting measurements are comparable in precision to TMT-MS3/MS2. The improved duty cycle and lower filtering requirements make TMTc+ more sensitive than TMT-MS3 and comparable with TMT-MS2. At the same time, unlike TMT-MS2, TMTc+ is exquisitely able to distinguish signal from chemical noise, even outperforming TMT-MS3. Lastly, we compare TMTc+ to quantitative label-free proteomics of total HeLa lysate and find that TMTc+ quantifies 7.8k versus 3.9k proteins in a 5-plex sample. At the same time the median coefficient of variation improves from 13% to 4%. Furthermore, TMTc+ advances quantitative proteomics by enabling accurate, sensitive, and precise multiplexed experiments on more commonly used instruments.

  1. Application development environment for advanced digital workstations

    NASA Astrophysics Data System (ADS)

    Valentino, Daniel J.; Harreld, Michael R.; Liu, Brent J.; Brown, Matthew S.; Huang, Lu J.

    1998-06-01

    One remaining barrier to the clinical acceptance of electronic imaging and information systems is the difficulty in providing intuitive access to the information needed for a specific clinical task (such as reaching a diagnosis or tracking clinical progress). The purpose of this research was to create a development environment that enables the design and implementation of advanced digital imaging workstations. We used formal data and process modeling to identify the diagnostic and quantitative data that radiologists use and the tasks that they typically perform to make clinical decisions. We studied a diverse range of radiology applications, including diagnostic neuroradiology in an academic medical center, pediatric radiology in a children's hospital, screening mammography in a breast cancer center, and thoracic radiology consultation for an oncology clinic. We used object-oriented analysis to develop software toolkits that enable a programmer to rapidly implement applications that closely match clinical tasks. The toolkits support browsing patient information, integrating patient images and reports, manipulating images, and making quantitative measurements on images. Collectively, we refer to these toolkits as the UCLA Digital ViewBox toolkit (ViewBox/Tk). We used the ViewBox/Tk to rapidly prototype and develop a number of diverse medical imaging applications. Our task-based toolkit approach enabled rapid and iterative prototyping of workstations that matched clinical tasks. The toolkit functionality and performance provided a 'hands-on' feeling for manipulating images, and for accessing textual information and reports. The toolkits directly support a new concept for protocol-based reading of diagnostic studies. The design supports the implementation of network-based application services (e.g., prefetching, workflow management, and post-processing) that will facilitate the development of future clinical applications.

  2. Mixed methods research.

    PubMed

    Halcomb, Elizabeth; Hickman, Louise

    2015-04-08

    Mixed methods research involves the use of qualitative and quantitative data in a single research project. It represents an alternative methodological approach, combining qualitative and quantitative research approaches, which enables nurse researchers to explore complex phenomena in detail. This article provides a practical overview of mixed methods research and its application in nursing, to guide the novice researcher considering a mixed methods research project.

  3. Clustering and Network Analysis of Reverse Phase Protein Array Data.

    PubMed

    Byron, Adam

    2017-01-01

    Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
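    A minimal sketch of the hierarchical clustering step described above, using correlation distance and average linkage as one common (here assumed) choice for RPPA-style data; the random matrix stands in for normalised, log-transformed signal intensities:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Rows = samples, columns = (phospho)protein signals from the array
    rng = np.random.default_rng(1)
    data = rng.normal(size=(24, 40))

    distances = pdist(data, metric="correlation")       # 1 - Pearson correlation
    tree = linkage(distances, method="average")         # agglomerative clustering
    labels = fcluster(tree, t=4, criterion="maxclust")  # cut the tree into 4 clusters
    print(labels)
    ```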

  4. Multi-component determination and chemometric analysis of Paris polyphylla by ultra high performance liquid chromatography with photodiode array detection.

    PubMed

    Chen, Pei; Jin, Hong-Yu; Sun, Lei; Ma, Shuang-Cheng

    2016-09-01

    Multi-source analysis of traditional Chinese medicine is key to ensuring its safety and efficacy. Compared with traditional experimental differentiation, chemometric analysis is a simpler strategy for identifying traditional Chinese medicines. Multi-component analysis plays an increasingly vital role in the quality control of traditional Chinese medicines. A novel strategy, based on chemometric analysis and quantitative analysis of multiple components, was proposed to easily and effectively control the quality of traditional Chinese medicines such as Chonglou. Ultra high performance liquid chromatography made the analysis more convenient and efficient. Five species of Chonglou were distinguished by chemometric analysis and nine saponins, including Chonglou saponins I, II, V, VI, VII, D, and H, as well as dioscin and gracillin, were determined in 18 min. The method is feasible and credible, and enables improved quality control of traditional Chinese medicines and natural products. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
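    The abstract does not state which chemometric method was applied; principal component analysis of the quantified peak areas is one routine choice, sketched here with synthetic data in place of the real saponin measurements:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Rows = Chonglou samples, columns = peak areas of the nine quantified saponins;
    # the block structure of the random data only mimics five species.
    rng = np.random.default_rng(2)
    peak_areas = np.vstack([rng.normal(loc=i, scale=0.3, size=(6, 9)) for i in range(5)])

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(peak_areas))
    print(scores[:5])   # plotting PC1 vs PC2 shows whether the five species separate
    ```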

  5. Application of adenosine triphosphate affinity probe and scheduled multiple-reaction monitoring analysis for profiling global kinome in human cells in response to arsenite treatment.

    PubMed

    Guo, Lei; Xiao, Yongsheng; Wang, Yinsheng

    2014-11-04

    Phosphorylation of cellular components catalyzed by kinases plays important roles in cell signaling and proliferation. Quantitative assessment of perturbation in global kinome may provide crucial knowledge for elucidating the mechanisms underlying the cytotoxic effects of environmental toxicants. Here, we utilized an adenosine triphosphate (ATP) affinity probe coupled with stable isotope labeling by amino acids in cell culture (SILAC) to assess quantitatively the arsenite-induced alteration of global kinome in human cells. We constructed a SILAC-compatible kinome library for scheduled multiple-reaction monitoring (MRM) analysis and adopted on-the-fly recalibration of retention time shift, which provided better throughput of the analytical method and enabled the simultaneous quantification of the expression of ∼300 kinases in two LC-MRM runs. With this improved analytical method, we conducted an in-depth quantitative analysis of the perturbation of kinome of GM00637 human skin fibroblast cells induced by arsenite exposure. Several kinases involved in cell cycle progression, including cyclin-dependent kinases (CDK1 and CDK4) and Aurora kinases A, B, and C, were found to be hyperactivated, and the altered expression of CDK1 was further validated by Western analysis. In addition, treatment with a CDK inhibitor, flavopiridol, partially restored the arsenite-induced growth inhibition of human skin fibroblast cells. Thus, sodium arsenite may confer its cytotoxic effect partly through the aberrant activation of CDKs and the resultant perturbation of cell cycle progression. Together, we developed a high-throughput, SILAC-compatible, and MRM-based kinome profiling method and demonstrated that the method is powerful in deciphering the molecular modes of action of a widespread environmental toxicant. The method should be generally applicable for uncovering the cellular pathways triggered by other extracellular stimuli.

  6. Application of Adenosine Triphosphate Affinity Probe and Scheduled Multiple-Reaction Monitoring Analysis for Profiling Global Kinome in Human Cells in Response to Arsenite Treatment

    PubMed Central

    2015-01-01

    Phosphorylation of cellular components catalyzed by kinases plays important roles in cell signaling and proliferation. Quantitative assessment of perturbation in global kinome may provide crucial knowledge for elucidating the mechanisms underlying the cytotoxic effects of environmental toxicants. Here, we utilized an adenosine triphosphate (ATP) affinity probe coupled with stable isotope labeling by amino acids in cell culture (SILAC) to assess quantitatively the arsenite-induced alteration of global kinome in human cells. We constructed a SILAC-compatible kinome library for scheduled multiple-reaction monitoring (MRM) analysis and adopted on-the-fly recalibration of retention time shift, which provided better throughput of the analytical method and enabled the simultaneous quantification of the expression of ∼300 kinases in two LC-MRM runs. With this improved analytical method, we conducted an in-depth quantitative analysis of the perturbation of kinome of GM00637 human skin fibroblast cells induced by arsenite exposure. Several kinases involved in cell cycle progression, including cyclin-dependent kinases (CDK1 and CDK4) and Aurora kinases A, B, and C, were found to be hyperactivated, and the altered expression of CDK1 was further validated by Western analysis. In addition, treatment with a CDK inhibitor, flavopiridol, partially restored the arsenite-induced growth inhibition of human skin fibroblast cells. Thus, sodium arsenite may confer its cytotoxic effect partly through the aberrant activation of CDKs and the resultant perturbation of cell cycle progression. Together, we developed a high-throughput, SILAC-compatible, and MRM-based kinome profiling method and demonstrated that the method is powerful in deciphering the molecular modes of action of a widespread environmental toxicant. The method should be generally applicable for uncovering the cellular pathways triggered by other extracellular stimuli. PMID:25301106

  7. Multi-Resolution Analysis of LiDAR data for Characterizing a Stabilized Aeolian Landscape in South Texas

    NASA Astrophysics Data System (ADS)

    Barrineau, C. P.; Dobreva, I. D.; Bishop, M. P.; Houser, C.

    2014-12-01

    Aeolian systems are ideal natural laboratories for examining self-organization in patterned landscapes, as certain wind regimes generate certain morphologies. Topographic information and scale-dependent analysis offer the opportunity to study such systems and characterize process-form relationships. A statistically based methodology for differentiating aeolian features would enable the quantitative association of certain surface characteristics with certain morphodynamic regimes. We conducted a multi-resolution analysis of LiDAR elevation data to assess scale-dependent morphometric variations in an aeolian landscape in South Texas. For each pixel, mean elevation values are calculated along concentric circles moving outward at 100-meter intervals (i.e. 500 m, 600 m, 700 m from the pixel). The average elevation values, plotted as curves against distance from the pixel of interest, are used to differentiate multi-scalar variations in elevation across the landscape. It is hypothesized that these curves may be used to quantitatively differentiate certain morphometries from others, much as a spectral signature may be used to classify paved surfaces versus natural vegetation, for example. After generating multi-resolution curves for all the pixels in a selected area of interest (AOI), a Principal Components Analysis is used to highlight commonalities and singularities between the curves generated from pixels across the AOI. Our findings suggest that the resulting components could be used for identification of discrete aeolian features like open sands, trailing ridges and active dune crests, and, in particular, zones of deflation. This new approach to landscape characterization not only works to mitigate bias introduced when researchers must select training pixels for morphometric investigations, but can also reveal patterning in aeolian landscapes that would not be as obvious without quantitative characterization.
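    A minimal sketch of the per-pixel multi-resolution curve described above (the grid spacing, ring radii and toy elevation grid are assumptions); the curves from many pixels would then be stacked and passed to a principal components analysis:

    ```python
    import numpy as np

    def multi_resolution_curve(dem, row, col, cell_size=1.0, radii=range(100, 800, 100)):
        """Mean elevation within concentric 100 m rings around one pixel of a
        LiDAR-derived elevation grid."""
        rows, cols = np.indices(dem.shape)
        dist = np.hypot(rows - row, cols - col) * cell_size
        curve = []
        for r_outer in radii:
            ring = (dist >= r_outer - 100) & (dist < r_outer)
            curve.append(dem[ring].mean())
        return np.array(curve)

    # Toy 1 m-resolution elevation grid
    dem = np.random.default_rng(3).normal(10.0, 2.0, size=(1600, 1600))
    print(multi_resolution_curve(dem, 800, 800))
    ```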

  8. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
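    The core allelic-ratio test mentioned above reduces to a binomial test of the read counts at a heterozygous site; QuASAR itself additionally models genotype uncertainty, base-call error and over-dispersion, which this two-line sketch deliberately omits (read counts are illustrative):

    ```python
    from scipy.stats import binomtest

    # Reads supporting the reference and alternate allele at one heterozygous site
    ref_reads, alt_reads = 78, 42

    # Null hypothesis: a 1:1 allelic ratio (no allele-specific expression)
    result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
    print(result.pvalue)
    ```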

  9. New microfluidic-based sampling procedure for overcoming the hematocrit problem associated with dried blood spot analysis.

    PubMed

    Leuthold, Luc Alexis; Heudi, Olivier; Déglon, Julien; Raccuglia, Marc; Augsburger, Marc; Picard, Franck; Kretz, Olivier; Thomas, Aurélien

    2015-02-17

    Hematocrit (Hct) is one of the most critical issues associated with the bioanalytical methods used for dried blood spot (DBS) sample analysis. Because Hct determines the viscosity of blood, it may affect the spreading of blood onto the filter paper. Hence, accurate quantitative data can only be obtained if the extracted area of the filter paper contains a fixed blood volume. We describe for the first time a microfluidic-based sampling procedure to enable accurate blood volume collection on commercially available DBS cards. The system allows the collection of a controlled volume of blood (e.g., 5 or 10 μL) within several seconds. Reproducibility of the sampling volume was examined in vivo on capillary blood by quantifying caffeine and paraxanthine on 5 different extracted DBS spots at two different time points, and in vitro with a test compound, Mavoglurant, on 10 different spots at two Hct levels. Entire spots were extracted. In addition, the accuracy and precision (n = 3) data for the Mavoglurant quantitation in blood with Hct levels between 26% and 62% were evaluated. The interspot precision data were below 9.0%, which was equivalent to that of a volume spotted manually with a pipet. No Hct effect was observed in the quantitative results obtained for Hct levels from 26% to 62%. These data indicate that our microfluidic-based sampling procedure is accurate and precise and that the analysis of Mavoglurant is not affected by the Hct values. This provides a simple procedure for DBS sampling with a fixed volume of capillary blood, which could eliminate the recurrent Hct issue linked to DBS sample analysis.

  10. Quantitative secondary electron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    Quantitative Secondary Electron Detection (QSED), using an array of solid-state-device (SSD) based electron counters, enables critical-dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. Methods and devices effect a quantitative detection of secondary electrons with an array comprising a number of solid state detectors. The array senses secondary electrons with a plurality of solid state detectors, counting the number of secondary electrons with a time-to-digital converter circuit operated in counter mode.

  11. High-resolution quantitative determination of dielectric function by using scattering scanning near-field optical microscopy

    PubMed Central

    Tranca, D. E.; Stanciu, S. G.; Hristu, R.; Stoichita, C.; Tofail, S. A. M.; Stanciu, G. A.

    2015-01-01

    A new method for high-resolution quantitative measurement of the dielectric function by using scattering scanning near-field optical microscopy (s-SNOM) is presented. The method is based on a calibration procedure that uses the s-SNOM oscillating dipole model of the probe-sample interaction and quantitative s-SNOM measurements. The nanoscale capabilities of the method have the potential to enable novel applications in various fields such as nano-electronics, nano-photonics, biology or medicine. PMID:26138665
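    The calibration builds on an oscillating point-dipole model of the probe-sample interaction; in its textbook form (not the paper's specific calibration constants), a sample of permittivity $\varepsilon_s$ responds through

    $$\beta = \frac{\varepsilon_s - 1}{\varepsilon_s + 1}, \qquad \alpha_{\mathrm{eff}} = \frac{\alpha\,(1 + \beta)}{1 - \dfrac{\alpha \beta}{16\pi (a + z)^3}},$$

    where $a$ is the tip radius, $z$ the tip-sample distance and $\alpha$ the polarizability of the tip treated as a small sphere. The detected near-field signal is proportional to $\alpha_{\mathrm{eff}}$ demodulated at harmonics of the tip oscillation; measuring a reference material of known permittivity and inverting this relation yields a quantitative estimate of $\varepsilon_s$.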

  12. Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.

    PubMed

    Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz

    2017-03-01

    Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  13. Application of Deep Learning in Automated Analysis of Molecular Images in Cancer: A Survey

    PubMed Central

    Xue, Yong; Chen, Shihui; Liu, Yong

    2017-01-01

    Molecular imaging enables the visualization and quantitative analysis of the alterations of biological procedures at molecular and/or cellular level, which is of great significance for early detection of cancer. In recent years, deep learning has been widely used in medical imaging analysis, as it overcomes the limitations of visual assessment and traditional machine learning techniques by extracting hierarchical features with powerful representation capability. Research on cancer molecular images using deep learning techniques is also increasing dynamically. Hence, in this paper, we review the applications of deep learning in molecular imaging in terms of tumor lesion segmentation, tumor classification, and survival prediction. We also outline some future directions in which researchers may develop more powerful deep learning models for better performance in the applications in cancer molecular imaging. PMID:29114182

  14. Phosphorylation-specific status of RNAi triggers in pharmacokinetic and biodistribution analyses

    PubMed Central

    Trubetskoy, Vladimir S.; Griffin, Jacob B.; Nicholas, Anthony L.; Nord, Eric M.; Xu, Zhao; Peterson, Ryan M.; Wooddell, Christine I.; Rozema, David B.; Wakefield, Darren H.; Lewis, David L.

    2017-01-01

    The RNA interference (RNAi)-based therapeutic ARC-520 for chronic hepatitis B virus (HBV) infection consists of a melittin-derived peptide conjugated to N-acetylgalactosamine for hepatocyte targeting and endosomal escape, and cholesterol-conjugated RNAi triggers, which together result in HBV gene silencing. To characterize the kinetics of RNAi trigger delivery and 5′-phosphorylation of guide strands correlating with gene knockdown, we employed a peptide-nucleic acid (PNA) hybridization assay. A fluorescent sense strand PNA probe binding to RNAi duplex guide strands was coupled with anion exchange high performance liquid chromatography to quantitate guide strands and metabolites. Compared to PCR- or ELISA-based methods, this assay enables separate quantitation of non-phosphorylated full-length guide strands from 5′-phosphorylated forms that may associate with RNA-induced silencing complexes (RISC). Biodistribution studies in mice indicated that ARC-520 guide strands predominantly accumulated in liver. 5′-phosphorylation of guide strands was observed within 5 min after ARC-520 injection, and was detected for at least 4 weeks corresponding to the duration of HBV mRNA silencing. Guide strands detected in RISC by AGO2 immuno-isolation represented 16% of total 5′-phosphorylated guide strands in liver, correlating with a 2.7 log10 reduction of HBsAg. The PNA method enables pharmacokinetic analysis of RNAi triggers, elucidates potential metabolic processing events and defines pharmacokinetic-pharmacodynamic relationships. PMID:28180327

  15. Quantitative Characterisation of Sky Conditions on Paranal with the Microwave Radiometer LHATPRO - Five Years and Learning

    NASA Astrophysics Data System (ADS)

    Kerber, Florian; Querel, R.; Neureiter, B.; Hanuschik, R.

    2017-09-01

    "A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer, optimized for measuring small amounts of atmospheric precipitable water vapour (PWV), has now been in use for more than five years to monitor sky conditions over ESO's Paranal observatory (median PWV 2.5 mm). We'll summarise the performance characteristics of the unit and the current applications of its data in scheduling observations in Service Mode to take advantage of favourable conditions for infrared observations. We'll elaborate on our improved understanding of PWV over Paranal, including an analysis of PWV homogeneity addressing an important calibration issue. In addition we'll describe how the capabilities of the LHATPRO can be used in the future to further strengthen science operations and calibration by also offering line-of-sight support for individual VLT observations. Using its IR data we developed a method for an automated classification of photometric observing conditions in a quantitative way, supporting high precision photometry. Its highly precise PWV measurements enable new low PWV science during episodes of extremely low water vapour that result in a strongly increased transmission also outside the standard atmospheric windows. A goal for the future is to combine various diagnostics measurements (altitude resolved profiles) by LHATPRO and other instruments and sophisticated atmospheric modeling to better characterize relevant properties of the atmosphere and to thus enable more precise, local short-term forecasting for optimised science operations."

  16. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  17. An intelligent data model for the storage of structured grids

    NASA Astrophysics Data System (ADS)

    Clyne, John; Norton, Alan

    2013-04-01

    With support from the U.S. National Science Foundation we have developed, and currently maintain, VAPOR: a geosciences-focused, open source visual data analysis package. VAPOR enables highly interactive exploration, as well as qualitative and quantitative analysis, of high-resolution simulation outputs using only a commodity desktop computer. The enabling technology behind VAPOR's ability to interact with a data set whose size would overwhelm all but the largest analysis computing resources is a progressive data access file format called the VAPOR Data Collection (VDC). The VDC is based on the discrete wavelet transform and its information compaction properties. Prior to analysis, raw data undergo a wavelet transform, concentrating the information content into a fraction of the coefficients. The coefficients are then sorted by their information content (magnitude) into a small number of bins. Data are reconstructed by applying an inverse wavelet transform. If all of the coefficient bins are used during reconstruction the process is lossless (up to floating point round-off). If only a subset of the bins is used, an approximation of the original data is produced. A crucial point here is that the principal benefit of reconstruction from a subset of wavelet coefficients is a reduction in I/O. Further, if smaller coefficients are simply discarded, or perhaps stored on more capacious tertiary storage, secondary storage requirements (e.g. disk) can be reduced as well. In practice, these reductions in I/O or storage can be on the order of tens or even hundreds. This talk will briefly describe the VAPOR Data Collection, and will present real world success stories from the geosciences that illustrate how progressive data access enables highly interactive exploration of Big Data.
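    A toy illustration of the idea behind the VDC: wavelet-transform a field, keep only the largest-magnitude coefficients, and reconstruct an approximation from that subset. The wavelet, decomposition level and storage layout of the real format are not reproduced here:

    ```python
    import numpy as np
    import pywt

    def progressive_reconstruct(field, keep_fraction=0.05, wavelet="db2", level=3):
        """Reconstruct an approximation of a 2D field from the top keep_fraction
        of its wavelet coefficients by magnitude."""
        coeffs = pywt.wavedec2(field, wavelet, level=level)
        flat, slices = pywt.coeffs_to_array(coeffs)

        # Zero out all but the largest-magnitude coefficients
        threshold = np.quantile(np.abs(flat), 1.0 - keep_fraction)
        flat_kept = np.where(np.abs(flat) >= threshold, flat, 0.0)

        kept_coeffs = pywt.array_to_coeffs(flat_kept, slices, output_format="wavedec2")
        return pywt.waverec2(kept_coeffs, wavelet)

    field = np.random.default_rng(4).normal(size=(256, 256))
    approx = progressive_reconstruct(field)
    print(np.abs(field - approx[:256, :256]).mean())   # mean reconstruction error
    ```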

  18. Quantitative analysis of charge trapping and classification of sub-gap states in MoS2 TFT by pulse I-V method

    NASA Astrophysics Data System (ADS)

    Park, Junghak; Hur, Ji-Hyun; Jeon, Sanghun

    2018-04-01

    The threshold voltage instabilities and huge hysteresis of MoS2 thin film transistors (TFTs) have raised concerns about their practical applicability in next-generation switching devices. These behaviors are associated with charge trapping, which stems from tunneling to the adjacent trap site, interfacial redox reaction and interface and/or bulk trap states. In this report, we present quantitative analysis on the electron charge trapping mechanism of MoS2 TFT by fast pulse I-V method and the space charge limited current (SCLC) measurement. By adopting the fast pulse I-V method, we were able to obtain effective mobility. In addition, the origin of the trap states was identified by disassembling the sub-gap states into interface trap and bulk trap states by simple extraction analysis. These measurement methods and analyses enable not only quantitative extraction of various traps but also an understanding of the charge transport mechanism in MoS2 TFTs. The fast I-V data and SCLC data obtained under various measurement temperatures and ambient show that electron transport to neighboring trap sites by tunneling is the main charge trapping mechanism in thin-MoS2 TFTs. This implies that interfacial traps account for most of the total sub-gap states while the bulk trap contribution is negligible, at approximately 0.40% and 0.26% in air and vacuum ambient, respectively. Thus, control of the interface trap states is crucial to further improve the performance of devices with thin channels.

  19. Quantitative analysis of charge trapping and classification of sub-gap states in MoS2 TFT by pulse I-V method.

    PubMed

    Park, Junghak; Hur, Ji-Hyun; Jeon, Sanghun

    2018-04-27

    The threshold voltage instabilities and huge hysteresis of MoS2 thin film transistors (TFTs) have raised concerns about their practical applicability in next-generation switching devices. These behaviors are associated with charge trapping, which stems from tunneling to the adjacent trap site, interfacial redox reaction and interface and/or bulk trap states. In this report, we present quantitative analysis on the electron charge trapping mechanism of MoS2 TFT by fast pulse I-V method and the space charge limited current (SCLC) measurement. By adopting the fast pulse I-V method, we were able to obtain effective mobility. In addition, the origin of the trap states was identified by disassembling the sub-gap states into interface trap and bulk trap states by simple extraction analysis. These measurement methods and analyses enable not only quantitative extraction of various traps but also an understanding of the charge transport mechanism in MoS2 TFTs. The fast I-V data and SCLC data obtained under various measurement temperatures and ambient show that electron transport to neighboring trap sites by tunneling is the main charge trapping mechanism in thin-MoS2 TFTs. This implies that interfacial traps account for most of the total sub-gap states while the bulk trap contribution is negligible, at approximately 0.40% and 0.26% in air and vacuum ambient, respectively. Thus, control of the interface trap states is crucial to further improve the performance of devices with thin channels.

  20. Functionalization of SBA-15 mesoporous silica by Cu-phosphonate units: Probing of synthesis route

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskowski, Lukasz, E-mail: lukasz.laskowski@kik.pcz.pl; Czestochowa University of Technology, Institute of Physics, Al. Armii Krajowej 19, 42-201 Czestochowa; Laskowska, Magdalena, E-mail: magdalena.laskowska@onet.pl

    2014-12-15

    Mesoporous silica SBA-15 containing propyl-copper phosphonate units was investigated. The structure of mesoporous samples was tested by N₂ isothermal sorption (BET and BJH analysis), TEM microscopy and X-Ray scattering. Quantitative EDX analysis gave information about the proportions of component atoms in the sample. Quantitative elemental analysis was carried out to support EDX. To examine bonding between copper atoms and phosphonic units, Raman spectroscopy was carried out. To support the Raman scattering, theoretical calculations were made based on density functional theory, with the B3LYP method. By comparing the calculated vibrational spectra of the molecule with the experimental results, the distribution of the active units inside the silica matrix was determined. - Graphical abstract: The present study is devoted to mesoporous silica SBA-15 containing propyl-copper phosphonate units. The species were investigated to confirm the correctness of the synthesis procedure by the micro-Raman technique combined with DFT numerical simulations. Complementary research was carried out to test the structure of the mesoporous samples. - Highlights: • SBA-15 silica functionalized with propyl-copper phosphonate units was synthesized. • Synthesis efficiency probed by Raman study supported with DFT simulations. • Homogeneous distribution of active units was proved. • Synthesis route enables precise control of distance between copper ions.

  1. PeptideDepot: Flexible Relational Database for Visual Analysis of Quantitative Proteomic Data and Integration of Existing Protein Information

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2010-01-01

    Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895

  2. Barriers to and enablers of diabetic retinopathy screening attendance: a systematic review of published and grey literature.

    PubMed

    Graham-Rowe, E; Lorencatto, F; Lawrenson, J G; Burr, J M; Grimshaw, J M; Ivers, N M; Presseau, J; Vale, L; Peto, T; Bunce, C; Francis, J J

    2018-05-23

    To identify and synthesize studies reporting modifiable barriers/enablers associated with retinopathy screening attendance in people with Type 1 or Type 2 diabetes, and to identify those most likely to influence attendance. We searched MEDLINE, EMBASE, PsycINFO, Cochrane Library and the 'grey literature' for quantitative and qualitative studies to February 2017. Data (i.e. participant quotations, interpretive summaries, survey results) reporting barriers/enablers were extracted and deductively coded into domains from the Theoretical Domains Framework, with domains representing categories of theoretical barriers/enablers proposed to mediate behaviour change. Inductive thematic analysis was conducted within domains to describe the role each domain plays in facilitating or hindering screening attendance. Domains that were more frequently coded and for which more themes were generated were judged more likely to influence attendance. Sixty-nine primary studies were included. We identified six theoretical domains ['environmental context and resources' (75% of included studies), 'social influences' (51%), 'knowledge' (50%), 'memory, attention, decision processes' (50%), 'beliefs about consequences' (38%) and 'emotions' (33%)] as the key mediators of diabetic retinopathy screening attendance. Examples of barriers populating these domains included inaccurate diabetic registers and confusion between routine eye care and retinopathy screening. Recommendations by healthcare professionals and community-level media coverage acted as enablers. Across a variety of contexts, we found common barriers to and enablers of retinopathy screening that could be targeted in interventions aiming to increase screening attendance. This article is protected by copyright. All rights reserved.

  3. Non-destructive analysis of sucrose, caffeine and trigonelline on single green coffee beans by hyperspectral imaging.

    PubMed

    Caporaso, Nicola; Whitworth, Martin B; Grebby, Stephen; Fisk, Ian D

    2018-04-01

    Hyperspectral imaging (HSI) is a novel technology for the food sector that enables rapid non-contact analysis of food materials. HSI was applied for the first time to whole green coffee beans, at a single seed level, for quantitative prediction of sucrose, caffeine and trigonelline content. In addition, the intra-bean distribution of coffee constituents was analysed in Arabica and Robusta coffees on a large sample set from 12 countries, using a total of 260 samples. Individual green coffee beans were scanned by reflectance HSI (980-2500 nm) and then the concentration of sucrose, caffeine and trigonelline analysed with a reference method (HPLC-MS). Quantitative prediction models were subsequently built using Partial Least Squares (PLS) regression. Large variations in sucrose, caffeine and trigonelline were found between different species and origins, but also within beans from the same batch. It was shown that estimation of sucrose content is possible for screening purposes (R² = 0.65; prediction error of ~0.7% w/w coffee, with an observed range of ~6.5%), while the performance of the PLS model was better for caffeine and trigonelline prediction (R² = 0.85 and R² = 0.82, respectively; prediction errors of 0.2 and 0.1%, on ranges of 2.3 and 1.1% w/w coffee, respectively). The prediction error is acceptable mainly for laboratory applications, with potential application to breeding programmes and screening purposes for the food industry. The spatial distribution of coffee constituents was also successfully visualised for single beans, and this enabled mapping of the analytes across the bean structure at the single pixel level. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
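    A minimal sketch of the PLS calibration step described above, with synthetic arrays standing in for the bean spectra (980-2500 nm reflectance) and the HPLC-MS reference values; the component count and cross-validation scheme are assumptions:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Rows = individual green coffee beans, columns = reflectance channels
    rng = np.random.default_rng(5)
    spectra = rng.normal(size=(260, 200))
    sucrose = spectra[:, :5].sum(axis=1) + rng.normal(scale=0.2, size=260)  # toy reference values

    pls = PLSRegression(n_components=10)
    predicted = cross_val_predict(pls, spectra, sucrose, cv=10)
    rmse = np.sqrt(np.mean((predicted.ravel() - sucrose) ** 2))
    print(rmse)   # analogous to the prediction errors quoted in the abstract
    ```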

  4. Investigating rate-limiting barriers to nanoscale nonviral gene transfer with nanobiophotonics

    NASA Astrophysics Data System (ADS)

    Chen, Hunter H.

    Nucleic acids are a novel class of therapeutics poised to address many unmet clinical needs. Safe and efficient delivery remains a significant challenge that has delayed the realization of the full therapeutic potential of nucleic acids. Nanoscale nonviral vectors offer an attractive alternative to viral vectors as natural and synthetic polymers or polypeptides may be rationally designed to meet the unique demands of individual applications. A mechanistic understanding of cellular barriers is necessary to develop guidelines for designing custom gene carriers which are expected to greatly impact this delivery challenge. The work herein focused on the relationships among nanocomplex stability, intracellular trafficking and unpacking kinetics, and DNA degradation. Ultrasensitive nanosensors based on QD-FRET were developed to characterize the biophysical properties of nanocomplexes and study these rate-limiting steps. Quantitative image analysis enabled the distributions of the subpopulation of condensed or released DNA to be determined within the major cellular compartments encountered during gene transfer. The steady state stability and unpacking kinetics within these compartments were found to impact transgene expression, elucidating multiple design strategies to achieve efficient gene transfer. To address enzymatic barriers, a novel two-step QD-FRET nanosensor was developed to analyze unpacking and DNA degradation simultaneously, which has not been accomplished previously. Bioresponsive strategies such as disulfide crosslinking and thermosensitivity were evaluated by QD-FRET and quantitative compartmental analysis as case studies to determine appropriate design specifications for thiolated polymers and thermoresponsive polypeptides. Relevant nanobiophotonic tools were developed as a platform to study major rate-limiting barriers to nanomedicine and demonstrated the feasibility of using mechanistic information gained from these tools to guide the rational design of gene carriers and achieve the desired properties that enable efficient gene transfer.

  5. 3D Printing and Digital Rock Physics for Geomaterials

    NASA Astrophysics Data System (ADS)

    Martinez, M. J.; Yoon, H.; Dewers, T. A.

    2015-12-01

    Imaging techniques for the analysis of porous structures have revolutionized our ability to quantitatively characterize geomaterials. Digital representations of rock from CT images and physics modeling based on these pore structures provide the opportunity to further advance our quantitative understanding of fluid flow, geomechanics, and geochemistry, and the emergence of coupled behaviors. Additive manufacturing, commonly known as 3D printing, has revolutionized production of custom parts with complex internal geometries. For the geosciences, recent advances in 3D printing technology may be co-opted to print reproducible porous structures derived from CT-imaging of actual rocks for experimental testing. The use of 3D printed microstructure allows us to surmount typical problems associated with sample-to-sample heterogeneity that plague rock physics testing and to test material response independent from pore-structure variability. Together, imaging, digital rocks and 3D printing potentially enable a new workflow for understanding coupled geophysical processes in a real but well-defined setting, circumventing typical issues associated with reproducibility and enabling full characterization and thus connection of physical phenomena to structure. In this talk we will discuss the possibilities that these technologies can bring to geosciences and present early experiences with coupled multiscale experimental and numerical analysis using 3D printed fractured rock specimens. In particular, we discuss the processes of selection and printing of transparent fractured specimens based on 3D reconstruction of micro-fractured rock to study fluid flow characterization and manipulation. Micro-particle image velocimetry is used to directly visualize 3D single and multiphase flow velocity in 3D fracture networks. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  6. Harnessing Connectivity in a Large-Scale Small-Molecule Sensitivity Dataset.

    PubMed

    Seashore-Ludlow, Brinton; Rees, Matthew G; Cheah, Jaime H; Cokol, Murat; Price, Edmund V; Coletti, Matthew E; Jones, Victor; Bodycombe, Nicole E; Soule, Christian K; Gould, Joshua; Alexander, Benjamin; Li, Ava; Montgomery, Philip; Wawer, Mathias J; Kuru, Nurdan; Kotz, Joanne D; Hon, C Suk-Yee; Munoz, Benito; Liefeld, Ted; Dančík, Vlado; Bittker, Joshua A; Palmer, Michelle; Bradner, James E; Shamji, Alykhan F; Clemons, Paul A; Schreiber, Stuart L

    2015-11-01

    Identifying genetic alterations that prime a cancer cell to respond to a particular therapeutic agent can facilitate the development of precision cancer medicines. Cancer cell-line (CCL) profiling of small-molecule sensitivity has emerged as an unbiased method to assess the relationships between genetic or cellular features of CCLs and small-molecule response. Here, we developed annotated cluster multidimensional enrichment analysis to explore the associations between groups of small molecules and groups of CCLs in a new, quantitative sensitivity dataset. This analysis reveals insights into small-molecule mechanisms of action, and genomic features that associate with CCL response to small-molecule treatment. We are able to recapitulate known relationships between FDA-approved therapies and cancer dependencies and to uncover new relationships, including for KRAS-mutant cancers and neuroblastoma. To enable the cancer community to explore these data, and to generate novel hypotheses, we created an updated version of the Cancer Therapeutic Response Portal (CTRP v2). We present the largest CCL sensitivity dataset yet available, and an analysis method integrating information from multiple CCLs and multiple small molecules to identify CCL response predictors robustly. We updated the CTRP to enable the cancer research community to leverage these data and analyses. ©2015 American Association for Cancer Research.

  7. Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation

    PubMed Central

    Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that (i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; (ii) many genes harbor multiple independent eQTLs in their cis regions; and (iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10⁻²²). PMID:25906321

  8. Analysis of Inorganic Nanoparticles by Single-particle Inductively Coupled Plasma Time-of-Flight Mass Spectrometry.

    PubMed

    Hendriks, Lyndsey; Gundlach-Graham, Alexander; Günther, Detlef

    2018-04-25

    Due to the rapid development of nanotechnologies, engineered nanomaterials (ENMs) and nanoparticles (ENPs) are becoming a part of everyday life: nanotechnologies are quickly migrating from laboratory benches to store shelves and industrial processes. As the use of ENPs continues to expand, their release into the environment is unavoidable; however, understanding the mechanisms and degree of ENP release is only possible through direct detection of these nanospecies in relevant matrices and at realistic concentrations. Key analytical requirements for quantitative detection of ENPs include high sensitivity to detect small particles at low total mass concentrations and the need to separate signals of ENPs from a background of dissolved elemental species and natural nanoparticles (NNPs). To this end, an emerging method called single-particle inductively coupled plasma mass spectrometry (sp-ICPMS) has demonstrated great potential for the characterization of inorganic nanoparticles (NPs) at environmentally relevant concentrations. Here, we comment on the capabilities of modern sp-ICPMS analysis with particular focus on the measurement possibilities offered by ICP-time-of-flight mass spectrometry (ICP-TOFMS). ICP-TOFMS delivers complete elemental mass spectra for individual NPs, which allows for high-throughput, untargeted quantitative analysis of dispersed NPs in natural matrices. Moreover, the multi-element detection capabilities of ICP-TOFMS enable new NP-analysis strategies, including online calibration via microdroplets for accurate NP mass quantification and matrix compensation.
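    The arithmetic behind single-particle detection is not spelled out in the abstract; as a rough, hedged sketch of a common sp-ICPMS workflow, particle events can be separated from the dissolved background by iterative thresholding of the time-resolved intensity trace, and each event converted to an element mass per particle with a calibration factor. The n-sigma criterion and the counts-per-femtogram sensitivity below are illustrative assumptions, not details from the paper.

    ```python
    import numpy as np

    def split_particle_events(trace, n_sigma=5, n_iter=10):
        """Separate particle events from the dissolved/background signal in a
        time-resolved ICP-MS intensity trace by iterative n-sigma thresholding
        (one common, simple criterion; not necessarily the one used in the cited work)."""
        trace = np.asarray(trace, dtype=float)
        background = trace.copy()
        for _ in range(n_iter):
            threshold = background.mean() + n_sigma * background.std()
            background = background[background < threshold]
        events = trace[trace > threshold] - background.mean()   # net counts per particle event
        return events, threshold

    def event_counts_to_mass(event_counts, sensitivity_counts_per_fg):
        """Convert net event intensity to element mass per particle (fg), assuming a
        sensitivity calibrated e.g. with monodisperse microdroplets as mentioned above."""
        return event_counts / sensitivity_counts_per_fg
    ```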

  9. miRNet - dissecting miRNA-target interactions and functional associations through network-based visual analysis

    PubMed Central

    Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo

    2016-01-01

    MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNA functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) implementation of a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful, fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848

  10. Multiparametric or practical quantitative liver MRI: towards millisecond, fat fraction, kilopascal and function era.

    PubMed

    Unal, Emre; Idilman, Ilkay Sedakat; Karçaaltıncaba, Muşturay

    2017-02-01

    New advances in liver magnetic resonance imaging (MRI) may enable diagnosis of pathologies unseen by conventional techniques. Normal T1 (550-620 ms for 1.5 T and 700-850 ms for 3 T), T2, T2* (>20 ms), T1rho (40-50 ms) mapping, proton density fat fraction (PDFF) (≤5%) and stiffness (2-3 kPa) values can enable differentiation of a normal liver from chronic liver and diffuse diseases. Gd-EOB-DTPA can enable assessment of liver function by using postcontrast hepatobiliary phase or T1 reduction rate (normally above 60%). T1 mapping can be important for the assessment of fibrosis, amyloidosis and copper overload. T1rho mapping is promising for the assessment of liver collagen deposition. PDFF can allow objective treatment assessment in NAFLD and NASH patients. T2 and T2* are used for iron overload determination. MR fingerprinting may enable single slice acquisition and easy implementation of multiparametric MRI and follow-up of patients. Areas covered: T1, T2, T2*, PDFF and stiffness, diffusion weighted imaging, intravoxel incoherent motion imaging (ADC, D, D* and f values) and function analysis are reviewed. Expert commentary: Multiparametric MRI can enable biopsyless diagnosis and more objective staging of diffuse liver disease, cirrhosis and predisposing diseases. A comprehensive approach is needed to understand and overcome the effects of iron, fat, fibrosis, edema, inflammation and copper on MR relaxometry values in diffuse liver disease.

  11. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progresses in action control made by the infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced, which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable to monitor how newborns train their cognitive processes, which will enable them to cope with their environment by motor interaction.

  12. Diffusion tensor imaging with quantitative evaluation and fiber tractography of lumbar nerve roots in sciatica.

    PubMed

    Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang

    2015-04-01

    To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, visualize nerve roots by tractography, and compare the diagnostic efficacy between conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for L5-S1 lumbar nerve roots were calculated at three levels from DTI images. Tractography was performed on L3-S1 nerve roots. ROC analysis was performed for FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at distal nerves. DTI can quantitatively evaluate compressed nerve roots, and DTT enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
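    For readers unfamiliar with the quantity being compared here, fractional anisotropy is computed from the three eigenvalues of the fitted diffusion tensor; the standard definition is sketched below (the example eigenvalues are purely illustrative, not values from the study).

    ```python
    import numpy as np

    def fractional_anisotropy(eigenvalues):
        """Standard FA definition from the three diffusion-tensor eigenvalues."""
        l1, l2, l3 = eigenvalues
        md = (l1 + l2 + l3) / 3.0                    # mean diffusivity
        num = np.sqrt((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
        den = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
        return float(np.sqrt(1.5) * num / den) if den > 0 else 0.0

    print(fractional_anisotropy((1.0e-3, 0.8e-3, 0.7e-3)))  # illustrative eigenvalues in mm^2/s
    ```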

  13. Quantitative Evaluation of Performance during Robot-assisted Treatment.

    PubMed

    Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) performed a training with Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1 . This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is reliable with respect to a well-known clinical scale.

  14. A volumetric meter chip for point-of-care quantitative detection of bovine catalase for food safety control.

    PubMed

    Cui, Xingye; Hu, Jie; Choi, Jane Ru; Huang, Yalin; Wang, Xuemin; Lu, Tian Jian; Xu, Feng

    2016-09-07

    A volumetric meter chip was developed for quantitative point-of-care (POC) analysis of bovine catalase, a bioindicator of bovine mastitis, in milk samples. The meter chip displays multiplexed quantitative results by presenting the distance of ink bar advancement that is detectable by the naked eye. The meter chip comprises a poly(methyl methacrylate) (PMMA) layer, a double-sided adhesive (DSA) layer and a glass slide layer fabricated by the laser-etching method, which is typically simple, rapid (∼3 min per chip), and cost effective (∼$0.2 per chip). Specially designed "U shape" reaction cells are covered by an adhesive tape that serves as an on-off switch, enabling the simple operation of the assay. As a proof of concept, we employed the developed meter chip for the quantification of bovine catalase in raw milk samples to detect catalase concentrations as low as 20 μg/mL. The meter chip has great potential to detect various target analytes for a wide range of POC applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Micro/nano-computed tomography technology for quantitative dynamic, multi-scale imaging of morphogenesis.

    PubMed

    Gregg, Chelsea L; Recknagel, Andrew K; Butcher, Jonathan T

    2015-01-01

    Tissue morphogenesis and embryonic development are dynamic events challenging to quantify, especially considering the intricate events that happen simultaneously in different locations and time. Micro- and more recently nano-computed tomography (micro/nanoCT) has been used for the past 15 years to characterize large 3D fields of tortuous geometries at high spatial resolution. We and others have advanced micro/nanoCT imaging strategies for quantifying tissue- and organ-level fate changes throughout morphogenesis. Exogenous soft tissue contrast media enables visualization of vascular lumens and tissues via extravasation. Furthermore, the emergence of antigen-specific tissue contrast enables direct quantitative visualization of protein and mRNA expression. Micro-CT X-ray doses appear to be non-embryotoxic, enabling longitudinal imaging studies in live embryos. In this chapter we present established soft tissue contrast protocols for obtaining high-quality micro/nanoCT images and the image processing techniques useful for quantifying anatomical and physiological information from the data sets.

  16. Multiplex N-terminome analysis of MMP-2 and MMP-9 substrate degradomes by iTRAQ-TAILS quantitative proteomics.

    PubMed

    Prudova, Anna; auf dem Keller, Ulrich; Butler, Georgina S; Overall, Christopher M

    2010-05-01

    Proteolysis is a major protein posttranslational modification that, by altering protein structure, affects protein function and, by truncating the protein sequence, alters peptide signatures of proteins analyzed by proteomics. To identify such modified and shortened protease-generated neo-N-termini on a proteome-wide basis, we developed a whole protein isobaric tag for relative and absolute quantitation (iTRAQ) labeling method that simultaneously labels and blocks all primary amines including protein N- termini and lysine side chains. Blocking lysines limits trypsin cleavage to arginine, which effectively elongates the proteolytically truncated peptides for improved MS/MS analysis and peptide identification. Incorporating iTRAQ whole protein labeling with terminal amine isotopic labeling of substrates (iTRAQ-TAILS) to enrich the N-terminome by negative selection of the blocked mature original N-termini and neo-N-termini has many advantages. It enables simultaneous characterization of the natural N-termini of proteins, their N-terminal modifications, and proteolysis product and cleavage site identification. Furthermore, iTRAQ-TAILS also enables multiplex N-terminomics analysis of up to eight samples and allows for quantification in MS2 mode, thus preventing an increase in spectral complexity and extending proteome coverage by signal amplification of low abundance proteins. We compared the substrate degradomes of two closely related matrix metalloproteinases, MMP-2 (gelatinase A) and MMP-9 (gelatinase B), in fibroblast secreted proteins. Among 3,152 unique N-terminal peptides identified corresponding to 1,054 proteins, we detected 201 cleavage products for MMP-2 and unexpectedly only 19 for the homologous MMP-9 under identical conditions. Novel substrates identified and biochemically validated include insulin-like growth factor binding protein-4, complement C1r component A, galectin-1, dickkopf-related protein-3, and thrombospondin-2. Hence, N-terminomics analyses using iTRAQ-TAILS links gelatinases with new mechanisms of action in angiogenesis and reveals unpredicted restrictions in substrate repertoires for these two very similar proteases.

  17. Chemical profiling: A tool to decipher the structure and organisation of illicit drug markets: An 8-year study in Western Switzerland.

    PubMed

    Broséus, Julian; Baechler, Simon; Gentile, Natacha; Esseiva, Pierre

    2016-09-01

    Illicit drug analyses usually focus on the identification and quantitation of questioned material to support the judicial process. In parallel, more and more laboratories develop physical and chemical profiling methods in a forensic intelligence perspective. The analysis of the large databases resulting from this approach not only enables tactical and operational intelligence to be drawn, but may also contribute to a strategic overview of drug markets. In Western Switzerland, the chemical analysis of illicit drug seizures is centralised in a laboratory hosted by the University of Lausanne. For over 8 years, this laboratory has analysed 5875 cocaine and 2728 heroin specimens, coming from respectively 1138 and 614 seizures operated by police and border guards or customs. Chemical (major and minor alkaloids, purity, cutting agents, chemical class), physical (packaging and appearance) as well as circumstantial (criminal case number, mass of drug seized, date and place of seizure) information are collated in a dedicated database for each specimen. The study capitalises on this extended database and defines several indicators to characterise the structure of drug markets, to follow up on their evolution and to compare the cocaine and heroin markets. Relational, spatial, temporal and quantitative analyses of the data reveal the emergence and importance of distribution networks. They enable evaluation of the cross-jurisdictional character of drug trafficking and the observation time of drug batches, as well as the quantity of drugs entering the market every year. Results highlight the stable nature of drug markets over the years despite the very dynamic flows of distribution and consumption. This research work illustrates how the systematic analysis of forensic data may elicit knowledge on criminal activities at a strategic level. In combination with information from other sources, such knowledge can help to devise intelligence-based preventive and repressive measures and to discuss the impact of countermeasures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Multiplex N-terminome Analysis of MMP-2 and MMP-9 Substrate Degradomes by iTRAQ-TAILS Quantitative Proteomics*

    PubMed Central

    Prudova, Anna; auf dem Keller, Ulrich; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Proteolysis is a major protein posttranslational modification that, by altering protein structure, affects protein function and, by truncating the protein sequence, alters peptide signatures of proteins analyzed by proteomics. To identify such modified and shortened protease-generated neo-N-termini on a proteome-wide basis, we developed a whole protein isobaric tag for relative and absolute quantitation (iTRAQ) labeling method that simultaneously labels and blocks all primary amines including protein N- termini and lysine side chains. Blocking lysines limits trypsin cleavage to arginine, which effectively elongates the proteolytically truncated peptides for improved MS/MS analysis and peptide identification. Incorporating iTRAQ whole protein labeling with terminal amine isotopic labeling of substrates (iTRAQ-TAILS) to enrich the N-terminome by negative selection of the blocked mature original N-termini and neo-N-termini has many advantages. It enables simultaneous characterization of the natural N-termini of proteins, their N-terminal modifications, and proteolysis product and cleavage site identification. Furthermore, iTRAQ-TAILS also enables multiplex N-terminomics analysis of up to eight samples and allows for quantification in MS2 mode, thus preventing an increase in spectral complexity and extending proteome coverage by signal amplification of low abundance proteins. We compared the substrate degradomes of two closely related matrix metalloproteinases, MMP-2 (gelatinase A) and MMP-9 (gelatinase B), in fibroblast secreted proteins. Among 3,152 unique N-terminal peptides identified corresponding to 1,054 proteins, we detected 201 cleavage products for MMP-2 and unexpectedly only 19 for the homologous MMP-9 under identical conditions. Novel substrates identified and biochemically validated include insulin-like growth factor binding protein-4, complement C1r component A, galectin-1, dickkopf-related protein-3, and thrombospondin-2. Hence, N-terminomics analyses using iTRAQ-TAILS links gelatinases with new mechanisms of action in angiogenesis and reveals unpredicted restrictions in substrate repertoires for these two very similar proteases. PMID:20305284

  19. Quantitation of Staphylococcus aureus in Seawater Using CHROMagar™ SA

    PubMed Central

    Pombo, David; Hui, Jennifer; Kurano, Michelle; Bankowski, Matthew J; Seifried, Steven E

    2010-01-01

    A microbiological algorithm has been developed to analyze beach water samples for the determination of viable colony forming units (CFU) of Staphylococcus aureus (S. aureus). Membrane filtration enumeration of S. aureus from recreational beach waters using the chromogenic media CHROMagar™SA alone yields a positive predictive value (PPV) of 70%. Presumptive CHROMagar™SA colonies were confirmed as S. aureus by 24-hour tube coagulase test. Combined, these two tests yield a PPV of 100%. This algorithm enables accurate quantitation of S. aureus in seawater in 72 hours and could support risk-prediction processes for recreational waters. A more rapid protocol, utilizing a 4-hour tube coagulase confirmatory test, enables a 48-hour turnaround time with a modest false negative rate of less than 10%. PMID:20222490

  20. High-throughput quantitation of amino acids in rat and mouse biological matrices using stable isotope labeling and UPLC-MS/MS analysis.

    PubMed

    Takach, Edward; O'Shea, Thomas; Liu, Hanlan

    2014-08-01

    Quantifying amino acids in biological matrices is typically performed using liquid chromatography (LC) coupled with fluorescent detection (FLD), requiring both derivatization and complete baseline separation of all amino acids. Due to its high specificity and sensitivity, the use of UPLC-MS/MS eliminates the derivatization step and allows for overlapping amino acid retention times thereby shortening the analysis time. Furthermore, combining UPLC-MS/MS with stable isotope labeling (e.g., isobaric tag for relative and absolute quantitation, i.e., iTRAQ) of amino acids enables quantitation while maintaining sensitivity, selectivity and speed of analysis. In this study, we report combining UPLC-MS/MS analysis with iTRAQ labeling of amino acids resulting in the elution and quantitation of 44 amino acids within 5 min demonstrating the speed and convenience of this assay over established approaches. This chromatographic analysis time represented a 5-fold improvement over the conventional HPLC-MS/MS method developed in our laboratory. In addition, the UPLC-MS/MS method demonstrated improvements in both specificity and sensitivity without loss of precision. In comparing UPLC-MS/MS and HPLC-MS/MS results of 32 detected amino acids, only 2 amino acids exhibited imprecision (RSD) >15% using UPLC-MS/MS, while 9 amino acids exhibited RSD >15% using HPLC-MS/MS. Evaluating intra- and inter-assay precision over 3 days, the quantitation range for 32 detected amino acids in rat plasma was 0.90-497 μM, with overall mean intra-day precision of less than 15% and mean inter-day precision of 12%. This UPLC-MS/MS assay was successfully implemented for the quantitative analysis of amino acids in rat and mouse plasma, along with mouse urine and tissue samples, resulting in the following concentration ranges: 0.98-431 μM in mouse plasma for 32 detected amino acids; 0.62-443 μM in rat plasma for 32 detected amino acids; 0.44-8590μM in mouse liver for 33 detected amino acids; 0.61-1241 μM in mouse kidney for 37 detected amino acids; and 1.39-1,681 μM in rat urine for 34 detected amino acids. The utility of the assay was further demonstrated by measuring and comparing plasma amino acid levels between pre-diabetic Zucker diabetic fatty rats (ZDF/Gmi fa/fa) and their lean littermates (ZDF/Gmi fa/?). Significant differences (P<0.001) in 9 amino acid concentrations were observed, with the majority ranging from a 2- to 5-fold increase in pre-diabetic ZDF rats on comparison with ZDF lean rats, consistent with previous literature reports. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Developing a national dental education research strategy: priorities, barriers and enablers

    PubMed Central

    Barton, Karen L; Dennis, Ashley A; Rees, Charlotte E

    2017-01-01

    Objectives This study aimed to identify national dental education research (DER) priorities for the next 3–5 years and to identify barriers and enablers to DER. Setting Scotland. Participants In this two-stage online questionnaire study, we collected data with multiple dental professions (eg, dentistry, dental nursing and dental hygiene) and stakeholder groups (eg, learners, clinicians, educators, managers, researchers and academics). Eighty-five participants completed the Stage 1 qualitative questionnaire and 649 participants the Stage 2 quantitative questionnaire. Results Eight themes were identified at Stage 1. Of the 24 DER priorities identified, the top three were: the role of assessments in identifying competence; whether the undergraduate curriculum prepares for practice; and promoting teamwork. Following exploratory factor analysis, the 24 items loaded onto four factors: teamwork and professionalism, measuring and enhancing performance, dental workforce issues, and curriculum integration and innovation. Barriers and enablers existed at multiple levels: individual, interpersonal, institutional structures and cultures, and technology. Conclusions This priority setting exercise provides a necessary first step to developing a national DER strategy capturing multiple perspectives. Promoting DER requires improved resourcing alongside efforts to overcome peer stigma and lack of valuing and motivation. PMID:28360237

  2. 3D TOCSY-HSQC NMR for metabolic flux analysis using non-uniform sampling

    DOE PAGES

    Reardon, Patrick N.; Marean-Reardon, Carrie L.; Bukovec, Melanie A.; ...

    2016-02-05

    13C-Metabolic Flux Analysis (13C-MFA) is rapidly being recognized as the authoritative method for determining fluxes through metabolic networks. Site-specific 13C enrichment information obtained using NMR spectroscopy is a valuable input for 13C-MFA experiments. Chemical shift overlaps in the 1D or 2D NMR experiments typically used for 13C-MFA frequently hinder assignment and quantitation of site-specific 13C enrichment. Here we propose the use of a 3D TOCSY-HSQC experiment for 13C-MFA. We employ Non-Uniform Sampling (NUS) to reduce the acquisition time of the experiment to a few hours, making it practical for use in 13C-MFA experiments. Our data show that the NUS experiment is linear and quantitative. Identification of metabolites in complex mixtures, such as a biomass hydrolysate, is simplified by virtue of the 13C chemical shift obtained in the experiment. In addition, the experiment reports 13C-labeling information that reveals the position specific labeling of subsets of isotopomers. As a result, the information provided by this technique will enable more accurate estimation of metabolic fluxes in larger metabolic networks.

  3. Complete polarization characterization of single plasmonic nanoparticle enabled by a novel Dark-field Mueller matrix spectroscopy system

    PubMed Central

    Chandel, Shubham; Soni, Jalpa; Ray, Subir kumar; Das, Anwesh; Ghosh, Anirudha; Raj, Satyabrata; Ghosh, Nirmalya

    2016-01-01

    Information on the polarization properties of scattered light from plasmonic systems is of paramount importance due to fundamental interest and potential applications. However, such studies are severely compromised due to the experimental difficulties in recording the full polarization response of plasmonic nanostructures. Here, we report on a novel Mueller matrix spectroscopic system capable of acquiring complete polarization information from a single isolated plasmonic nanoparticle/nanostructure. The outstanding issues pertaining to reliable measurements of full 4 × 4 spectroscopic scattering Mueller matrices from single nanoparticles/nanostructures are overcome by integrating an efficient Mueller matrix measurement scheme and a robust eigenvalue calibration method with a dark-field microscopic spectroscopy arrangement. Feasibility of quantitative Mueller matrix polarimetry and its potential utility is illustrated on a simple plasmonic system, that of gold nanorods. The demonstrated ability to record full polarization information over a broad wavelength range and to quantify the intrinsic plasmon polarimetry characteristics via Mueller matrix inverse analysis should lead to a novel route towards quantitative understanding and analysis/interpretation of a number of intricate plasmonic effects and may also prove useful towards the development of polarization-controlled novel sensing schemes. PMID:27212687

  4. Electrochemical Branched-DNA Assay for Polymerase Chain Reaction-Free Detection and Quantification of Oncogenes in Messenger RNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Ai Cheng; Dai, Ziyu; Chen, Baowei

    2008-12-01

    We describe a novel electrochemical branched-DNA (bDNA) assay for polymerase chain reaction (PCR)-free detection and quantification of the p185 BCR-ABL leukemia fusion transcript in the population of messenger RNA (mRNA) extracted from cell lines. The bDNA amplifier carrying a high loading of alkaline phosphatase (ALP) tracers was used to amplify the target signal. The targets were captured on microplate well surfaces through cooperative sandwich hybridization prior to the labeling of bDNA. The activity of captured ALP was monitored by square-wave voltammetric (SWV) analysis of the electroactive enzymatic product in the presence of 1-naphthyl-phosphate. The specificity and sensitivity of the assay enabled direct detection of the target transcript in as little as 4.6 ng mRNA without PCR amplification. In combination with the use of a well-quantified standard, the electrochemical bDNA assay was capable of direct use for PCR-free quantitative analysis of the target transcript in the total mRNA population. The approach thus provides a simple, sensitive, accurate and quantitative tool alternative to RQ-PCR for early disease diagnosis.

  5. Spectral imaging of histological and cytological specimens

    NASA Astrophysics Data System (ADS)

    Rothmann, Chana; Malik, Zvi

    1999-05-01

    Evaluation of cell morphology by bright field microscopy is the pillar of histopathological diagnosis. The need for quantitative and objective parameters for diagnosis has given rise to the development of morphometric methods. The development of spectral imaging for biological and medical applications introduced both fields to large amounts of information extracted from a single image. Spectroscopic analysis is based on the ability of a stained histological specimen to absorb, reflect, or emit photons in ways characteristic to its interactions with specific dyes. Spectral information obtained from a histological specimen is stored in a cube, whose appellation signifies the two spatial dimensions of a flat sample (x and y) and the third dimension, the spectrum, representing the light intensity for every wavelength. The spectral information stored in the cube can be further processed by morphometric analysis and quantitative procedures. One such procedure is spectral-similarity mapping (SSM), which enables the demarcation of areas occupied by the same type of material. SSM constructs new images of the specimen, revealing areas with similar stain-macromolecule characteristics and enhancing subcellular features. Spectral imaging combined with SSM reveals nuclear organization through the differentiation stages as well as in apoptotic and necrotic conditions and identifies specifically the nucleoli domains.
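    The abstract does not define the similarity measure behind SSM, so the sketch below uses the spectral angle between each pixel spectrum and a reference spectrum purely as an illustration of how a similarity map over a spectral cube can be computed; the metric and threshold are assumptions, not the published algorithm.

    ```python
    import numpy as np

    def spectral_angle_map(cube, reference, max_angle_rad=0.10):
        """cube: (ny, nx, n_wavelengths) spectral image; reference: (n_wavelengths,) spectrum.
        Returns per-pixel spectral angles and a boolean mask of 'similar' pixels."""
        flat = cube.reshape(-1, cube.shape[-1]).astype(float)
        ref = np.asarray(reference, dtype=float)
        cos = flat @ ref / (np.linalg.norm(flat, axis=1) * np.linalg.norm(ref) + 1e-12)
        angles = np.arccos(np.clip(cos, -1.0, 1.0)).reshape(cube.shape[:2])
        return angles, angles < max_angle_rad
    ```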

  6. A Ligand-observed Mass Spectrometry Approach Integrated into the Fragment Based Lead Discovery Pipeline

    PubMed Central

    Chen, Xin; Qin, Shanshan; Chen, Shuai; Li, Jinlong; Li, Lixin; Wang, Zhongling; Wang, Quan; Lin, Jianping; Yang, Cheng; Shui, Wenqing

    2015-01-01

    In fragment-based lead discovery (FBLD), a cascade combining multiple orthogonal technologies is required for reliable detection and characterization of fragment binding to the target. Given the limitations of the mainstream screening techniques, we presented a ligand-observed mass spectrometry approach to expand the toolkits and increase the flexibility of building a FBLD pipeline especially for tough targets. In this study, this approach was integrated into a FBLD program targeting the HCV RNA polymerase NS5B. Our ligand-observed mass spectrometry analysis resulted in the discovery of 10 hits from a 384-member fragment library through two independent screens of complex cocktails and a follow-up validation assay. Moreover, this MS-based approach enabled quantitative measurement of weak binding affinities of fragments which was in general consistent with SPR analysis. Five out of the ten hits were then successfully translated to X-ray structures of fragment-bound complexes to lay a foundation for structure-based inhibitor design. With distinctive strengths in terms of high capacity and speed, minimal method development, easy sample preparation, low material consumption and quantitative capability, this MS-based assay is anticipated to be a valuable addition to the repertoire of current fragment screening techniques. PMID:25666181

  7. Automated Detection of Electroencephalography Artifacts in Human, Rodent and Canine Subjects using Machine Learning.

    PubMed

    Levitt, Joshua; Nitenson, Adam; Koyama, Suguru; Heijmans, Lonne; Curry, James; Ross, Jason T; Kamerling, Steven; Saab, Carl Y

    2018-06-23

    Electroencephalography (EEG) invariably contains extra-cranial artifacts that are commonly dealt with based on qualitative and subjective criteria. Failure to account for EEG artifacts compromises data interpretation. We have developed a quantitative and automated support vector machine (SVM)-based algorithm to accurately classify artifactual EEG epochs in awake rodent, canine and human subjects. An embodiment of this method also enables the determination of 'eyes open/closed' states in human subjects. The levels of SVM accuracy for artifact classification in humans, Sprague Dawley rats and beagle dogs were 94.17%, 83.68%, and 85.37%, respectively, whereas 'eyes open/closed' states in humans were labeled with 88.60% accuracy. Each of these results was significantly higher than chance. Comparison with Existing Methods: Other existing methods, like those dependent on Independent Component Analysis, have not been tested in non-human subjects and require full EEG montages, whereas this method uses only single channels. We conclude that our EEG artifact detection algorithm provides a valid and practical solution to a common problem in the quantitative analysis and assessment of EEG in pre-clinical research settings across evolutionary spectra. Copyright © 2018. Published by Elsevier B.V.
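    Because the abstract describes the classifier but not its configuration, the following is only a minimal sketch of SVM-based epoch classification, assuming per-epoch features (for example variance, line length and band powers) have already been extracted into X with labels y; the kernel, scaling and cross-validation settings are assumptions rather than the published setup.

    ```python
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    def train_artifact_classifier(X, y):
        """X: (n_epochs, n_features) epoch features; y: labels (artifact vs. clean)."""
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        accuracy = cross_val_score(clf, X, y, cv=5).mean()  # comparable in spirit to the reported accuracies
        clf.fit(X, y)
        return clf, accuracy
    ```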

  8. Simultaneous Quantification of Multiple Alternatively Spliced mRNA Transcripts Using Droplet Digital PCR.

    PubMed

    Sun, Bing; Zheng, Yun-Ling

    2018-01-01

    Currently there is no sensitive, precise, and reproducible method to quantitate alternative splicing of mRNA transcripts. Droplet digital™ PCR (ddPCR™) analysis allows for accurate digital counting for quantification of gene expression. Human telomerase reverse transcriptase (hTERT) is one of the essential components required for telomerase activity and for the maintenance of telomeres. Several alternatively spliced forms of hTERT mRNA in human primary and tumor cells have been reported in the literature. Using one pair of primers and two probes for hTERT, four alternatively spliced forms of hTERT (α-/β+, α+/β- single deletions, α-/β- double deletion, and nondeletion α+/β+) were accurately quantified through a novel analysis method via data collected from a single ddPCR reaction. In this chapter, we describe this ddPCR method that enables direct quantitative comparison of four alternatively spliced forms of the hTERT messenger RNA without the need for internal standards or multiple pairs of primers specific for each variant, eliminating the technical variation due to differential PCR amplification efficiency for different amplicons and the challenges of quantification using standard curves. This simple and straightforward method should have general utility for quantifying alternatively spliced gene transcripts.
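    The droplet counts themselves are converted to concentrations with the standard ddPCR Poisson correction; the sketch below shows that conversion only, under the common assumption of a nominal 0.85 nL droplet volume, and does not reproduce the two-probe gating strategy used to separate the four splice variants.

    ```python
    import numpy as np

    def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_ul=0.00085):
        """Target copies per microliter of reaction from droplet counts, using
        lambda = -ln(1 - p), the standard Poisson correction for partitioned PCR."""
        p = n_positive / n_total
        lam = -np.log(1.0 - p)            # mean copies per droplet
        return lam / droplet_volume_ul

    # e.g., 2,000 positive droplets out of 15,000 accepted droplets
    print(ddpcr_copies_per_ul(2000, 15000))
    ```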

  9. CASTIN: a system for comprehensive analysis of cancer-stromal interactome.

    PubMed

    Komura, Daisuke; Isagawa, Takayuki; Kishi, Kazuki; Suzuki, Ryohei; Sato, Reiko; Tanaka, Mariko; Katoh, Hiroto; Yamamoto, Shogo; Tatsuno, Kenji; Fukayama, Masashi; Aburatani, Hiroyuki; Ishikawa, Shumpei

    2016-11-09

    Cancer microenvironment plays a vital role in cancer development and progression, and cancer-stromal interactions have been recognized as important targets for cancer therapy. However, identifying relevant and druggable cancer-stromal interactions is challenging due to the lack of quantitative methods to analyze whole cancer-stromal interactome. We present CASTIN (CAncer-STromal INteractome analysis), a novel framework for the evaluation of cancer-stromal interactome from RNA-Seq data using cancer xenograft models. For each ligand-receptor interaction which is derived from curated protein-protein interaction database, CASTIN summarizes gene expression profiles of cancer and stroma into three evaluation indices. These indices provide quantitative evaluation and comprehensive visualization of interactome, and thus enable to identify critical cancer-microenvironment interactions, which would be potential drug targets. We applied CASTIN to the dataset of pancreas ductal adenocarcinoma, and successfully characterized the individual cancer in terms of cancer-stromal relationships, and identified both well-known and less-characterized druggable interactions. CASTIN provides comprehensive view of cancer-stromal interactome and is useful to identify critical interactions which may serve as potential drug targets in cancer-microenvironment. CASTIN is available at: http://github.com/tmd-gpat/CASTIN .

  10. Quantitative electroencephalography in a swine model of blast-induced brain injury.

    PubMed

    Chen, Chaoyang; Zhou, Chengpeng; Cavanaugh, John M; Kallakuri, Srinivasu; Desai, Alok; Zhang, Liying; King, Albert I

    2017-01-01

    Electroencephalography (EEG) was used to examine brain activity abnormalities early after blast exposure in a swine model and to develop a qEEG data analysis protocol. Anaesthetized swine were exposed to 420-450 kPa blast overpressure and survived for 3 days after the blast. EEG recordings were performed at 15 minutes before the blast and 15 minutes, 30 minutes, 2 hours and 1, 2 and 3 days post-blast using surface recording electrodes and a Biopac 4-channel data acquisition system. Off-line quantitative EEG (qEEG) data analysis was performed to determine qEEG changes. Blast exposure induced qEEG changes early after the blast, including a decrease of mean amplitude (MAMP), an increase of delta band power, a decrease of alpha band root mean square (RMS) and a decrease of 90% spectral edge frequency (SEF90). This study demonstrated that qEEG is sensitive to cerebral injury. The qEEG changes early after the blast indicate the potential of using multiple qEEG parameters for diagnosis of blast-induced brain injury. Early detection of blast-induced brain injury will allow early screening and assessment of brain abnormalities in soldiers to enable timely therapeutic intervention.
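    Two of the qEEG measures named above can be made concrete with a short sketch: delta band power and the 90% spectral edge frequency computed from a single epoch via a Welch power spectrum. The sampling rate, band edges and Welch settings below are illustrative assumptions rather than the study's actual analysis parameters.

    ```python
    import numpy as np
    from scipy.signal import welch

    def qeeg_features(epoch, fs=256.0):
        """Delta-band (0.5-4 Hz) power and 90% spectral edge frequency (SEF90)
        from one EEG epoch."""
        f, psd = welch(epoch, fs=fs, nperseg=int(2 * fs))
        band = (f >= 0.5) & (f < 4.0)
        delta_power = np.trapz(psd[band], f[band])
        cumulative = np.cumsum(psd) / np.sum(psd)
        sef90 = f[np.searchsorted(cumulative, 0.90)]  # frequency below which 90% of total power lies
        return delta_power, sef90
    ```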

  11. Effectiveness of suicide prevention programs for emergency and protective services employees: A systematic review and meta-analysis.

    PubMed

    Witt, Katrina; Milner, Allison; Allisey, Amanda; Davenport, Lauren; LaMontagne, Anthony D

    2017-04-01

    This brief report summarizes the international literature on the effectiveness of suicide prevention programs for protective and emergency services employees. A systematic search of 11 electronic databases was undertaken until June 30, 2015. Quantitative meta-analysis was undertaken to investigate the effectiveness of these programs on suicide rates at post-intervention. Qualitative analyses were also used to identify program components that may be associated with reductions in suicide rates. A total of 13 studies were included. Only six reported sufficient information on suicide rates to enable inclusion in quantitative analyses, however. On average, these programs were associated with an approximate halving in suicide rates over an average follow-up period of 5.25 years (SD = 4.2; range: 1-11) (Incidence Rate Ratio 0.45, 95%CI 0.31-0.65; five studies; I² = 14.8%). Few programs integrated activities at the primary prevention level. A greater focus on the relatively neglected area of workplace primary prevention could further improve suicide prevention effectiveness. Am. J. Ind. Med. 60:394-407, 2017. © 2017 Wiley Periodicals, Inc.

  12. A Statistics-based Platform for Quantitative N-terminome Analysis and Identification of Protease Cleavage Products*

    PubMed Central

    auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283

  13. Acoustic Facies Analysis of Side-Scan Sonar Data

    NASA Astrophysics Data System (ADS)

    Dwan, Fa Shu

    Acoustic facies analysis methods have allowed the generation of system-independent values for the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU)² side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area contribution, and grazing angle effects. In the sonar equation, backscattering strength is the sonar parameter which is directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor signal returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU)² data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values have been compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB. A modified Lambert relationship, which consists of both intercept and slope terms, appears to represent the BSS versus grazing angle profiles better based on χ² testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing angle span of the BSS profiles.
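    As an illustration of the intercept-and-slope parameterisation mentioned above, a modified Lambert relationship can be fitted to backscattering strength versus grazing angle by simple least squares; the exact functional form used in the dissertation is not given here, so this sketch should be read as one plausible formulation, not the author's method.

    ```python
    import numpy as np

    def fit_modified_lambert(grazing_deg, bss_db):
        """Fit BSS(theta) = A + B * 10*log10(sin(theta)); classical Lambert's Law
        corresponds to B = 2 with A equal to the Lambertian constant."""
        x = 10.0 * np.log10(np.sin(np.radians(np.asarray(grazing_deg, dtype=float))))
        slope, intercept = np.polyfit(x, np.asarray(bss_db, dtype=float), 1)
        return intercept, slope   # A (dB) and B (dimensionless)
    ```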

  14. Evaluation of peptide adsorption-controlled liquid chromatography-tandem mass spectrometric (PAC-LC-MS/MS) method for simple and simultaneous quantitation of amyloid β 1-38, 1-40, 1-42 and 1-43 peptides in dog cerebrospinal fluid.

    PubMed

    Goda, Ryoya; Kobayashi, Nobuhiro

    2012-05-01

    To evaluate the usefulness of peptide adsorption-controlled liquid chromatography-tandem mass spectrometry (PAC-LC-MS/MS) for reproducible measurement of peptides in biological fluids, simultaneous quantitation of amyloid β 1-38, 1-40, 1-42 and 1-43 peptides (Aβ38, Aβ40, Aβ42 and Aβ43) in dog cerebrospinal fluid (CSF) was attempted. Each stable isotope labeled Aβ was used as the internal standard to minimize the influence of the CSF matrix on reproducible Aβ quantitation. To reduce the loss of Aβ during the pretreatment procedures, the dog CSF diluted with water-acetic acid-methanol (2:6:1, v/v/v) was loaded onto the PAC-LC-MS/MS system directly. Quantification of the Aβ in the diluted dog CSF was carried out using multiple reaction monitoring (MRM) mode. The [M+5H]5+ ion and the b(5+) fragment ion of each peptide were chosen as the precursor and product ions for the MRM transitions. The calibration curves were drawn from Aβ standard calibration solutions using PAC-LC-MS/MS. Analysis of dog CSF samples suggests that the basal concentrations of Aβ38, Aβ40, Aβ42 and Aβ43 in dog CSF are approximately 300, 900, 200 and 30 pM, respectively. This is the first time Aβ concentrations in dog CSF have been reported. Additionally, the evaluation of intra- and inter-day reproducibility of analysis of Aβ standard solution, the freeze-thaw stability and the room temperature stability of Aβ standard solution suggest that the PAC-LC-MS/MS method enables reproducible Aβ quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Uncertainty Analysis for Angle Calibrations Using Circle Closure

    PubMed Central

    Estler, W. Tyler

    1998-01-01

    We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
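    The benefit of the closure constraint can be illustrated with the simplest possible adjustment: the measured segments of a full circle must sum to 360 degrees, so the closure error can be distributed back over the segments. This equal-distribution step is a textbook simplification used only to convey the idea; it is not the uncertainty analysis derived in the paper.

    ```python
    import numpy as np

    def simple_closure_adjust(measured_segments_deg):
        """Distribute the closure error (sum - 360 deg) equally over the n segments."""
        segments = np.asarray(measured_segments_deg, dtype=float)
        closure_error = segments.sum() - 360.0
        return segments - closure_error / segments.size

    adjusted = simple_closure_adjust([30.0004, 29.9998, 30.0001, 30.0003, 29.9997, 30.0002,
                                      29.9999, 30.0001, 29.9996, 30.0000, 29.9998, 30.0002])
    print(adjusted.sum())   # 360.0 by construction
    ```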

  16. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.

  17. Image analysis of speckle patterns as a probe of melting transitions in laser-heated diamond anvil cell experiments.

    PubMed

    Salem, Ran; Matityahu, Shlomi; Melchior, Aviva; Nikolaevsky, Mark; Noked, Ori; Sterer, Eran

    2015-09-01

    The precision of melting curve measurements using laser-heated diamond anvil cell (LHDAC) is largely limited by the correct and reliable determination of the onset of melting. We present a novel image analysis of speckle interference patterns in the LHDAC as a way to define quantitative measures which enable an objective determination of the melting transition. Combined with our low-temperature customized IR pyrometer, designed for measurements down to 500 K, our setup allows studying the melting curve of materials with low melting temperatures, with relatively high precision. As an application, the melting curve of Te was measured up to 35 GPa. The results are found to be in good agreement with previous data obtained at pressures up to 10 GPa.
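    One way to turn speckle images into the kind of quantitative measure described above is to track the normalised correlation between consecutive frames, which drops once the speckle pattern starts to fluctuate at the onset of melting. This is offered purely as a hedged illustration; the specific measures defined in the paper are not reproduced here.

    ```python
    import numpy as np

    def frame_to_frame_correlation(frames):
        """Normalised cross-correlation between consecutive speckle images.
        frames: sequence of 2-D arrays; a sustained drop suggests speckle motion."""
        correlations = []
        for a, b in zip(frames[:-1], frames[1:]):
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            correlations.append(float(np.mean(a * b)))
        return np.array(correlations)
    ```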

  18. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help engineers choose better, more robust designs and decide on the critical tests needed to demonstrate key reliability issues, thereby improving confidence in engine capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and both are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.
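
    A minimal sketch of the kind of quantity a probabilistic design assessment produces, using a generic stress-strength Monte Carlo model with placeholder distributions (not Rocketdyne data or the PDA's actual models):

```python
# Generic stress-strength Monte Carlo reliability estimate (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Random applied load and material capability, e.g. in MPa (hypothetical distributions)
load = rng.normal(loc=400.0, scale=40.0, size=n)
strength = rng.lognormal(mean=np.log(600.0), sigma=0.08, size=n)

p_fail = np.mean(load >= strength)        # probability that load exceeds strength
reliability = 1.0 - p_fail
print(f"Estimated reliability: {reliability:.6f} (P_fail ~ {p_fail:.2e})")
```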

  19. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or multiple reaction monitoring (SRM/MRM) is a liquid chromatography (LC)/tandem mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, given well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedium of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate the process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; this article therefore reviews the current state of these software tools and discusses future directions, so that researchers can combine them into a comprehensive targeted proteomics workflow. PMID:23702368
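
    One representative task such tools automate is checking the feasibility of a scheduled MRM method. The sketch below, a simplified illustration rather than any specific package's algorithm, computes the worst-case number of concurrently scheduled transitions, which bounds the dwell time available per transition for a fixed cycle time.

```python
# Sweep-line check of a scheduled MRM method: how many transitions can be
# active at once given their retention-time windows? (Illustrative only;
# peptide names and m/z values below are hypothetical.)
from dataclasses import dataclass

@dataclass
class Transition:
    peptide: str
    q1: float          # precursor m/z
    q3: float          # fragment m/z
    rt: float          # expected retention time (min)
    window: float      # scheduling window width (min)

def max_concurrent(transitions):
    """Maximum number of simultaneously scheduled transitions."""
    events = []
    for t in transitions:
        events.append((t.rt - t.window / 2, +1))   # window opens
        events.append((t.rt + t.window / 2, -1))   # window closes
    events.sort()
    current = peak = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

panel = [Transition("PEPTIDEA", 523.3, 628.3, 21.4, 2.0),
         Transition("PEPTIDEB", 612.8, 740.4, 21.9, 2.0),
         Transition("PEPTIDEC", 455.7, 559.3, 35.2, 2.0)]
# With a 1.5 s cycle time, dwell per transition is roughly 1500 ms / max_concurrent(panel)
print(max_concurrent(panel))   # -> 2
```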

  20. Cell wall staining with Trypan blue enables quantitative analysis of morphological changes in yeast cells.

    PubMed

    Liesche, Johannes; Marek, Magdalena; Günther-Pomorski, Thomas

    2015-01-01

    Yeast cells are protected by a cell wall that plays an important role in the exchange of substances with the environment. The cell wall structure is dynamic and can adapt to different physiological states or environmental conditions. For the investigation of morphological changes, selective staining with fluorescent dyes is a valuable tool. Furthermore, cell wall staining is used to facilitate sub-cellular localization experiments with fluorescently labeled proteins and the detection of yeast cells in non-fungal host tissues. Here, we report staining of the Saccharomyces cerevisiae cell wall with Trypan Blue, which emits strong red fluorescence upon binding to chitin and yeast glucan, thereby facilitating cell wall analysis by confocal and super-resolution microscopy. The staining pattern of Trypan Blue was similar to that of the widely used UV-excitable, blue-fluorescent cell wall stain Calcofluor White. Trypan Blue staining enabled quantification of cell size and cell wall volume using the optical sectioning capacity of a confocal microscope. This made it possible to quantify morphological changes during growth under anaerobic conditions and in the presence of chemicals, demonstrating the potential of this approach for morphological investigations and screening assays.
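
    A plausible way to exploit the optical sectioning described above, sketched here as an assumed workflow rather than the authors' exact pipeline, is to threshold the wall fluorescence in a confocal z-stack, fill the enclosed lumen, and count voxels to estimate wall and whole-cell volumes.

```python
# Assumed volumetric analysis of a wall-stained confocal z-stack (illustrative only).
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

def wall_and_cell_volume(stack, voxel_um3):
    """stack: 3D array (z, y, x) of wall fluorescence; returns (wall, cell) volumes in um^3."""
    wall_mask = stack > threshold_otsu(stack)          # voxels belonging to the stained wall
    cell_mask = ndimage.binary_fill_holes(wall_mask)   # wall plus the enclosed interior
    wall_volume = wall_mask.sum() * voxel_um3
    cell_volume = cell_mask.sum() * voxel_um3
    return wall_volume, cell_volume

# Usage (hypothetical file and voxel size):
# stack = tifffile.imread("stained_cells.tif")
# wall_v, cell_v = wall_and_cell_volume(stack, voxel_um3=0.1 * 0.1 * 0.3)
```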
