Sample records for allowing quantitative analysis

  1. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  2. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  3. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  4. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels; 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses.

  5. Time-Gated Raman Spectroscopy for Quantitative Determination of Solid-State Forms of Fluorescent Pharmaceuticals.

    PubMed

    Lipiäinen, Tiina; Pessi, Jenni; Movahedi, Parisa; Koivistoinen, Juha; Kurki, Lauri; Tenhunen, Mari; Yliruusi, Jouko; Juppo, Anne M; Heikkonen, Jukka; Pahikkala, Tapio; Strachan, Clare J

    2018-04-03

    Raman spectroscopy is widely used for quantitative pharmaceutical analysis, but a common obstacle to its use is sample fluorescence masking the Raman signal. Time-gating provides an instrument-based method for rejecting fluorescence through temporal resolution of the spectral signal and allows Raman spectra of fluorescent materials to be obtained. An additional practical advantage is that analysis is possible in ambient lighting. This study assesses the efficacy of time-gated Raman spectroscopy for the quantitative measurement of fluorescent pharmaceuticals. Time-gated Raman spectroscopy with a 128 × (2) × 4 CMOS SPAD detector was applied for quantitative analysis of ternary mixtures of solid-state forms of the model drug, piroxicam (PRX). Partial least-squares (PLS) regression allowed quantification, with Raman-active time domain selection (based on visual inspection) improving performance. Model performance was further improved by using kernel-based regularized least-squares (RLS) regression with greedy feature selection in which the data use in both the Raman shift and time dimensions was statistically optimized. Overall, time-gated Raman spectroscopy, especially with optimized data analysis in both the spectral and time dimensions, shows potential for sensitive and relatively routine quantitative analysis of photoluminescent pharmaceuticals during drug development and manufacturing.
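The abstract above quantifies three solid-state forms from their Raman spectra using PLS regression. As a much simpler illustration of spectral mixture quantitation, here is a classical least-squares (CLS) unmixing sketch with hypothetical reference spectra; it is not the authors' PLS/RLS pipeline, and all values are invented:

```python
# Classical least-squares (CLS) unmixing: a mixture spectrum is modeled as a
# linear combination of known pure-component spectra. Hypothetical 4-channel
# spectra; solves the 2x2 normal equations (A^T A) c = A^T s by hand.

def cls_unmix(ref_a, ref_b, mixture):
    # Normal-equation terms.
    aa = sum(x * x for x in ref_a)
    bb = sum(x * x for x in ref_b)
    ab = sum(x * y for x, y in zip(ref_a, ref_b))
    sa = sum(x * y for x, y in zip(ref_a, mixture))
    sb = sum(x * y for x, y in zip(ref_b, mixture))
    det = aa * bb - ab * ab
    # Cramer's rule for the two component fractions.
    c_a = (sa * bb - ab * sb) / det
    c_b = (aa * sb - ab * sa) / det
    return c_a, c_b

# Pure-component reference spectra (arbitrary units) and a 30:70 mixture.
ref_a = [1.0, 0.2, 0.0, 0.5]
ref_b = [0.1, 0.9, 0.8, 0.2]
mixture = [0.3 * a + 0.7 * b for a, b in zip(ref_a, ref_b)]

print(cls_unmix(ref_a, ref_b, mixture))  # approximately (0.3, 0.7)
```

CLS requires every component's reference spectrum to be known; PLS, as used in the study, instead learns the calibration from mixture standards, which is why it copes better with fluorescence backgrounds and overlapping bands.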

  6. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant, allowing identification and quantification of the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis, and to investigate the effect of different extraction techniques on the C. officinalis extract composition from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting, as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

  8. Developing Sampling Frame for Case Study: Challenges and Conditions

    ERIC Educational Resources Information Center

    Ishak, Noriah Mohd; Abu Bakar, Abu Yazid

    2014-01-01

    Because of its reliance on statistical analysis, the issue of random sampling is pertinent to any quantitative study. Unlike in quantitative studies, the elimination of inferential statistical analysis allows qualitative researchers to be more creative in dealing with sampling issues. Since results from a qualitative study cannot be generalized to the bigger population,…

  9. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    PubMed

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A mixture of calcium oxalate monohydrate and dihydrate stones was most commonly encountered, in 35 (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium phosphate hexahydrate and xanthine stones. Fourier transform infrared spectroscopy allows an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities, which may prevent and/or delay stone recurrence.

  10. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. 
The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.

  11. SDAR 1.0: A New Quantitative Toolkit for Analyzing Stratigraphic Data

    NASA Astrophysics Data System (ADS)

    Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos

    2015-04-01

    Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences and about ancient sedimentary environments and paleobiological dynamics. Despite the tremendous advances in the ways geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although this information is stored in a digital format, it cannot readily be used for any quantitative analysis, so any attempt to examine the stratigraphic data in an analytical fashion necessarily requires further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses that help users quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale).
    Finally, given that the SDAR analysis module has been written in the open-source, high-level R graphics/statistics language [R Development Core Team, 2014], it is already loaded with many of the features required to accomplish basic and complex tasks of statistical analysis (the R language provides more than a hundred spatial libraries that allow users to explore geostatistics and spatial analysis). Consequently, SDAR allows a deeper exploration of the stratigraphic data collected in the field, and it will allow the geoscientific community in the near future to develop complex analyses of the distribution in space and time of rock sequences, such as lithofacies correlations, through multivariate comparison of empirical SCs with quantitative lithofacies models established from modern sedimentary environments.

  12. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.
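The correlation between integrated absorbance and concentration mentioned above amounts to fitting a calibration line. A minimal least-squares sketch with hypothetical standards (not the published caffeine values):

```python
# Least-squares calibration line and Pearson correlation coefficient for
# integrated absorbance vs. concentration (Beer-Lambert behaviour).
import math

def calibrate(conc, absb):
    n = len(conc)
    mx = sum(conc) / n
    my = sum(absb) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, absb))
    sxx = sum((x - mx) ** 2 for x in conc)
    syy = sum((y - my) ** 2 for y in absb)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r

# Hypothetical caffeine standards (mg/mL) and integrated absorbances.
conc = [0.1, 0.2, 0.4, 0.8]
absb = [0.052, 0.101, 0.198, 0.405]
slope, intercept, r = calibrate(conc, absb)
print(round(r, 4))  # close to 1 for a linear response
```

In the GC-FTIR workflow described, an r close to 1 at each data-treatment step is what justifies using the integrated absorbance for quantitation.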

  13. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By an additional fluorescent staining of all proteins immediately after their transfer to the blot membrane, it is possible to visualise simultaneously the antibody binding and the total protein profile, which allows an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest application possibilities that extend far beyond.
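The load-correction described above divides each lane's antibody-specific signal by its total-protein stain signal. A minimal sketch with hypothetical lane intensities:

```python
# Load-correction for quantitative Western blotting: divide each lane's
# antibody-specific fluorescence by its total-protein stain signal, so that
# uneven protein loading does not masquerade as expression differences.
def normalized_signal(antibody, total_protein):
    return [a / t for a, t in zip(antibody, total_protein)]

# Hypothetical lane intensities (arbitrary fluorescence units): the raw
# antibody signals differ, but only because loading differs.
antibody = [1200.0, 1500.0, 900.0]
total = [10000.0, 12500.0, 7500.0]
print(normalized_signal(antibody, total))  # identical corrected ratios
```

After correction the three lanes give the same ratio, showing the apparent differences were load artefacts; a real expression change would survive the normalisation.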

  14. Reinventing the ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.
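The quantitative step the authors add to the Ames lab reduces to a reversion frequency and its fold induction over the spontaneous control. A minimal sketch with hypothetical plate counts:

```python
# Reversion frequency: revertant colonies per viable cell plated, compared
# as a fold induction over the spontaneous (no-mutagen) control plate.
def reversion_frequency(revertants, viable_cells):
    return revertants / viable_cells

spontaneous = reversion_frequency(25, 1e8)   # control plate (hypothetical)
induced = reversion_frequency(400, 1e8)      # mutagen-treated plate
print(induced / spontaneous)  # fold induction over background
```

A fold induction well above 1 (commonly a doubling or more, by rule of thumb) flags the test compound as mutagenic; sequencing the revertants, as the article proposes, then reveals the molecular lesion behind each reversion.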

  15. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
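The relative quantitation described above comes down to ratios of extracted-ion peak areas for light- and heavy-tagged glycans. A minimal sketch with hypothetical areas (the glycan names and values are illustrative only):

```python
# Relative quantitation from isotope-label pairs: the ratio of extracted-ion
# chromatogram areas for a glycan tagged with the light vs. heavy reagent
# gives its relative abundance between the two samples being compared.
def relative_abundance(light_area, heavy_area):
    return light_area / heavy_area

# Hypothetical extracted-ion areas (light, heavy) for three glycans.
pairs = {"G0F": (4.2e6, 2.1e6), "G1F": (3.0e6, 3.0e6), "G2F": (1.5e6, 3.0e6)}
ratios = {g: relative_abundance(l, h) for g, (l, h) in pairs.items()}
print(ratios)  # {'G0F': 2.0, 'G1F': 1.0, 'G2F': 0.5}
```

Because light and heavy forms co-migrate and differ only in mass, each ratio is read from a single LC- or CE-MS run, sidestepping the run-to-run variability that plagues optical peak-area comparison.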

  16. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10–50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006

  17. Quantitative Analysis of Cancer Cell Migration in Gradients Of EGF, HGF, and SDF-alpha Using a Microfluidic Chemotaxis Device

    DTIC Science & Technology

    2005-01-01

    Quantitative Analysis of Cancer Cell Migration in Gradients of EGF, HGF, and SDF-alpha Using a Microfluidic Chemotaxis Device. The University of California... allowing for parallel analysis. Additionally, simple methods of localizing gels into microdevices are demonstrated. The device was characterized by... To overcome some of these drawbacks, several approaches have utilized free diffusion to produce gradients in static environments. However…

  18. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.

  19. Benefit-risk analysis: a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
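The RV-NNT decision rule sketched above (treat when the NNT is below the relative-value-adjusted NNH) can be illustrated numerically; the event rates and relative value below are hypothetical, not data from the article:

```python
# NNT = 1 / absolute risk reduction (benefit);
# NNH = 1 / absolute risk increase (harm).
# The relative value (RV) scales NNH by how patients weigh the harm
# against the benefit; treatment is favoured when NNT < RV-adjusted NNH.
def nnt(risk_control, risk_treated):
    return 1.0 / (risk_control - risk_treated)

def rv_nnh(harm_treated, harm_control, relative_value):
    return relative_value / (harm_treated - harm_control)

n = nnt(0.40, 0.25)          # benefit: event risk falls from 40% to 25%
h = rv_nnh(0.08, 0.03, 0.5)  # harm: risk rises from 3% to 8%, RV = 0.5
print(n, h, n < h)           # treat if NNT is below the adjusted NNH
```

With these invented numbers about 7 patients must be treated to prevent one event, while the value-weighted harm threshold sits near 10, so the rule favours treatment; shrinking the RV (patients dreading the adverse event more) can flip the conclusion.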

  20. Assessment and monitoring of forest ecosystem structure

    Treesearch

    Oscar A. Aguirre Calderón; Javier Jiménez Pérez; Horst Kramer

    2006-01-01

    Characterization of forest ecosystems structure must be based on quantitative indices that allow objective analysis of human influences or natural succession processes. The objective of this paper is the compilation of diverse quantitative variables to describe structural attributes from the arboreal stratum of the ecosystem, as well as different methods of forest...

  21. Quantitative characterization of nanoscale polycrystalline magnets with electron magnetic circular dichroism.

    PubMed

    Muto, Shunsuke; Rusz, Ján; Tatsumi, Kazuyoshi; Adam, Roman; Arai, Shigeo; Kocevski, Vancho; Oppeneer, Peter M; Bürgler, Daniel E; Schneider, Claus M

    2014-01-01

    Electron magnetic circular dichroism (EMCD) allows the quantitative, element-selective determination of spin and orbital magnetic moments, similar to its well-established X-ray counterpart, X-ray magnetic circular dichroism (XMCD). As an advantage over XMCD, EMCD measurements are made using transmission electron microscopes, which are routinely operated at sub-nanometre resolution, thereby potentially allowing nanometre magnetic characterization. However, because of the low intensity of the EMCD signal, it has not yet been possible to obtain quantitative information from EMCD signals at the nanoscale. Here we demonstrate a new approach to EMCD measurements that considerably enhances the outreach of the technique. The statistical analysis introduced here yields robust quantitative EMCD signals. Moreover, we demonstrate that quantitative magnetic information can be routinely obtained using electron beams of only a few nanometres in diameter without imposing any restriction regarding the crystalline order of the specimen.

  22. Dark Field Microscopy for Analytical Laboratory Courses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augspurger, Ashley E; Stender, Anthony S; Marchuk, Kyle

    2014-06-10

    An innovative and inexpensive optical microscopy experiment for a quantitative analysis or an instrumental analysis chemistry course is described. The students have hands-on experience with a dark field microscope and investigate the wavelength dependence of localized surface plasmon resonance in gold and silver nanoparticles. Students also observe and measure individual crystal growth during a replacement reaction between copper and silver nitrate. The experiment allows for quantitative, qualitative, and image data analyses for undergraduate students.

  23. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging features extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging features extraction tools allow the user to collect imaging features and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  24. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    NASA Astrophysics Data System (ADS)

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.

    2016-03-01

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living-cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, which is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.

  25. An economic analysis methodology for project evaluation and programming.

    DOT National Transportation Integrated Search

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  26. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs.

    PubMed

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Zel, Jana; Gruden, Kristina

    2008-10-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on a microarray. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low-copy-number DNA targets. In a first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. The second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent dye-labelled 3DNA dendrimers, allowing signal amplification, and hybridized without further purification on an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems have been applied to test maize samples containing several transgenic lines, and NAIMA has been shown to be sensitive down to two target copies and to provide quantitative data on transgenic contents in a range of 0.1-25%. The performance of NAIMA is comparable to singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification.

  7. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs

    PubMed Central

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Žel, Jana; Gruden, Kristina

    2008-01-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on microarrays. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low copy number DNA targets. In the first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. The second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent dye-labelled 3DNA dendrimers for signal amplification and hybridized without further purification to an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems were applied to test maize samples containing several transgenic lines; NAIMA was shown to be sensitive down to two target copies and to provide quantitative data on transgenic content in the range of 0.1–25%. The performance of NAIMA is comparable to that of singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification. PMID:18710880

  8. Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
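
    The second-law criterion at the core of NET analysis can be illustrated with a minimal sketch (not the authors' implementation; the reaction, its standard Gibbs energy and the concentrations below are hypothetical):

```python
import math

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 298.15     # temperature, K

def reaction_dg(dg0_prime, substrates, products):
    """Delta_r G' = Delta_r G'0 + RT ln Q, with the mass-action ratio Q
    built from measured metabolite concentrations (in mol/L)."""
    q = 1.0
    for c in products:
        q *= c
    for c in substrates:
        q /= c
    return dg0_prime + R * T * math.log(q)

# Hypothetical reaction A -> B with Delta_r G'0 = -5 kJ/mol,
# [A] = 1 mM and [B] = 2 mM (all values invented).
dg = reaction_dg(-5.0, substrates=[1e-3], products=[2e-3])

# Second-law check: a reaction carrying net forward flux must have dg < 0.
feasible = dg < 0
```

    A reaction whose measured displacement from equilibrium is large despite modest flux is the kind of candidate site of active regulation that NET analysis flags.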

  9. Chemical analysis and quantitation of the tapetum lucidum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gee, N.A.; Fisher, G.L.; Nash, C.P.

    1975-06-01

    A study was conducted to provide a basis for the evaluation of the biochemical nature of the (226)Ra alterations of the beagle tapetum. Results indicated that zinc and/or melanin determinations in the tapetum nigrum and tapetum lucidum may allow quantitation of tapetum lucidum tissue without the need for physical separation of the tapetal layers. (HLW)

  10. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    PubMed

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  11. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
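
    The batch-integration-to-CSV workflow that such software automates can be sketched roughly as follows (a toy stand-in, not the actual Java implementation; the synthetic spectra and region limits are invented for illustration):

```python
import csv
import io
import math

def integrate_region(ppm, intensity, lo, hi):
    """Trapezoidal integral of one signal region of a 1D spectrum."""
    pts = [(x, y) for x, y in zip(ppm, intensity) if lo <= x <= hi]
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += 0.5 * (y0 + y1) * (x1 - x0)
    return area

# Synthetic batch: three spectra sharing one ppm axis, each a Gaussian
# "signal" at 3.0 ppm whose amplitude differs per sample.
ppm = [i * 0.01 for i in range(1001)]          # 0-10 ppm
spectra = {
    "sample_%d" % i: [math.exp(-((x - 3.0) ** 2) / 0.01) * (i + 1) for x in ppm]
    for i in range(3)
}

# Integrate the same pre-saved region (2.5-3.5 ppm) in every spectrum and
# emit the results as CSV for downstream spreadsheet/Matlab analysis.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["spectrum", "area_2.5_3.5ppm"])
for name, inten in spectra.items():
    writer.writerow([name, integrate_region(ppm, inten, 2.5, 3.5)])
csv_text = buf.getvalue()
```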

  12. Quantitative 3D investigation of Neuronal network in mouse spinal cord model

    NASA Astrophysics Data System (ADS)

    Bukreeva, I.; Campi, G.; Fratini, M.; Spanò, R.; Bucci, D.; Battaglia, G.; Giove, F.; Bravin, A.; Uccelli, A.; Venturi, C.; Mastrogiacomo, M.; Cedola, A.

    2017-01-01

    The investigation of neuronal networks in mouse spinal cord models is fundamental to research on neurodegenerative diseases. In this framework, the quantitative analysis of the single elements in different districts is a crucial task. However, conventional 3D imaging techniques do not have enough spatial resolution and contrast to allow for a quantitative investigation of the neuronal network. Exploiting the high coherence and high flux of synchrotron sources, X-ray phase-contrast multiscale tomography allows for 3D investigation of the neuronal microanatomy without any aggressive sample preparation or sectioning. We investigated healthy-mouse neuronal architecture by imaging the 3D distribution of the neuronal network with a spatial resolution of 640 nm. The high quality of the obtained images enables a quantitative study of the neuronal structure on a subject-by-subject basis. We developed and applied a spatial statistical analysis on the motor neurons to obtain quantitative information on their 3D arrangement in the healthy-mouse spinal cord. We then compared the obtained results with a mouse model of multiple sclerosis. Our approach paves the way to the creation of a “database” of the main features of the neuronal network for comparative investigation of neurodegenerative diseases and therapies.
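
    One simple spatial statistic of the kind used to characterize a 3D arrangement of motor neurons is the nearest-neighbour distance distribution; the sketch below is a generic illustration (the coordinates are invented, and this is not the authors' analysis code):

```python
import math

# Hypothetical 3D motor-neuron centroids (micrometres).
points = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 12.0, 0.0), (40.0, 40.0, 40.0)]

def nearest_neighbour_distances(pts):
    """For each point, the Euclidean distance to its closest neighbour."""
    return [
        min(math.dist(p, q) for j, q in enumerate(pts) if i != j)
        for i, p in enumerate(pts)
    ]

nnd = nearest_neighbour_distances(points)
mean_nnd = sum(nnd) / len(nnd)  # one summary number for the arrangement
```

    Comparing such distributions between cohorts (e.g. healthy vs. disease-model animals) is one way to quantify differences in network organization.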

  13. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    PubMed

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
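
    The filter-then-aggregate step applied to quantified peptides can be illustrated with a minimal sketch (protein names, ratios and the score threshold are hypothetical; Condenser itself offers several filtering and statistics options beyond this):

```python
import statistics

# Hypothetical peptide-level quantitation: ratio (sample/control) plus a
# quality score per peptide, grouped by parent protein.
peptides = [
    {"protein": "P1", "ratio": 2.1, "score": 40},
    {"protein": "P1", "ratio": 1.9, "score": 35},
    {"protein": "P1", "ratio": 8.0, "score": 5},   # low-quality measurement
    {"protein": "P2", "ratio": 1.0, "score": 50},
    {"protein": "P2", "ratio": 1.1, "score": 45},
]

def protein_abundances(peptides, min_score=10):
    """Drop peptides below the user-chosen quality threshold, then report
    the median peptide ratio per protein as its relative abundance."""
    by_protein = {}
    for p in peptides:
        if p["score"] >= min_score:
            by_protein.setdefault(p["protein"], []).append(p["ratio"])
    return {prot: statistics.median(r) for prot, r in by_protein.items()}

abund = protein_abundances(peptides)
```

    The quality threshold is exactly the kind of user-chosen cutoff the abstract describes: it keeps a single bad peptide measurement from distorting the protein-level ratio.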

  14. Development of Nomarski microscopy for quantitative determination of surface topography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J. S.; Gordon, R. L.; Lessor, D. L.

    1979-01-01

    The use of Nomarski differential interference contrast (DIC) microscopy has been extended to provide nondestructive, quantitative analysis of a sample's surface topography. Theoretical modeling has determined the dependence of the image intensity on the microscope's optical components, the sample's optical properties, and the sample's surface orientation relative to the microscope. Results include expressions to allow the inversion of image intensity data to determine sample surface slopes. A commercial Nomarski system has been modified and characterized to allow the evaluation of the optical model. Data have been recorded with smooth, planar samples that verify the theoretical predictions.

  15. Nerves and Tissue Repair.

    DTIC Science & Technology

    1992-05-21

    complete dependence on nerves. Organ culture of sciatic nerves, combined with an assay for axolotl transferrin developed earlier, allows quantitative study...axonal release of various unknown proteins. Combining this approach with the ELISA for quantitative measurement of axolotl transferrin developed with...light microscope autoradiographic analysis following binding of radiolabelled Tf. Studies of Tf synthesis will employ cDNA probes for axolotl Tf mRNA

  16. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  17. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.

  18. Computer-Assisted Analysis of Spontaneous Speech: Quantification of Basic Parameters in Aphasic and Unimpaired Language

    ERIC Educational Resources Information Center

    Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter

    2012-01-01

    Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…

  19. Quantitative analysis of surface characteristics and morphology in Death Valley, California using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Kierein-Young, K. S.; Kruse, F. A.; Lefkoff, A. B.

    1992-01-01

    The Jet Propulsion Laboratory Airborne Synthetic Aperture Radar (JPL-AIRSAR) is used to collect full polarimetric measurements at P-, L-, and C-bands. These data are analyzed using the radar analysis and visualization environment (RAVEN). The AIRSAR data are calibrated using in-scene corner reflectors to allow for quantitative analysis of the radar backscatter. RAVEN is used to extract surface characteristics. Inversion models are used to calculate quantitative surface roughness values and fractal dimensions. These values are used to generate synthetic surface plots that represent the small-scale surface structure of areas in Death Valley. These procedures are applied to a playa, smooth salt-pan, and alluvial fan surfaces in Death Valley. Field measurements of surface roughness are used to verify the accuracy.

  20. Quantitative high-resolution genomic analysis of single cancer cells.

    PubMed

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  1. Method for the Simultaneous Quantitation of Apolipoprotein E Isoforms using Tandem Mass Spectrometry

    PubMed Central

    Wildsmith, Kristin R.; Han, Bomie; Bateman, Randall J.

    2009-01-01

    Using Apolipoprotein E (ApoE) as a model protein, we developed a protein isoform analysis method utilizing Stable Isotope Labeling Tandem Mass Spectrometry (SILT MS). ApoE isoforms are quantitated using the intensities of the b and y ions of the 13C-labeled tryptic isoform-specific peptides versus unlabeled tryptic isoform-specific peptides. The ApoE protein isoform analysis using SILT allows for the simultaneous detection and relative quantitation of different ApoE isoforms from the same sample. This method provides a less biased assessment of ApoE isoforms compared to antibody-dependent methods, and may lead to a better understanding of the biological differences between isoforms. PMID:19653990
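
    The underlying quantitation in SILT is a light/heavy intensity ratio over matched fragment ions; a minimal sketch (the ion intensities below are invented for illustration):

```python
# Hypothetical fragment-ion intensities for one isoform-specific tryptic
# peptide: unlabeled (sample) vs 13C-labeled (spiked standard).
unlabeled = {"b2": 1200.0, "b3": 900.0, "y4": 1500.0, "y5": 600.0}
labeled = {"b2": 600.0, "b3": 450.0, "y4": 750.0, "y5": 300.0}

def relative_quantity(unlabeled, labeled):
    """Mean light/heavy intensity ratio over the shared b and y ions."""
    ratios = [unlabeled[ion] / labeled[ion] for ion in unlabeled if ion in labeled]
    return sum(ratios) / len(ratios)

rq = relative_quantity(unlabeled, labeled)
```

    Because each isoform-specific peptide is ratioed against its own labeled counterpart, isoforms from the same sample can be compared without antibody-dependent calibration.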

  2. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for analysis of various mixtures. Recently, in addition of traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed for quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third party developers, and most of them are capable of analyzing and integration of quantitative spectra. However, they are mainly aimed for processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to capability of analyzing large number of spectra, it provides results in text and CSV formats, allowing further data-analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written with Java, and thus it should run in any platform capable of providing Java Runtime Environment version 1.6 or newer, however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Calibration method for spectroscopic systems

    DOEpatents

    Sandison, David R.

    1998-01-01

    Calibration spots of optically-characterized material placed in the field of view of a spectroscopic system allow calibration of the spectroscopic system. Response from the calibration spots is measured and used to calibrate for varying spectroscopic system operating parameters. The accurate calibration achieved allows quantitative spectroscopic analysis of responses taken at different times, different excitation conditions, and of different targets.
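
    The calibration idea can be sketched as a simple gain correction against the known spot response (the numbers and the single-gain model are illustrative assumptions, not the patented procedure):

```python
# Known response of the optically characterised calibration spot
# (value and units are illustrative).
KNOWN_SPOT_RESPONSE = 100.0

def calibrate(target_signal, spot_signal):
    """Rescale a target measurement by the gain inferred from the
    simultaneously recorded calibration-spot signal."""
    gain = KNOWN_SPOT_RESPONSE / spot_signal
    return target_signal * gain

# Two sessions with different excitation conditions become comparable:
day1 = calibrate(50.0, spot_signal=100.0)   # system at nominal gain
day2 = calibrate(25.0, spot_signal=50.0)    # same target, half the gain
```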

  4. Calibration method for spectroscopic systems

    DOEpatents

    Sandison, D.R.

    1998-11-17

    Calibration spots of optically-characterized material placed in the field of view of a spectroscopic system allow calibration of the spectroscopic system. Response from the calibration spots is measured and used to calibrate for varying spectroscopic system operating parameters. The accurate calibration achieved allows quantitative spectroscopic analysis of responses taken at different times, different excitation conditions, and of different targets. 3 figs.

  5. Higher Education Students' Attitudes towards Experiential Learning in International Business

    ERIC Educational Resources Information Center

    Chavan, Meena

    2011-01-01

    Using qualitative and quantitative analysis, this paper presents a teaching model based on experiential learning in a large "International Business" unit. Preliminary analysis of 92 student evaluations determined the effectiveness of experiential learning in allowing students to explore the association between theory and practice. The…

  6. Optical contrast and refractive index of natural van der Waals heterostructure nanosheets of franckeite

    PubMed Central

    Gant, Patricia; Ghasemi, Foad; Maeso, David; Munuera, Carmen; López-Elvira, Elena; Frisenda, Riccardo; De Lara, David Pérez; Rubio-Bollinger, Gabino; Garcia-Hernandez, Mar

    2017-01-01

    We study mechanically exfoliated nanosheets of franckeite by quantitative optical microscopy. The analysis of transmission-mode and epi-illumination-mode optical microscopy images provides a rapid method to estimate the thickness of the exfoliated flakes at first glance. A quantitative analysis of the optical contrast spectra by means of micro-reflectance allows one to determine the refractive index of franckeite over a broad range of the visible spectrum through a fit of the acquired spectra to a model based on the Fresnel law. PMID:29181292

  7. TANGO: a generic tool for high-throughput 3D image analysis for studying nuclear organization.

    PubMed

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2013-07-15

    The cell nucleus is a highly organized cellular organelle that contains the genetic material. The study of nuclear architecture has become an important field of cellular biology. Extracting quantitative data from 3D fluorescence imaging helps in understanding the functions of different nuclear compartments. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here, we describe Tools for Analysis of Nuclear Genome Organization (TANGO), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a coherent framework allowing biologists to perform the complete analysis process of 3D fluorescence images by combining two environments: ImageJ (http://imagej.nih.gov/ij/) for image processing and quantitative analysis, and R (http://cran.r-project.org) for statistical processing of measurement results. It includes an intuitive user interface providing the means to precisely build a segmentation procedure and set up analyses without requiring programming skills. TANGO is a versatile tool able to process large sets of images, allowing quantitative study of nuclear organization. TANGO is composed of two programs: (i) an ImageJ plug-in and (ii) a package (rtango) for R. Both are free and open source, available (http://biophysique.mnhn.fr/tango) for Linux, Microsoft Windows and Macintosh OSX. Distribution is under the GPL v.2 licence. thomas.boudier@snv.jussieu.fr Supplementary data are available at Bioinformatics online.

  8. Translational PK/PD of Anti-Infective Therapeutics

    PubMed Central

    Rathi, Chetan; Lee, Richard E.; Meibohm, Bernd

    2016-01-01

    Translational PK/PD modeling has emerged as a critical technique for quantitative analysis of the relationship between dose, exposure and response of antibiotics. By combining model components for pharmacokinetics, bacterial growth kinetics and concentration-dependent drug effects, these models are able to quantitatively capture and simulate the complex interplay between antibiotic, bacterium and host organism. Fine-tuning of these basic model structures makes it possible to further account for complicating factors such as resistance development, combination therapy, or host responses. With this tool set at hand, mechanism-based PK/PD modeling and simulation allows the development of optimal dosing regimens for novel and established antibiotics, for maximum efficacy and minimal resistance development. PMID:27978987

  9. Electron Probe Microanalysis | Materials Science | NREL

    Science.gov Websites

    surveys of the area of interest before performing a more accurate quantitative analysis with WDS. WDS - Four spectrometers with ten diffracting crystals. The use of a single-channel analyzer allows much

  10. Advances in Surface Plasmon Resonance Imaging allowing for quantitative measurement of laterally heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2012-02-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases -- uniform absorption/desorption of small biomolecules and films, in which a continuous ``slab'' model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allow for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  11. A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.

    PubMed

    Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain

    2015-10-01

    Clustering is a set of statistical learning techniques aimed at finding structure in heterogeneous data by grouping homogeneous observations into clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem, with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Moreover, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing quantitative and qualitative data to be handled simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
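
    The projection strategy described above, regressing a binary-coded qualitative variable on the subspace spanned by the quantitative principal components, can be sketched for the two-variable case (all data values are invented; this is not the authors' implementation):

```python
import math

# Hypothetical mixed data: two quantitative variables per subject plus one
# binary-coded qualitative response.
quant = [(1.0, 2.0), (2.0, 3.1), (3.0, 3.9), (4.0, 5.2), (5.0, 6.0)]
qual = [0, 0, 0, 1, 1]

def first_pc_scores(rows):
    """Scores on the first principal component of centred 2-D data,
    via the leading eigenvector of the 2x2 covariance matrix."""
    n = len(rows)
    mx = sum(r[0] for r in rows) / n
    my = sum(r[1] for r in rows) / n
    sxx = sum((r[0] - mx) ** 2 for r in rows) / (n - 1)
    syy = sum((r[1] - my) ** 2 for r in rows) / (n - 1)
    sxy = sum((r[0] - mx) * (r[1] - my) for r in rows) / (n - 1)
    # Leading eigenvalue of [[sxx, sxy], [sxy, syy]]
    lam = 0.5 * (sxx + syy + math.hypot(sxx - syy, 2 * sxy))
    vx, vy = sxy, lam - sxx          # unnormalised leading eigenvector
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    return [(r[0] - mx) * vx + (r[1] - my) * vy for r in rows]

def project_on_scores(scores, response):
    """Least-squares regression of the binary-coded qualitative variable
    on the PC scores: the 'PCA-regressed' projection."""
    n = len(scores)
    ms = sum(scores) / n
    mr = sum(response) / n
    num = sum((s - ms) * (r - mr) for s, r in zip(scores, response))
    den = sum((s - ms) ** 2 for s in scores)
    slope = num / den
    return slope, mr - slope * ms

scores = first_pc_scores(quant)
slope, intercept = project_on_scores(scores, qual)
```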

  12. Transcriptome discovery in non-model wild fish species for the development of quantitative transcript abundance assays

    USGS Publications Warehouse

    Hahn, Cassidy M.; Iwanowicz, Luke R.; Cornman, Robert S.; Mazik, Patricia M.; Blazer, Vicki S.

    2016-01-01

    Environmental studies increasingly identify the presence of both contaminants of emerging concern (CECs) and legacy contaminants in aquatic environments; however, the biological effects of these compounds on resident fishes remain largely unknown. High throughput methodologies were employed to establish partial transcriptomes for three wild-caught, non-model fish species; smallmouth bass (Micropterus dolomieu), white sucker (Catostomus commersonii) and brown bullhead (Ameiurus nebulosus). Sequences from these transcriptome databases were utilized in the development of a custom nCounter CodeSet that allowed for direct multiplexed measurement of 50 transcript abundance endpoints in liver tissue. Sequence information was also utilized in the development of quantitative real-time PCR (qPCR) primers. Cross-species hybridization allowed the smallmouth bass nCounter CodeSet to be used for quantitative transcript abundance analysis of an additional non-model species, largemouth bass (Micropterus salmoides). We validated the nCounter analysis data system with qPCR for a subset of genes and confirmed concordant results. Changes in transcript abundance biomarkers between sexes and seasons were evaluated to provide baseline data on transcript modulation for each species of interest.

  13. The Singing Rod (in the Modern Age)

    ERIC Educational Resources Information Center

    Lasby, B.; O'Meara, J. M.; Williams, M.

    2014-01-01

    This is a classic classroom demonstration of resonance, nodes, anti-nodes, and standing waves that has been described elsewhere. The modern age twist that we are advocating is the coupling of this classic demo with free (or relatively inexpensive) sound analysis software, thereby allowing for quantitative analysis of resonance while experimenting…

  14. Advanced IR System For Supersonic Boundary Layer Transition Flight Experiment

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a preferred method for investigating transition in flight: a) it is global and non-intrusive; b) it can also be used to visualize and characterize other fluid-mechanic phenomena such as shock impingement and separation. The F-15-based system was updated with a new camera and digital video recorder to support high-Reynolds-number transition tests. Digital recording improves image quality and analysis capability, allows accurate quantitative (temperature) measurements, and permits greater enhancement through image processing, enabling analysis of smaller-scale phenomena.

  15. Detection and Characterization of Boundary-Layer Transition in Flight at Supersonic Conditions Using Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a powerful tool for investigating fluid mechanics on flight vehicles (it can be used to visualize and characterize transition, shock impingement, separation, etc.). The updated onboard F-15-based system was used to visualize a supersonic boundary-layer transition test article (Tollmien-Schlichting and cross-flow dominant flow fields). Digital recording improves image quality and analysis capability: it allows accurate quantitative (temperature) measurements, and greater enhancement through image processing permits analysis of smaller-scale phenomena.

  16. Obscure phenomena in statistical analysis of quantitative structure-activity relationships. Part 1: Multicollinearity of physicochemical descriptors.

    PubMed

    Mager, P P; Rothe, H

    1990-10-01

    Multicollinearity of physicochemical descriptors has serious consequences in quantitative structure-activity relationship (QSAR) analysis, such as incorrect estimators and test statistics of the regression coefficients of the ordinary least-squares (OLS) model usually applied to QSARs. Besides the diagnosis of simple collinearity, principal component regression analysis (PCRA) also allows the diagnosis of various types of multicollinearity. Only if the absolute values of the PCRA estimators are order statistics that decrease monotonically can the effects of multicollinearity be circumvented. Otherwise, obscure phenomena may be observed, such as good data recognition but low predictive power of a QSAR model.
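
    A common way to diagnose this kind of collinearity, in the spirit of principal-component methods, is via the eigenvalues of the descriptor correlation matrix; for two descriptors they are 1 ± r, so the sketch below flags collinearity from the correlation coefficient alone (the descriptor values and cutoff are illustrative, not from the paper):

```python
import math

# Hypothetical descriptor columns: one descriptor and a nearly
# collinear second descriptor.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.1, 2.0, 2.9, 4.2, 5.0]   # almost a copy of x1

def correlation(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

r = correlation(x1, x2)

# Eigenvalues of the 2x2 correlation matrix [[1, r], [r, 1]] are 1 +/- r;
# their ratio blows up as |r| -> 1, signalling the multicollinearity that
# distorts OLS estimates.
condition_index = (1 + abs(r)) / (1 - abs(r))
collinear = condition_index > 30  # rule-of-thumb cutoff (assumption)
```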

  17. Quantitative High-Resolution Genomic Analysis of Single Cancer Cells

    PubMed Central

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A.; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control, followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics. PMID:22140428
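Copy-number estimates from qPCR are commonly derived with the 2^(−ΔΔCt) relative-quantification method. The record does not give the authors' exact calculation, so the sketch below, with made-up Ct values, is only a generic illustration of that standard approach.

```python
# Relative copy number by the 2^-ddCt method (Livak & Schmittgen).
# All Ct values below are invented illustration data, not from the study.
def relative_copy_number(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Target copy number relative to a calibrator sample,
    normalized to a reference locus."""
    dct_sample = ct_target - ct_ref        # normalize within the sample
    dct_cal = ct_target_cal - ct_ref_cal   # normalize the calibrator
    ddct = dct_sample - dct_cal
    return 2.0 ** (-ddct)

# A locus amplified ~4-fold vs. a normal diploid calibrator (illustrative):
ratio = relative_copy_number(ct_target=24.0, ct_ref=26.0,
                             ct_target_cal=26.0, ct_ref_cal=26.0)
print(ratio)  # 4.0
```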

  18. COMPARISON OF GENETIC METHODS TO OPTICAL METHODS IN THE IDENTIFICATION AND ASSESSMENT OF MOLD IN THE BUILT ENVIRONMENT -- COMPARISON OF TAQMAN AND MICROSCOPIC ANALYSIS OF CLADOSPORIUM SPORES RETRIEVED FROM ZEFON AIR-O-CELL TRACES

    EPA Science Inventory

Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers, have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.

    In this pilot study, quantitative...

  19. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  20. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
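As a rough illustration of the intermediate step in the work flow above, the sketch below emits a small machine-readable XML annotation file for a quantitative imaging finding. The element and attribute names here are invented stand-ins; the real AIM schema is considerably richer.

```python
import xml.etree.ElementTree as ET

# Build a minimal XML annotation. Element and attribute names are
# illustrative placeholders, not the actual AIM schema.
annotation = ET.Element("ImageAnnotation", name="Liver lesion measurement")
obs = ET.SubElement(annotation, "ImagingObservation", label="lesion")
meas = ET.SubElement(obs, "Measurement", type="longAxisDiameter")
ET.SubElement(meas, "Value", unit="mm").text = "23.4"

# Serialize: this file could be handed to reporting software or stored
# in a local database for later data mining.
xml_bytes = ET.tostring(annotation, encoding="utf-8")
print(xml_bytes.decode())
```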

  1. Coding Early Naturalists' Accounts into Long-Term Fish Community Changes in the Adriatic Sea (1800–2000)

    PubMed Central

    Fortibuoni, Tomaso; Libralato, Simone; Raicevich, Saša; Giovanardi, Otello; Solidoro, Cosimo

    2010-01-01

The understanding of fish communities' changes over the past centuries has important implications for conservation policy and marine resource management. However, reconstructing these changes is difficult because information on marine communities before the second half of the 20th century is, in most cases, anecdotal and merely qualitative. Therefore, historical qualitative records and modern quantitative data are not directly comparable, and their integration for long-term analyses is not straightforward. We developed a methodology that allows the coding of qualitative information provided by early naturalists into semi-quantitative information through an intercalibration with landing proportions. This approach allowed us to reconstruct and quantitatively analyze a 200-year-long time series of fish community structure indicators in the Northern Adriatic Sea (Mediterranean Sea). Our analysis provides evidence of long-term changes in fish community structure, including the decline of Chondrichthyes and of large-sized and late-maturing species. This work highlights the importance of broadening the time-frame through which we look at marine ecosystem changes and provides a methodology to exploit historical qualitative sources in a quantitative framework. For this purpose, naturalists' eyewitness accounts proved useful for extending the analysis of fish communities back in time, well before the onset of field-based monitoring programs. PMID:21103349

  2. Highway Safety Manual applied in Missouri - freeway/software.

    DOT National Transportation Integrated Search

    2016-06-01

AASHTO's Highway Safety Manual (HSM) facilitates the quantitative safety analysis of highway facilities. In a 2014 supplement, freeway facilities were added to the original HSM manual, which allows the modeling of highway interchanges. This repo...

  3. High-Content Screening for Quantitative Cell Biology.

    PubMed

    Mattiazzi Usaj, Mojca; Styles, Erin B; Verster, Adrian J; Friesen, Helena; Boone, Charles; Andrews, Brenda J

    2016-08-01

    High-content screening (HCS), which combines automated fluorescence microscopy with quantitative image analysis, allows the acquisition of unbiased multiparametric data at the single cell level. This approach has been used to address diverse biological questions and identify a plethora of quantitative phenotypes of varying complexity in numerous different model systems. Here, we describe some recent applications of HCS, ranging from the identification of genes required for specific biological processes to the characterization of genetic interactions. We review the steps involved in the design of useful biological assays and automated image analysis, and describe major challenges associated with each. Additionally, we highlight emerging technologies and future challenges, and discuss how the field of HCS might be enhanced in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Missing Value Monitoring Enhances the Robustness in Proteomics Quantitation.

    PubMed

    Matafora, Vittoria; Corno, Andrea; Ciliberto, Andrea; Bachi, Angela

    2017-04-07

In global proteomic analysis, it is estimated that proteins span from millions to fewer than 100 copies per cell. The challenge of protein quantitation by classic shotgun proteomic techniques lies in the missing values in peptides belonging to low-abundance proteins, which lower intra-run reproducibility and affect downstream statistical analysis. Here, we present a new analytical workflow, MvM (missing value monitoring), able to recover the quantitation of missing values generated by shotgun analysis. In particular, we used confident data-dependent acquisition (DDA) quantitation only for proteins measured in all the runs, while we filled the missing values with data-independent acquisition analysis using the library previously generated in DDA. We analyzed cell cycle regulated proteins, as they are low-abundance proteins with highly dynamic expression levels. Indeed, we found that cell cycle related proteins are the major components of the missing-value-rich proteome. Using the MvM workflow, we doubled the number of robustly quantified cell cycle related proteins, and we reduced the number of missing values, achieving robust quantitation for proteins over ∼50 molecules per cell. MvM allows lower quantification variance among replicates for low-abundance proteins with respect to DDA analysis, which demonstrates the potential of this novel workflow to measure low-abundance, dynamically regulated proteins.
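The core merge rule described above (trust DDA quantitation only for proteins measured in every run, then fill the remaining gaps from the DIA analysis) can be sketched as follows. The intensity tables are toy illustration data, not the MvM implementation.

```python
# Toy intensity tables: protein -> per-run values (None = missing value).
# Values are invented for illustration.
dda = {"ProtA": [1e6, 1.1e6, 0.9e6],
       "ProtB": [5e3, None, 4e3],     # low-abundance, missing in run 2
       "ProtC": [None, None, 2e3]}
dia = {"ProtB": [5.2e3, 4.8e3, 4.1e3],
       "ProtC": [2.1e3, 1.9e3, 2.0e3]}

def merge_mvm(dda, dia):
    """Use DDA quantitation only for proteins measured in every run;
    otherwise fall back to the DIA values for that protein."""
    merged = {}
    for prot, values in dda.items():
        if all(v is not None for v in values):
            merged[prot] = values          # complete DDA profile: keep it
        elif prot in dia:
            merged[prot] = dia[prot]       # fill from DIA library re-query
    return merged

quant = merge_mvm(dda, dia)
print(quant["ProtB"])  # DIA values replace the gappy DDA profile
```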

  5. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups, as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for, e.g., GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  6. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data*

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)1. The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups, as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for, e.g., GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  7. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be better identified and managed throughout the project life cycle. One of the project risk management processes is quantitative risk analysis, which usually includes assessing the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and seeks to indicate the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and expanded for the tasks of regression analysis of project data. The suggested algorithms used to assess the parameters in statistical models yield reliable estimates. A review of the theoretical problems in the development of robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
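Huber's M-estimator, the robust regression approach mentioned above, is typically computed by iteratively reweighted least squares: observations with large residuals are progressively downweighted, so a few "contaminated" points cannot dominate the fit. Below is a minimal NumPy sketch on synthetic data with asymmetric outliers; it illustrates the general technique, not the authors' specific algorithm.

```python
import numpy as np

def huber_fit(X, y, k=1.345, iters=50):
    """Huber M-estimate of regression coefficients via
    iteratively reweighted least squares (IRLS)."""
    X = np.column_stack([np.ones(len(y)), X])        # add intercept column
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS start
    for _ in range(iters):
        r = y - X @ beta
        # Robust scale from the median absolute deviation of residuals
        s = np.median(np.abs(r - np.median(r))) / 0.6745 or 1.0
        # Huber weights: 1 inside the cutoff, k*s/|r| outside
        w = np.clip(k * s / np.maximum(np.abs(r), 1e-12), None, 1.0)
        W = np.sqrt(w)
        beta = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)[0]
    return beta

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.5 * x + 0.05 * rng.normal(size=40)
y[::10] += 8.0  # asymmetric "contamination": every 10th point shifted up
beta = huber_fit(x, y)
print(beta)  # close to the true [2.0, 0.5] despite the outliers
```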

  8. A versatile pipeline for the multi-scale digital reconstruction and quantitative analysis of 3D tissue architecture

    PubMed Central

    Morales-Navarrete, Hernán; Segovia-Miranda, Fabián; Klukowski, Piotr; Meyer, Kirstin; Nonaka, Hidenori; Marsico, Giovanni; Chernykh, Mikhail; Kalaidzidis, Alexander; Zerial, Marino; Kalaidzidis, Yannis

    2015-01-01

    A prerequisite for the systems biology analysis of tissues is an accurate digital three-dimensional reconstruction of tissue structure based on images of markers covering multiple scales. Here, we designed a flexible pipeline for the multi-scale reconstruction and quantitative morphological analysis of tissue architecture from microscopy images. Our pipeline includes newly developed algorithms that address specific challenges of thick dense tissue reconstruction. Our implementation allows for a flexible workflow, scalable to high-throughput analysis and applicable to various mammalian tissues. We applied it to the analysis of liver tissue and extracted quantitative parameters of sinusoids, bile canaliculi and cell shapes, recognizing different liver cell types with high accuracy. Using our platform, we uncovered an unexpected zonation pattern of hepatocytes with different size, nuclei and DNA content, thus revealing new features of liver tissue organization. The pipeline also proved effective to analyse lung and kidney tissue, demonstrating its generality and robustness. DOI: http://dx.doi.org/10.7554/eLife.11214.001 PMID:26673893

  9. Photocleavable DNA barcode-antibody conjugates allow sensitive and multiplexed protein analysis in single cells.

    PubMed

    Agasti, Sarit S; Liong, Monty; Peterson, Vanessa M; Lee, Hakho; Weissleder, Ralph

    2012-11-14

    DNA barcoding is an attractive technology, as it allows sensitive and multiplexed target analysis. However, DNA barcoding of cellular proteins remains challenging, primarily because barcode amplification and readout techniques are often incompatible with the cellular microenvironment. Here we describe the development and validation of a photocleavable DNA barcode-antibody conjugate method for rapid, quantitative, and multiplexed detection of proteins in single live cells. Following target binding, this method allows DNA barcodes to be photoreleased in solution, enabling easy isolation, amplification, and readout. As a proof of principle, we demonstrate sensitive and multiplexed detection of protein biomarkers in a variety of cancer cells.

  10. A rapid fluorescence assay for danofloxacin in beef muscle: effect of muscle type on limit of quantitation.

    PubMed

    Schneider, Marilyn J

    2008-08-01

A simple, rapid fluorescence screening assay was applied to the analysis of beef muscle for danofloxacin at the U.S. tolerance level of 200 ng/g. Muscle samples were homogenized in acetic acid-acetonitrile, the resultant mixture was centrifuged, and the fluorescence of the supernatants was then measured. The significant difference between the fluorescence of control muscle sample extracts and extracts of samples fortified at 200 ng/g allowed for successful discrimination between the samples. Setting a threshold level at the average 200 ng/g fortified sample extract fluorescence minus 3σ allowed for identification of potentially violative samples. Successful analysis of a group of blind fortified samples over a range of concentrations was accomplished in this manner, without any false-negative results. The limits of quantitation for danofloxacin, as well as enrofloxacin, using this assay were determined in three types of beef muscle (hanging tenderloin, neck, and eye round steak), as well as in serum. Significant differences in limits of quantitation were found among the three muscle types examined, with hanging tenderloin muscle providing the lowest value. This work not only shows the potential for use of the fluorescence screening assay as an alternative to currently used microbial or antibody-based assays for the analysis of danofloxacin in beef muscle, but also suggests that assays using beef muscle may vary in performance depending on the specific muscle selected for analysis.
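The screening rule, which flags any extract whose fluorescence reaches the mean fortified-sample fluorescence minus 3σ, can be sketched as follows. All fluorescence readings below are invented for illustration, not the study's data.

```python
import statistics

# Illustrative fluorescence readings (arbitrary units), not the study's data.
fortified_200 = [812.0, 798.0, 825.0, 804.0, 791.0, 818.0]  # 200 ng/g spikes
controls = [132.0, 140.0, 128.0, 151.0, 137.0]              # blank extracts

# Threshold = mean fortified fluorescence minus 3 standard deviations.
# Placing it 3 sigma below the fortified mean guards against false negatives.
mean = statistics.mean(fortified_200)
sigma = statistics.stdev(fortified_200)
threshold = mean - 3.0 * sigma

def potentially_violative(reading):
    """Flag extracts whose fluorescence is at or above the threshold."""
    return reading >= threshold

flags = [potentially_violative(r) for r in controls + [810.0]]
print(threshold, flags)  # controls pass; the 810.0 extract is flagged
```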

  11. Urban Multisensory Laboratory, AN Approach to Model Urban Space Human Perception

    NASA Astrophysics Data System (ADS)

    González, T.; Sol, D.; Saenz, J.; Clavijo, D.; García, H.

    2017-09-01

An urban sensory lab (USL, or LUS by its Spanish acronym) is a new, avant-garde approach to studying and analyzing a city. This approach allows the development of new methodologies to identify the emotional response of public space users. The laboratory combines qualitative analysis proposed by urbanists with quantitative measures managed by data analysis applications. USL is a new approach for going beyond the borders of urban knowledge. The design thinking strategy allows us to implement methods to understand the results provided by our technique. In this first approach, the interpretation is made by hand. However, our goal is to combine design thinking and machine learning in order to analyze the qualitative and quantitative data automatically. The results are now being used by students in the Urbanism and Architecture courses to gain a better understanding of public spaces in Puebla, Mexico and their interaction with people.

  12. Quantitative measurement of a candidate serum biomarker peptide derived from α2-HS-glycoprotein, and a preliminary trial of multidimensional peptide analysis in females with pregnancy-induced hypertension.

    PubMed

    Hamamura, Kensuke; Yanagida, Mitsuaki; Ishikawa, Hitoshi; Banzai, Michio; Yoshitake, Hiroshi; Nonaka, Daisuke; Tanaka, Kenji; Sakuraba, Mayumi; Miyakuni, Yasuka; Takamori, Kenji; Nojima, Michio; Yoshida, Koyo; Fujiwara, Hiroshi; Takeda, Satoru; Araki, Yoshihiko

    2018-03-01

Purpose: We previously attempted to develop quantitative enzyme-linked immunosorbent assay (ELISA) systems for the PDA039/044/071 peptides, potential serum disease biomarkers (DBMs) of pregnancy-induced hypertension (PIH), primarily identified by a peptidomic approach (BLOTCHIP®-mass spectrometry (MS)). However, our methodology did not extend to PDA071 (cysteinyl α2-HS-glycoprotein(341-367)), owing to the difficulty of producing a specific antibody against the peptide. The aim of the present study was to establish an alternative PDA071 quantitation system using liquid chromatography-multiple reaction monitoring (LC-MRM)/MS, to explore the potential utility of PDA071 as a DBM for PIH. Methods: We tested heat/acid denaturation methods in efforts to purify serum PDA071 and developed an LC-MRM/MS method allowing for its specific quantitation. We measured serum PDA071 concentrations, and these results were validated, including by three-dimensional (3D) plotting against PDA039 (kininogen-1(439-456)) and PDA044 (kininogen-1(438-456)) concentrations, followed by discriminant analysis. Results: PDA071 was successfully extracted from serum using a heat denaturation method. Optimum conditions for quantitation via LC-MRM/MS were developed; the assayed serum PDA071 correlated well with the BLOTCHIP® assay values. Although PDA071 alone did not significantly differ between patients and controls, 3D plotting of PDA039/044/071 peptide concentrations and construction of a Jackknife classification matrix were satisfactory in terms of PIH diagnostic precision. Conclusions: Combination analysis using both PDA071 and PDA039/044 concentrations allowed PIH diagnostic accuracy to be attained, and our method will be valuable in future pathophysiological studies of hypertensive disorders of pregnancy.

  13. SedCT: MATLAB™ tools for standardized and quantitative processing of sediment core computed tomography (CT) data collected using a medical CT scanner

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.

    2017-08-01

Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down-core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down-core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
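The random-sampling step that keeps processing tractable on laptop hardware can be illustrated with NumPy: sample a fixed number of cross-sectional pixels and average them per depth slice to build the down-core HU profile. The synthetic volume below stands in for a stitched DICOM stack; this is a sketch of the idea, not SedCT's MATLAB code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic CT volume standing in for a stitched DICOM stack:
# axis 0 = down-core depth, axes 1-2 = cross-section, values in HU.
depth, ny, nx = 200, 64, 64
volume = (np.linspace(1000.0, 1400.0, depth)[:, None, None]  # densifying core
          + rng.normal(0.0, 25.0, size=(depth, ny, nx)))     # scanner noise

def downcore_profile(volume, n_samples=500):
    """Mean HU per depth slice, estimated from a random subset of
    cross-sectional pixels instead of the full slice."""
    d, ny, nx = volume.shape
    flat = volume.reshape(d, ny * nx)
    idx = rng.choice(ny * nx, size=n_samples, replace=False)
    return flat[:, idx].mean(axis=1)

profile = downcore_profile(volume)
print(profile[:3])  # HU increases down-core, tracking the density trend
```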

  14. Simulations of Carnival Rides and Rube Goldberg Machines for the Visualization of Concepts of Statics and Dynamics

    ERIC Educational Resources Information Center

    Howard, William; Williams, Richard; Yao, Jason

    2010-01-01

    Solid modeling is widely used as a teaching tool in summer activities with high school students. The addition of motion analysis allows concepts from statics and dynamics to be introduced to students in both qualitative and quantitative ways. Two sets of solid modeling projects--carnival rides and Rube Goldberg machines--are shown to allow the…

  15. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. Given pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness to the research inquiry, a mixed-methods approach can integrate quantitative and qualitative methods and offers an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.

  16. Quantitative structure-activity relationship of organosulphur compounds as soybean 15-lipoxygenase inhibitors using CoMFA and CoMSIA.

    PubMed

    Caballero, Julio; Fernández, Michael; Coll, Deysma

    2010-12-01

Three-dimensional quantitative structure-activity relationship studies were carried out on a series of 28 organosulphur compounds as 15-lipoxygenase inhibitors using comparative molecular field analysis and comparative molecular similarity indices analysis. Quantitative information on structure-activity relationships is provided for further rational development and direction of selective synthesis. All models were built over a training set of 22 compounds. The best comparative molecular field analysis model included only the steric field and had a good Q² = 0.789. Comparative molecular similarity indices analysis surpassed the comparative molecular field analysis results: the best comparative molecular similarity indices analysis model also included only the steric field and had a Q² = 0.894. In addition, this model adequately predicted the compounds in the test set. Furthermore, plots of the steric comparative molecular similarity indices analysis field allowed conclusions to be drawn for the choice of suitable inhibitors. In this sense, our model should prove useful in future 15-lipoxygenase inhibitor design studies. © 2010 John Wiley & Sons A/S.

  17. FLIPPER, a combinatorial probe for correlated live imaging and electron microscopy, allows identification and quantitative analysis of various cells and organelles.

    PubMed

    Kuipers, Jeroen; van Ham, Tjakko J; Kalicharan, Ruby D; Veenstra-Algra, Anneke; Sjollema, Klaas A; Dijk, Freark; Schnell, Ulrike; Giepmans, Ben N G

    2015-04-01

    Ultrastructural examination of cells and tissues by electron microscopy (EM) yields detailed information on subcellular structures. However, EM is typically restricted to small fields of view at high magnification; this makes quantifying events in multiple large-area sample sections extremely difficult. Even when combining light microscopy (LM) with EM (correlated LM and EM: CLEM) to find areas of interest, the labeling of molecules is still a challenge. We present a new genetically encoded probe for CLEM, named "FLIPPER", which facilitates quantitative analysis of ultrastructural features in cells. FLIPPER consists of a fluorescent protein (cyan, green, orange, or red) for LM visualization, fused to a peroxidase allowing visualization of targets at the EM level. The use of FLIPPER is straightforward and because the module is completely genetically encoded, cells can be optimally prepared for EM examination. We use FLIPPER to quantify cellular morphology at the EM level in cells expressing a normal and disease-causing point-mutant cell-surface protein called EpCAM (epithelial cell adhesion molecule). The mutant protein is retained in the endoplasmic reticulum (ER) and could therefore alter ER function and morphology. To reveal possible ER alterations, cells were co-transfected with color-coded full-length or mutant EpCAM and a FLIPPER targeted to the ER. CLEM examination of the mixed cell population allowed color-based cell identification, followed by an unbiased quantitative analysis of the ER ultrastructure by EM. Thus, FLIPPER combines bright fluorescent proteins optimized for live imaging with high sensitivity for EM labeling, thereby representing a promising tool for CLEM.

  18. Quantitation of Mycotoxins Using Direct Analysis in Real Time Mass Spectrometry (DART-MS).

    PubMed

    Busman, Mark

    2018-05-01

Ambient ionization represents a new generation of MS ion sources and is used for the rapid ionization of small molecules under ambient conditions. The combination of ambient ionization and MS allows the analysis of multiple food samples with simple or no sample treatment, or in conjunction with prevailing sample preparation methods. Two ambient ionization methods, desorption electrospray ionization (DESI) and direct analysis in real time (DART), have been adapted for food safety applications. Both ionization techniques provide unique advantages and capabilities. DART has been used for a variety of qualitative and quantitative applications. In particular, mycotoxin contamination of food and feed materials has been addressed by DART-MS. Applications to mycotoxin analysis by ambient ionization MS, and particularly DART-MS, are summarized.

  19. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics

    PubMed Central

    Lavallée-Adam, Mathieu

    2017-01-01

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334

  20. Noninvasive characterization of the fission yeast cell cycle by monitoring dry mass with digital holographic microscopy.

    PubMed

    Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre

    2009-01-01

Digital holographic microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
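Deriving dry mass from a quantitative phase image uses the standard QPI relation σ = λφ/(2πα), where σ is the dry-mass surface density, φ the measured phase shift, and α ≈ 0.2 mL/g the specific refraction increment of most cellular material. Below is a minimal sketch with an illustrative wavelength, pixel size, and phase values, not the study's data or instrument parameters.

```python
import math

# Dry-mass surface density from phase: sigma = lambda * phi / (2 * pi * alpha),
# with alpha the specific refraction increment (~0.2 mL/g for cellular matter).
WAVELENGTH_M = 532e-9      # illustrative laser wavelength (532 nm)
ALPHA_M3_PER_KG = 0.2e-3   # 0.2 mL/g expressed in m^3/kg

def dry_mass_density(phase_rad):
    """Dry mass per unit area (kg/m^2) for a given optical phase shift."""
    return WAVELENGTH_M * phase_rad / (2.0 * math.pi * ALPHA_M3_PER_KG)

# Total dry mass of a cell: integrate the density over the cell's pixel area.
pixel_area_m2 = (0.2e-6) ** 2            # 0.2 um pixels (illustrative)
phase_pixels = [1.8, 2.0, 2.2, 1.9]      # made-up phase values inside the mask
total_kg = sum(dry_mass_density(p) * pixel_area_m2 for p in phase_pixels)
print(total_kg * 1e15, "pg")             # 1 kg = 1e15 pg
```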

  1. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  2. Manual on performance of traffic signal systems: assessment of operations and maintenance : [summary].

    DOT National Transportation Integrated Search

    2017-05-01

    In this project, Florida Atlantic University researchers developed a methodology and software tools that allow objective, quantitative analysis of the performance of signal systems. The researchers surveyed the state of practice for traffic signal ...

  3. MetaFluxNet: the management of metabolic reaction information and quantitative metabolic flux analysis.

    PubMed

    Lee, Dong-Yup; Yun, Hongsoek; Park, Sunwon; Lee, Sang Yup

    2003-11-01

    MetaFluxNet is a program package for managing information on metabolic reaction networks and for quantitatively analyzing metabolic fluxes in an interactive and customized way. It allows users to interpret and examine metabolic behavior in response to genetic and/or environmental modifications. As a result, quantitative in silico simulations of metabolic pathways can be carried out to understand the metabolic status and to design metabolic engineering strategies. The main features of the program include a well-developed model construction environment, a user-friendly interface for metabolic flux analysis (MFA), comparative MFA of strains having different genotypes under various environmental conditions, and automated pathway layout creation. The program is available at http://mbel.kaist.ac.kr/, and a manual for MetaFluxNet is available as a PDF file.
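
    The flux-balance principle behind MFA (S·v = 0 at steady state for internal metabolites) can be shown on a toy network; the network and measured fluxes below are invented for illustration and are unrelated to MetaFluxNet itself.

```python
import numpy as np

# Toy metabolic flux analysis: at steady state, S @ v = 0 for internal
# metabolites. With enough measured exchange fluxes, the remaining fluxes
# are determined. Illustrative network:
#   v1: A -> B,  v2: B -> C,  v3: B -> D   (B is the only internal metabolite)
S = np.array([[1.0, -1.0, -1.0]])    # row: metabolite B; columns: v1, v2, v3

v1_measured, v3_measured = 10.0, 3.0
# Balance on B: v1 - v2 - v3 = 0. Move measured terms to the right-hand side
# and solve the (here trivially determined) system for the unknown v2.
A = S[:, [1]]                                           # coefficient of unknown v2
b = -(S[:, 0] * v1_measured + S[:, 2] * v3_measured)    # known contributions
v2, *_ = np.linalg.lstsq(A, b, rcond=None)
print(float(v2[0]))   # 7.0
```

    For larger networks the same least-squares step resolves all unknown fluxes at once, provided the measured set makes the system observable.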

  4. Fluorescence-labeled methylation-sensitive amplified fragment length polymorphism (FL-MS-AFLP) analysis for quantitative determination of DNA methylation and demethylation status.

    PubMed

    Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko

    2008-04-01

    The PCR-based DNA fingerprinting method called the methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of the differences in methylation level of blood DNA of gastric cancer patients and evaluation of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.

  5. Dominant Epistasis Between Two Quantitative Trait Loci Governing Sporulation Efficiency in Yeast Saccharomyces cerevisiae

    PubMed Central

    Bergman, Juraj; Mitrikeski, Petar T.

    2015-01-01

    Sporulation efficiency in the yeast Saccharomyces cerevisiae is a well-established model for studying quantitative traits. A variety of genes and nucleotides causing different sporulation efficiencies in laboratory, as well as in wild strains, has already been extensively characterised (mainly by reciprocal hemizygosity analysis and nucleotide exchange methods). We applied a different strategy in order to analyze the variation in sporulation efficiency of laboratory yeast strains. Coupling classical quantitative genetic analysis with simulations of phenotypic distributions (a method we call phenotype modelling) enabled us to obtain a detailed picture of the quantitative trait loci (QTL) relationships underlying the phenotypic variation of this trait. Using this approach, we were able to uncover a dominant epistatic inheritance of loci governing the phenotype. Moreover, a molecular analysis of known causative quantitative trait genes and nucleotides allowed for the detection of novel alleles, potentially responsible for the observed phenotypic variation. Based on the molecular data, we hypothesise that the observed dominant epistatic relationship could be caused by the interaction of multiple quantitative trait nucleotides distributed across a 60-kb QTL region located on chromosome XIV and the RME1 locus on chromosome VII. Furthermore, we propose a model of molecular pathways which possibly underlie the phenotypic variation of this trait. PMID:27904371

  6. The simultaneous quantitation of ten amino acids in soil extracts by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Hoyano, Y.; Reynolds, W. E.; Summons, R. E.; Duffield, A. M.

    1972-01-01

    A specific and sensitive method for the identification and simultaneous quantitation by mass fragmentography of ten of the amino acids present in soil was developed. The technique uses a computer-driven quadrupole mass spectrometer, with a commercial preparation of deuterated amino acids serving as internal standards for quantitation. The results obtained are comparable with those from an amino acid analyzer. In the quadrupole mass spectrometer-computer system, up to 25 pre-selected ions may be monitored sequentially. This allows a maximum of 12 different amino acids (one specific ion in each of the undeuterated and deuterated amino acid spectra) to be quantitated. The method is relatively rapid (analysis time of approximately one hour) and is capable of the quantitation of nanogram quantities of amino acids.
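
    Quantitation against a deuterated internal standard reduces to a simple ratio once one specific ion is monitored for each member of the labeled/unlabeled pair; a sketch with invented numbers, assuming equal response factors for the pair:

```python
# Single-point isotope-dilution quantitation, as used with deuterated internal
# standards in mass fragmentography. All values are illustrative.
area_analyte = 45200.0       # integrated ion current, native amino acid ion
area_labeled = 90400.0       # integrated ion current, deuterated internal standard ion
amount_labeled_ng = 50.0     # known amount of deuterated standard spiked into the sample

# With equal response factors, the area ratio equals the amount ratio:
amount_analyte_ng = (area_analyte / area_labeled) * amount_labeled_ng
print(amount_analyte_ng)     # 25.0
```

    In practice a multi-point calibration of the area ratio against known amount ratios corrects for any response-factor difference between the labeled and unlabeled forms.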

  7. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique for targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is a popular open-source software package for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis, different tools exist, such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analysis of the variation and reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important for quickly assessing the quality of SWATH-MS data. Hence, SWATH2stats is a new open-source tool that brings together several practical functionalities for analyzing, processing, and converting SWATH-MS data, and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.
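
    As one concrete illustration of the kind of reproducibility filtering described (this is a generic stand-in, not the SWATH2stats API), analytes whose coefficient of variation across replicates exceeds a threshold can be dropped before downstream analysis; the data and threshold below are invented.

```python
import numpy as np

# Drop analytes with poor across-replicate reproducibility (CV filter).
intensities = {
    "PEPTIDEA": [1.00e5, 1.05e5, 0.98e5],  # reproducible across 3 replicates
    "PEPTIDEB": [2.0e4, 6.5e4, 1.1e4],     # irreproducible across replicates
}
cv = {p: float(np.std(v) / np.mean(v)) for p, v in intensities.items()}
kept = [p for p, c in cv.items() if c < 0.2]   # 20% CV cutoff (assumed)
print(kept)    # ['PEPTIDEA']
```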

  8. A grid for a precise analysis of daily activities.

    PubMed

    Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E

    2010-01-01

    Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.

  9. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
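
    The agreement statistics used here are standard; below is a small sketch with invented rankings, computing Kendall's tau of each rater against the quantitative ordering (via scipy) and Kendall's W across raters from its textbook formula.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical rankings of 8 images (1 = least transillumination) by three raters.
quantitative = np.array([1, 2, 3, 4, 5, 6, 7, 8])      # image-analysis ordering
raters = np.array([
    [1, 2, 3, 4, 5, 6, 7, 8],
    [1, 3, 2, 4, 5, 6, 8, 7],
    [2, 1, 3, 4, 6, 5, 7, 8],
])

# Pairwise agreement of each rater with the quantitative ordering.
taus = [kendalltau(quantitative, r)[0] for r in raters]

def kendalls_w(ranks):
    """Kendall's coefficient of concordance for an (m raters x n items) array."""
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

print(round(float(np.mean(taus)), 3), round(kendalls_w(raters), 3))  # 0.905 0.952
```

    The W formula above assumes no ties within a rater's ranking; tied ranks need a correction term.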

  10. Transcriptome discovery in non-model wild fish species for the development of quantitative transcript abundance assays.

    PubMed

    Hahn, Cassidy M; Iwanowicz, Luke R; Cornman, Robert S; Mazik, Patricia M; Blazer, Vicki S

    2016-12-01

    Environmental studies increasingly identify the presence of both contaminants of emerging concern (CECs) and legacy contaminants in aquatic environments; however, the biological effects of these compounds on resident fishes remain largely unknown. High throughput methodologies were employed to establish partial transcriptomes for three wild-caught, non-model fish species; smallmouth bass (Micropterus dolomieu), white sucker (Catostomus commersonii) and brown bullhead (Ameiurus nebulosus). Sequences from these transcriptome databases were utilized in the development of a custom nCounter CodeSet that allowed for direct multiplexed measurement of 50 transcript abundance endpoints in liver tissue. Sequence information was also utilized in the development of quantitative real-time PCR (qPCR) primers. Cross-species hybridization allowed the smallmouth bass nCounter CodeSet to be used for quantitative transcript abundance analysis of an additional non-model species, largemouth bass (Micropterus salmoides). We validated the nCounter analysis data system with qPCR for a subset of genes and confirmed concordant results. Changes in transcript abundance biomarkers between sexes and seasons were evaluated to provide baseline data on transcript modulation for each species of interest. Published by Elsevier Inc.

  11. Surface plasmon resonance microscopy: achieving a quantitative optical response

    PubMed Central

    Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.

    2016-01-01

    Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542
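
    The Fresnel model referred to above can be sketched for the simplest prism/gold/water (Kretschmann) geometry; the optical constants below are typical textbook values at 633 nm, not the calibration values from the paper.

```python
import numpy as np

# Three-layer Fresnel model for p-polarized SPR reflectivity vs incidence angle.
lam = 633e-9                                  # wavelength, metres (assumed)
eps = [1.515**2, -11.7 + 1.2j, 1.333**2]      # prism, gold, water permittivities
d_gold = 50e-9                                # gold film thickness (assumed)
k0 = 2 * np.pi / lam

def reflectance(theta_deg):
    kx2 = eps[0] * np.sin(np.radians(theta_deg)) ** 2
    kz = [np.sqrt((e - kx2) + 0j) * k0 for e in eps]      # normal wavevectors
    # p-polarized Fresnel coefficient at the i/j interface:
    r = lambda i, j: (eps[j]*kz[i] - eps[i]*kz[j]) / (eps[j]*kz[i] + eps[i]*kz[j])
    phase = np.exp(2j * kz[1] * d_gold)                   # round trip in the film
    r_total = (r(0, 1) + r(1, 2) * phase) / (1 + r(0, 1) * r(1, 2) * phase)
    return abs(r_total) ** 2

angles = np.arange(60.0, 80.0, 0.05)
dip = angles[np.argmin([reflectance(a) for a in angles])]
print(dip)  # resonance dip expected in the low 70s of degrees for these constants
```

    Fitting measured reflectivity curves with this model is what allows index-of-refraction and thickness values to be extracted, once instrument-specific attenuation and aberration are corrected as the paper describes.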

  12. Development of multitissue microfluidic dynamic array for assessing changes in gene expression associated with channel catfish appetite, growth, metabolism, and intestinal health

    USDA-ARS?s Scientific Manuscript database

    Large-scale, gene expression methods allow for high throughput analysis of physiological pathways at a fraction of the cost of individual gene expression analysis. Systems, such as the Fluidigm quantitative PCR array described here, can provide powerful assessments of the effects of diet, environme...

  13. Psychometric Inferences from a Meta-Analysis of Reliability and Internal Consistency Coefficients

    ERIC Educational Resources Information Center

    Botella, Juan; Suero, Manuel; Gambara, Hilda

    2010-01-01

    A meta-analysis of the reliability of the scores from a specific test, also called reliability generalization, allows the quantitative synthesis of its properties from a set of studies. It is usually assumed that part of the variation in the reliability coefficients is due to some unknown and implicit mechanism that restricts and biases the…

  14. The new numerology of immunity mediated by virus-specific CD8(+) T cells.

    PubMed

    Doherty, P C

    1998-08-01

    Our understanding of virus-specific CD8(+) T cell responses is currently being revolutionized by peptide-based assay systems that allow flow cytometric analysis of effector and memory cytotoxic T lymphocyte populations. These techniques are, for the first time, putting the analysis of T-cell-mediated immunity on a quantitative basis.

  15. A Data-Processing System for Quantitative Analysis in Speech Production. CLCS Occasional Paper No. 17.

    ERIC Educational Resources Information Center

    Chasaide, Ailbhe Ni; Davis, Eugene

    The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…

  16. Analysis of Radio Frequency Surveillance Systems for Air Traffic Control : Volume 1. Text.

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  17. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis

    PubMed Central

    Padula, Matthew P.; Berry, Iain J.; O'Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.

    2017-01-01

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single ‘spots’ in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the ‘Top-Down’. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712

  18. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis.

    PubMed

    Padula, Matthew P; Berry, Iain J; O Rourke, Matthew B; Raymond, Benjamin B A; Santos, Jerran; Djordjevic, Steven P

    2017-04-07

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer.

  19. Technical Advance: Live-imaging analysis of human dendritic cell migrating behavior under the influence of immune-stimulating reagents in an organotypic model of lung

    PubMed Central

    Nguyen Hoang, Anh Thu; Chen, Puran; Björnfot, Sofia; Högstrand, Kari; Lock, John G.; Grandien, Alf; Coles, Mark; Svensson, Mattias

    2014-01-01

    This manuscript describes technical advances allowing manipulation and quantitative analyses of human DC migratory behavior in lung epithelial tissue. DCs are hematopoietic cells essential for the maintenance of tissue homeostasis and the induction of tissue-specific immune responses. Important functions include cytokine production and migration in response to infection for the induction of proper immune responses. To design appropriate strategies to exploit human DC functional properties in lung tissue for the purpose of clinical evaluation, e.g., candidate vaccination and immunotherapy strategies, we have developed a live-imaging assay based on our previously described organotypic model of the human lung. This assay allows provocations and subsequent quantitative investigations of DC functional properties under conditions mimicking morphological and functional features of the in vivo parental tissue. We present protocols to set up and prepare tissue models for 4D (x, y, z, time) fluorescence-imaging analysis that allow spatial and temporal studies of human DCs in live epithelial tissue, followed by flow cytometry analysis of DCs retrieved from digested tissue models. This model system can be useful for elucidating incompletely defined pathways controlling DC functional responses to infection and inflammation in lung epithelial tissue, as well as the efficacy of locally administered candidate interventions. PMID:24899587

  20. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear to be an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performances in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (UPS1 and yeast background proteins, respectively, found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performances of the data processing workflow.
We provide here such a controlled standard dataset and used it to evaluate the performances of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, and also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
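
    Scoring a workflow against such a spiked ground truth is straightforward set arithmetic; the accessions below are hypothetical stand-ins, not the actual UPS1 protein list.

```python
# Proteins called differential are true positives if they belong to the spiked
# set, false positives otherwise. All identifiers here are invented examples.
ups1_spiked = {"P02768ups", "P00441ups", "P01133ups", "P02144ups"}
reported_differential = {"P02768ups", "P00441ups", "P02144ups", "YER148W"}

tp = len(reported_differential & ups1_spiked)          # spiked proteins recovered
fp = len(reported_differential - ups1_spiked)          # background proteins called
sensitivity = tp / len(ups1_spiked)
fdr = fp / len(reported_differential)
print(tp, fp, round(sensitivity, 2), round(fdr, 2))    # 3 1 0.75 0.25
```

    Sweeping the workflow's significance threshold and recomputing these two numbers traces out the sensitivity/FDR trade-off used to compare tools.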

  1. Assessing locomotor skills development in childhood using wearable inertial sensor devices: the running paradigm.

    PubMed

    Masci, Ilaria; Vannozzi, Giuseppe; Bergamini, Elena; Pesce, Caterina; Getchell, Nancy; Cappozzo, Aurelio

    2013-04-01

    Objective quantitative evaluation of motor skill development is of increasing importance to carefully drive physical exercise programs in childhood. Running is a fundamental motor skill humans adopt to accomplish locomotion, which is linked to physical activity levels, although its assessment is traditionally carried out using qualitative evaluation tests. The present study aimed at investigating the feasibility of using inertial sensors to quantify developmental differences in the running pattern of young children. Qualitative and quantitative assessment tools were adopted to identify a skill-sensitive set of biomechanical parameters for running and to further our understanding of the factors that determine progression to skilled running performance. Running performances of 54 children between the ages of 2 and 12 years were submitted to both qualitative and quantitative analysis, the former using sequences of developmental level, the latter estimating temporal and kinematic parameters from inertial sensor measurements. Discriminant analysis with running developmental level as the dependent variable identified a set of temporal and kinematic parameters, among those obtained with the sensor, that best classified children into the qualitative developmental levels (accuracy higher than 67%). Multivariate analysis of variance with the quantitative parameters as dependent variables identified whether, and which, specific parameters or parameter subsets were differentially sensitive to specific transitions between contiguous developmental levels. The findings showed that different sets of temporal and kinematic parameters are able to tap all steps of the transitional process in running skill described through qualitative observation and can be prospectively used for applied diagnostic and sport training purposes. Copyright © 2012 Elsevier B.V. All rights reserved.
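
    The discriminant-analysis step can be sketched with synthetic data; the two features and their level-dependent distributions below are invented for illustration, not the parameter set identified in the study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic temporal/kinematic running parameters (step time in seconds and
# peak vertical acceleration in g) used to predict a qualitative developmental
# level via linear discriminant analysis. Distributions are invented.
rng = np.random.default_rng(0)
levels = np.repeat([1, 2, 3], 18)                     # three developmental levels
step_time = rng.normal(0.42 - 0.03 * levels, 0.01)    # shorter steps when skilled
peak_acc = rng.normal(1.5 + 0.4 * levels, 0.1)        # stronger push-off when skilled
X = np.column_stack([step_time, peak_acc])

lda = LinearDiscriminantAnalysis().fit(X, levels)
accuracy = lda.score(X, levels)                       # resubstitution accuracy
print(round(accuracy, 2))
```

    With real data, cross-validated rather than resubstitution accuracy is the honest figure of merit, since the latter is optimistically biased.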

  2. Methodology for determining the investment attractiveness of construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana

    2018-03-01

    The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.

  3. Subsurface imaging and cell refractometry using quantitative phase/ shear-force feedback microscopy

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2009-10-01

    Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations, including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We present a novel quantitative phase imaging system free of 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We will demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis and will present results for a fabricated structure and a malaria-infected red blood cell.
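
    The refractometry described relies on combining the measured phase with an independent thickness measurement, which removes the usual phase/thickness coupling; a minimal sketch with invented values:

```python
import numpy as np

# Cell refractometry from quantitative phase plus independently measured
# thickness (e.g. from shear-force topography). From
#   phi = 2*pi*(n_cell - n_medium)*h / lambda
# the cell index follows directly. All values are illustrative.
lam_um = 0.633        # wavelength, micrometres (assumed)
n_medium = 1.334      # refractive index of the surrounding buffer
phase_rad = 2.8       # measured phase shift through the cell, radians
thickness_um = 2.0    # local cell thickness from topography

n_cell = n_medium + phase_rad * lam_um / (2 * np.pi * thickness_um)
print(round(n_cell, 3))  # 1.475
```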

  4. One step screening of retroviral producer clones by real time quantitative PCR.

    PubMed

    Towers, G J; Stockholm, D; Labrousse-Najburg, V; Carlier, F; Danos, O; Pagès, J C

    1999-01-01

    Recombinant retroviruses are obtained from either stably or transiently transfected retrovirus producer cells. In the case of stably producing lines, a large number of clones must be screened in order to select the one with the highest titre. The multi-step selection of high titre producing clones is time consuming and expensive. We have taken advantage of retroviral endogenous reverse transcription to develop a quantitative PCR assay on crude supernatant from producing clones. We used Taqman PCR technology, which, by using fluorescence measurement at each cycle of amplification, allows PCR product quantification. Fluorescence results from specific degradation of a probe oligonucleotide by the Taq polymerase 5'-3' exonuclease activity. Primers and probe sequences were chosen to anneal to the viral strong stop species, which is the first DNA molecule synthesised during reverse transcription. The protocol consists of a single real time PCR, using as template filtered viral supernatant without any other pre-treatment. We show that the primers and probe described allow quantitation of serially diluted plasmid to as few as 15 plasmid molecules. We then test 200 GFP-expressing retrovirus-producing clones either by FACS analysis of infected cells or by using the quantitative PCR. We confirm that the Taqman protocol allows the detection of virus in supernatant and selection of high titre clones. Furthermore, we can determine infectious titre by quantitative PCR on genomic DNA from infected cells, using an additional set of primers and probe to albumin to normalise for the genomic copy number. We demonstrate that real time quantitative PCR can be used as a powerful and reliable single step, high throughput screen for high titre retroviral producer clones.
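
    Quantitation from serially diluted standards rests on the linear relation between Ct and log10 of the input copy number; the sketch below uses idealized, invented Ct values rather than data from the paper.

```python
import numpy as np

# Standard-curve quantitation for real-time PCR: Ct = slope*log10(copies) + b.
# A perfectly efficient reaction gives slope ~ -3.32 (one Ct per doubling).
copies = np.array([1.5e1, 1.5e2, 1.5e3, 1.5e4, 1.5e5])   # serially diluted plasmid
ct = np.array([33.2, 29.9, 26.6, 23.3, 20.0])            # idealized measured Ct values

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1                     # 1.0 means 100% efficient

ct_unknown = 25.0                                         # Ct of an unknown supernatant
copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
print(round(slope, 2), round(efficiency, 2), round(copies_unknown))  # -3.3 1.01 4581
```

    The same interpolation, applied to the albumin reference assay, provides the genomic copy-number normalisation mentioned above.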

  5. Self-Regulated Learning in Virtual Communities

    ERIC Educational Resources Information Center

    Delfino, Manuela; Dettori, Giuliana; Persico, Donatella

    2008-01-01

    This paper investigates self-regulated learning (SRL) in a virtual learning community of adults interacting through asynchronous textual communication. The investigation method chosen is interaction analysis, a qualitative/quantitative approach allowing a systematic study of the contents of the messages exchanged within online communities. The…

  6. An analysis of radio frequency surveillance systems for air traffic control volume II: appendixes

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  7. Blackboard architecture for medical image interpretation

    NASA Astrophysics Data System (ADS)

    Davis, Darryl N.; Taylor, Christopher J.

    1991-06-01

    There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.

  8. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue

    NASA Technical Reports Server (NTRS)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.

    1989-01-01

    The high-performance liquid chromatographic (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two of the polyamines, putrescine and diaminopropane, are not well resolved using this method. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  9. Resilience Among Naval Recruits: A Quantitative and Qualitative Analysis of Interventions at Recruit Training Command and Implications on Fleet Readiness

    DTIC Science & Technology

    2016-03-01

    associated with higher levels of resilience (Connor & Davidson, 2003). The CD-RISC offers a validated quantitative scale to researchers, allowing for the...a total of 35 recruits and 12 RDCs were interviewed. Four focus groups and 30 personal interviews were conducted. The interviews included recruits...two to four individuals. The interviews and focus groups were semi-structured. A set of questions was identified prior to the interviews as a

  10. Simple preparation of plant epidermal tissue for laser microdissection and downstream quantitative proteome and carbohydrate analysis

    PubMed Central

    Falter, Christian; Ellinger, Dorothea; von Hülsen, Behrend; Heim, René; Voigt, Christian A.

    2015-01-01

    The outwardly directed cell wall and associated plasma membrane of epidermal cells represent the first layers of plant defense against intruding pathogens. Cell wall modifications and the formation of defense structures at sites of attempted pathogen penetration are decisive for plant defense. A precise isolation of these stress-induced structures would allow a specific analysis of regulatory mechanisms and cell wall adaptation. However, methods for large-scale epidermal tissue preparation from the model plant Arabidopsis thaliana, which would allow proteome and cell wall analysis of complete, laser-microdissected epidermal defense structures, have not been provided. We developed the adhesive tape – liquid cover glass technique (ACT) for simple leaf epidermis preparation from A. thaliana, which is also applicable to grass leaves. This method is compatible with subsequent staining techniques to visualize stress-related cell wall structures, which were precisely isolated from the epidermal tissue layer by laser microdissection (LM) coupled to laser pressure catapulting. We successfully demonstrated that these specific epidermal tissue samples could be used for quantitative downstream proteome and cell wall analysis. The development of the ACT for simple leaf epidermis preparation and its compatibility with LM and downstream quantitative analysis open new possibilities for the precise examination of stress- and pathogen-related cell wall structures in epidermal cells. Because the developed tissue processing is applicable to well-established A. thaliana model pathosystems, including the interaction with powdery mildews, principal regulatory mechanisms in plant–microbe interaction can be studied, with potential outreach into crop breeding. PMID:25870605

  11. Simple preparation of plant epidermal tissue for laser microdissection and downstream quantitative proteome and carbohydrate analysis.

    PubMed

    Falter, Christian; Ellinger, Dorothea; von Hülsen, Behrend; Heim, René; Voigt, Christian A

    2015-01-01

    The outwardly directed cell wall and associated plasma membrane of epidermal cells represent the first layers of plant defense against intruding pathogens. Cell wall modifications and the formation of defense structures at sites of attempted pathogen penetration are decisive for plant defense. A precise isolation of these stress-induced structures would allow a specific analysis of regulatory mechanisms and cell wall adaptation. However, methods for large-scale epidermal tissue preparation from the model plant Arabidopsis thaliana, which would allow proteome and cell wall analysis of complete, laser-microdissected epidermal defense structures, have not been provided. We developed the adhesive tape - liquid cover glass technique (ACT) for simple leaf epidermis preparation from A. thaliana, which is also applicable to grass leaves. This method is compatible with subsequent staining techniques to visualize stress-related cell wall structures, which were precisely isolated from the epidermal tissue layer by laser microdissection (LM) coupled to laser pressure catapulting. We successfully demonstrated that these specific epidermal tissue samples could be used for quantitative downstream proteome and cell wall analysis. The development of the ACT for simple leaf epidermis preparation and its compatibility with LM and downstream quantitative analysis open new possibilities for the precise examination of stress- and pathogen-related cell wall structures in epidermal cells. Because the developed tissue processing is applicable to well-established A. thaliana model pathosystems, including the interaction with powdery mildews, principal regulatory mechanisms in plant-microbe interaction can be studied, with potential outreach into crop breeding.

  12. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently routine practice. Combining ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment compared with visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18FDG and 18F-sodium fluoride tracers in the carotids, aorta and coronary arteries has been demonstrated.

  13. Apricot DNA as an indicator for persipan: detection and quantitation in marzipan using ligation-dependent probe amplification.

    PubMed

    Luber, Florian; Demmel, Anja; Hosken, Anne; Busch, Ulrich; Engel, Karl-Heinz

    2012-06-13

    The confectionery ingredient marzipan is exclusively prepared from almond kernels and sugar. The potential use of apricot kernels, as so-called persipan, is an important issue for the quality assessment of marzipan. Therefore, a ligation-dependent probe amplification (LPA) assay was developed that enables a specific and sensitive detection of apricot DNA as an indicator for the presence of persipan. The limit of detection was determined to be 0.1% persipan in marzipan. The suitability of the method was confirmed by the analysis of 20 commercially available food samples. The integration of a Prunus-specific probe in the LPA assay as a reference allowed for the relative quantitation of persipan in marzipan. The limit of quantitation was determined to be 0.5% persipan in marzipan. The analysis of two self-prepared mixtures of marzipan and persipan demonstrated the applicability of the quantitation method at concentration levels of practical relevance for quality control.
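    The reference-probe normalisation can be illustrated with a short sketch (hypothetical throughout: only the 0.1% / 0.5% limits come from the abstract; the calibration factor and peak values are invented):

```python
# The apricot-specific LPA signal is divided by the Prunus reference signal,
# and the ratio is converted to percent persipan via a calibration factor
# derived from defined marzipan/persipan mixtures (factor invented here).
LOD, LOQ = 0.1, 0.5   # percent persipan in marzipan, as reported
CAL_FACTOR = 20.0     # percent persipan per unit peak ratio (assumed)

def percent_persipan(apricot_peak, prunus_peak):
    pct = CAL_FACTOR * apricot_peak / prunus_peak
    if pct < LOD:
        return "not detected"
    if pct < LOQ:
        return f"detected, below limit of quantitation ({LOQ}%)"
    return f"{pct:.1f}%"

print(percent_persipan(apricot_peak=0.12, prunus_peak=1.0))
```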

  14. Investigating the Magnetic Interaction with Geomag and Tracker Video Analysis: Static Equilibrium and Anharmonic Dynamics

    ERIC Educational Resources Information Center

    Onorato, P.; Mascheretti, P.; DeAmbrosis, A.

    2012-01-01

    In this paper, we describe how simple experiments using easily found, low-cost materials allow students to explore the magnetic interaction quantitatively with the help of an Open Source Physics tool, the Tracker Video Analysis software. The static equilibrium of a "column" of permanent magnets is carefully investigated by…

  15. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, and minor and trace elements can be mapped very accurately thanks to larger detector areas and higher-count-rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps, including scatter-diagram (elemental-relationship) creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we can easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical back-scatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to occur, and are especially helpful for examining possible interface artefacts. Post-processing techniques to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.

  16. Infrared Multiphoton Dissociation for Quantitative Shotgun Proteomics

    PubMed Central

    Ledvina, Aaron R.; Lee, M. Violet; McAlister, Graeme C.; Westphall, Michael S.; Coon, Joshua J.

    2012-01-01

    We modified a dual-cell linear ion trap mass spectrometer to perform infrared multiphoton dissociation (IRMPD) in the low-pressure trap of a dual-cell quadrupole linear ion trap (dual-cell QLT) and performed large-scale IRMPD analyses of complex peptide mixtures. Upon optimization of activation parameters (precursor q-value, irradiation time, and photon flux), IRMPD subtly but significantly outperforms resonant-excitation CAD for peptides identified at a 1% false-discovery rate (FDR) from a yeast tryptic digest (95% confidence, p = 0.019). We further demonstrate that IRMPD is compatible with the analysis of isobaric-tagged peptides. Using a fixed QLT RF amplitude allows for the consistent retention of reporter ions, but necessitates the use of variable IRMPD irradiation times, dependent upon precursor mass-to-charge (m/z). We show that IRMPD activation parameters can be tuned to allow for effective peptide identification and quantitation simultaneously. We thus conclude that IRMPD performed in a dual-cell ion trap is an effective option for the large-scale analysis of both unmodified and isobaric-tagged peptides. PMID:22480380

  17. Sensorized toys for measuring manipulation capabilities of infants at home.

    PubMed

    Passetti, Giovanni; Cecchi, Francesca; Baldoli, Ilaria; Sgandurra, Giuseppina; Beani, Elena; Cioni, Giovanni; Laschi, Cecilia; Dario, Paolo

    2015-01-01

    Preterm infants, i.e. babies born after a gestation period shorter than 37 weeks, spend less time exploring objects than term-born infants. The quantitative measurement of grasping actions and forces in infants can give insights into their typical or atypical motor development. The aim of this work was to test a new tool, a kit of sensorized toys, to longitudinally measure, monitor and promote preterm infants' manipulation capabilities with a purposive training in an ecological environment. This study presents a preliminary analysis of grasping activity. Three preterm infants performed 4 weeks of daily training at home. Sensorized toys with embedded pressure sensors were used as part of the training to allow quantitative analysis of grasping (pressure and acceleration applied to the toys while playing). Each toy was placed on the midline, while the infant was in a supine position. Preliminary data show differences in the grasping parameters in relation to the infants' age and the daily training performed. An ongoing clinical trial will allow a full validation of this new tool for promoting object exploration in preterm infants.

  18. 3D Filament Network Segmentation with Multiple Active Contours

    NASA Astrophysics Data System (ADS)

    Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei

    2014-03-01

    Fluorescence microscopy is frequently used to study two- and three-dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and microtubules. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we developed a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D TIRF microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning-disk confocal microscopy.

  19. Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method

    PubMed Central

    Nuzzo, Genoveffa; Gallo, Carmela; d’Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-01-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third generation biofuels. The procedure consists of extraction of the biological matrix by modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high throughput screening of algal collection and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which drastically differ for the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790

  20. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
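    The event-sequence arithmetic behind such a risk model can be sketched as a toy event tree (not QRAS itself; the initiating-event frequency and branch probabilities are invented for illustration):

```python
from itertools import product

# An initiating event is followed by two pivotal events (detection, backup).
# Each end state's probability is the product of the branch probabilities
# along its path, and end states can then be ranked by risk contribution.
init_freq = 1e-3                              # initiating-event frequency per mission (assumed)
p_fail = {"detection": 0.05, "backup": 0.10}  # branch failure probabilities (assumed)

end_states = {}
for det, bak in product(["ok", "fail"], repeat=2):
    p = init_freq
    p *= p_fail["detection"] if det == "fail" else 1 - p_fail["detection"]
    p *= p_fail["backup"] if bak == "fail" else 1 - p_fail["backup"]
    end_states[(det, bak)] = p

# In this toy model only the path where both mitigations fail is loss-of-mission.
loss_prob = end_states[("fail", "fail")]
print(f"P(loss of mission) = {loss_prob:.1e}")
```

    Sensitivity analysis in the sense described above amounts to perturbing `p_fail` entries and recomputing the end-state probabilities.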

  1. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
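    The reliability side of this reduces, in the simplest case, to the constant-failure-rate (exponential) model; a minimal sketch with invented failure data:

```python
import math

# Point-estimate the failure rate from observed failures over accumulated
# test time, then compute the probability of failure-free operation R(t).
# The failure count and exposure hours are invented for illustration.
failures = 4
exposure_hours = 2000.0
lam = failures / exposure_hours  # failure rate per hour (point estimate)

def reliability(t_hours):
    """R(t) = exp(-lambda * t) under a constant failure rate."""
    return math.exp(-lam * t_hours)

print(reliability(100.0))
```

    Trade studies between reliability, cost, and schedule then compare such R(t) curves under different assumed test or design investments.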

  2. In vivo confocal microscopy of the cornea: New developments in image acquisition, reconstruction and analysis using the HRT-Rostock Corneal Module

    PubMed Central

    Petroll, W. Matthew; Robertson, Danielle M.

    2015-01-01

    The optical sectioning ability of confocal microscopy allows high magnification images to be obtained from different depths within a thick tissue specimen, and is thus ideally suited to the study of intact tissue in living subjects. In vivo confocal microscopy has been used in a variety of corneal research and clinical applications since its development over 25 years ago. In this article we review the latest developments in quantitative corneal imaging with the Heidelberg Retinal Tomograph with Rostock Corneal Module (HRT-RCM). We provide an overview of the unique strengths and weaknesses of the HRT-RCM. We discuss techniques for performing 3-D imaging with the HRT-RCM, including hardware and software modifications that allow full thickness confocal microscopy through focusing (CMTF) of the cornea, which can provide quantitative measurements of corneal sublayer thicknesses, stromal cell and extracellular matrix backscatter, and depth dependent changes in corneal keratocyte density. We also review current approaches for quantitative imaging of the subbasal nerve plexus, which require a combination of advanced image acquisition and analysis procedures, including wide field mapping and 3-D reconstruction of nerve structures. The development of new hardware, software, and acquisition techniques continues to expand the number of applications of the HRT-RCM for quantitative in vivo corneal imaging at the cellular level. Knowledge of these rapidly evolving strategies should benefit corneal clinicians and basic scientists alike. PMID:25998608

  3. RNA-binding Protein Immunoprecipitation (RIP) to Examine AUF1 Binding to Senescence-Associated Secretory Phenotype (SASP) Factor mRNA

    PubMed Central

    Alspach, Elise; Stewart, Sheila A.

    2016-01-01

    Immunoprecipitation and subsequent isolation of nucleic acids allows for the investigation of protein:nucleic acid interactions. RNA-binding protein immunoprecipitation (RIP) is used for the analysis of protein interactions with mRNA. Combining RIP with quantitative real-time PCR (qRT-PCR) further enhances the RIP technique by allowing for the quantitative assessment of RNA-binding protein interactions with their target mRNAs, and how these interactions change in different cellular settings. Here, we describe the immunoprecipitation of the RNA-binding protein AUF1 with several different factors associated with the senescence-associated secretory phenotype (SASP) (Alspach and Stewart, 2013), specifically IL6 and IL8. This protocol was originally published in Alspach et al. (2014). PMID:27453911
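    The RIP-qRT-PCR arithmetic usually reduces to percent-of-input and fold enrichment over the IgG control; a minimal sketch with invented Ct values (the 10% input fraction is an assumption, not from the protocol):

```python
import math

def percent_input(ct_ip, ct_input, input_fraction=0.10):
    """Percent-of-input for one target; the input Ct is first adjusted for
    the fraction of lysate saved as input."""
    ct_input_adj = ct_input - math.log2(1.0 / input_fraction)
    return 100.0 * 2 ** (ct_input_adj - ct_ip)

# Invented Cts: the target amplifies ~4 cycles earlier from the AUF1 IP
# than from the matched IgG IP, i.e. roughly 2**4 = 16-fold enrichment.
il6_auf1 = percent_input(ct_ip=26.0, ct_input=24.0)
il6_igg = percent_input(ct_ip=30.0, ct_input=24.0)
print(il6_auf1 / il6_igg)  # fold enrichment of IL6 mRNA in the AUF1 IP
```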

  4. Quantitative probe of the transition metal redox in battery electrodes through soft x-ray absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Qinghao; Qiao, Ruimin; Wray, L. Andrew; Chen, Jun; Zhuo, Zengqing; Chen, Yanxue; Yan, Shishen; Pan, Feng; Hussain, Zahid; Yang, Wanli

    2016-10-01

    Most battery positive electrodes operate with a 3d transition-metal (TM) reaction centre. A direct and quantitative probe of the TM states upon electrochemical cycling is valuable for understanding the detailed cycling mechanism and charge diffusion in the electrodes, which relate to many practical parameters of a battery. This review includes a comprehensive summary of our recent demonstrations of five different types of quantitative analysis of the TM states in battery electrodes based on soft x-ray absorption spectroscopy and multiplet calculations. In LiFePO4, a well-known two-phase-transformation system, the TM redox could be strictly determined through a simple linear combination of the two end members. In Mn-based compounds, the Mn states could also be quantitatively evaluated, but a set of reference spectra covering all three possible Mn valences needs to be deliberately selected and considered in the fitting. Although the fluorescence signals suffer from self-absorption distortion, multiplet calculations can account for the distortion effect, which allows a quantitative determination of the overall Ni oxidation state in the bulk. With the aid of multiplet calculations, one can also achieve a quasi-quantitative analysis of the Co redox evolution in LiCoO2 based on the energy position of the spectroscopic peak. The benefit of multiplet calculations is greater for studying electrode materials with TMs of mixed spin states, as exemplified by the quantitative analysis of the mixed-spin Na2-xFe2(CN)6 system. Finally, we showcase how such quantitative analysis could provide valuable information for optimizing the electrochemical performance of Na0.44MnO2 electrodes for Na-ion batteries. The methodology summarized in this review could be extended to other energy-application systems with TM redox centres for detailed analysis, for example fuel cell and catalytic materials.
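    For the two-end-member case (LiFePO4), the fitting step is ordinary least squares; a synthetic sketch (Gaussian stand-ins for the Fe2+ and Fe3+ reference spectra, invented state of charge):

```python
import numpy as np

# A measured Fe L-edge spectrum is fit as x * (charged end member) +
# (1 - x) * (discharged end member), giving the state of charge x.
energy = np.linspace(705, 715, 200)
fe2 = np.exp(-((energy - 708.0) / 0.6) ** 2)  # stand-in for LiFePO4 (Fe2+)
fe3 = np.exp(-((energy - 710.0) / 0.6) ** 2)  # stand-in for FePO4 (Fe3+)

x_true = 0.35
measured = x_true * fe3 + (1 - x_true) * fe2
measured += np.random.default_rng(0).normal(0, 0.01, energy.size)  # noise

# Least-squares solve for the mixing coefficients, then normalise.
A = np.column_stack([fe3, fe2])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
x_fit = coeffs[0] / coeffs.sum()
print(round(x_fit, 2))
```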

  5. Visualization and Hierarchical Analysis of Flow in Discrete Fracture Network Models

    NASA Astrophysics Data System (ADS)

    Aldrich, G. A.; Gable, C. W.; Painter, S. L.; Makedonska, N.; Hamann, B.; Woodring, J.

    2013-12-01

    Flow and transport in low-permeability fractured rock occur primarily in interconnected fracture networks. Prediction and characterization of flow and transport in fractured rock have important implications for underground repositories for hazardous materials (e.g. nuclear and chemical waste), contaminant migration and remediation, groundwater resource management, and hydrocarbon extraction. We have developed methods to explicitly model flow in discrete fracture networks and track flow paths using passive particle-tracking algorithms. Visualization and analysis of particle trajectories through the fracture network are important to understanding fracture connectivity, flow patterns, potential contaminant pathways and fast paths through the network. However, occlusion due to the large number of highly tessellated and intersecting fracture polygons precludes the effective use of traditional visualization methods. We would also like quantitative analysis methods to characterize the trajectories of a large number of particle paths. We have addressed these problems by defining a hierarchical flow network representing the topology of particle flow through the fracture network. This approach allows us to analyse the flow and the dynamics of the system as a whole. We are able to easily query the flow network, and use a paint-and-link style framework to filter the fracture geometry and particle traces based on the flow analytics. This allows us to greatly reduce occlusion while emphasizing salient features such as the principal transport pathways. Examples are shown that demonstrate the methodology and highlight how this new method allows quantitative analysis and characterization of flow and transport in a number of representative fracture networks.
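    The flow-network idea can be illustrated with a toy sketch (fracture IDs and trajectories invented): each particle trajectory is reduced to the sequence of fractures it visits, and edge weights count fracture-to-fracture transitions, so heavy edges mark principal transport pathways.

```python
from collections import Counter

# Each trajectory is the ordered list of fractures a particle passes through.
trajectories = [
    ["F1", "F3", "F7"],
    ["F1", "F3", "F7"],
    ["F2", "F3", "F7"],
    ["F1", "F4", "F7"],
]

# Weight each directed fracture-to-fracture edge by particle count.
edges = Counter()
for path in trajectories:
    for a, b in zip(path, path[1:]):
        edges[(a, b)] += 1

# Rank transitions to emphasise the main pathway and reduce what must be drawn.
for (a, b), n in edges.most_common():
    print(f"{a} -> {b}: {n} particles")
```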

  6. Separation and quantitation of plant and insect carbohydrate isomers found on the surface of cotton

    USDA-ARS?s Scientific Manuscript database

    Cotton stickiness researchers have worked to create ion chromatography (IC) carbohydrate separation methods which allow for minimal analysis time and reduced operational costs. Researchers have also tried to correlate scientifically backed IC data with the available physical stickiness tests, such ...

  7. RGB color calibration for quantitative image analysis: the "3D thin-plate spline" warping approach.

    PubMed

    Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado

    2012-01-01

    In recent years the need to numerically define color by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines for quantitatively comparing sample colors across workflows involving many devices. Several software programs are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colors in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches: the first is based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Squares analysis. Moreover, to explore device variability and resolution, two different cameras were used, and for each sensor three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach achieved very high calibration efficiency, opening the way to in-field color quantification not only in food science but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, the approach allows the use of low-cost instruments while still returning scientifically sound quantitative data.
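    The warping idea can be sketched with SciPy's thin-plate-spline RBF interpolator (a schematic stand-in for the paper's Matlab code; the chart colors and the affine camera distortion below are synthetic):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Learn a 3D thin-plate-spline warp from device RGB readings of a color
# chart to their reference values, then apply it to new pixels.
rng = np.random.default_rng(1)
reference = rng.uniform(0, 255, size=(24, 3))          # "true" chart colors
device = reference * [0.9, 1.05, 0.95] + [10, -5, 3]   # distorted camera read

warp = RBFInterpolator(device, reference, kernel="thin_plate_spline")

pixel = np.array([[120.0, 80.0, 200.0]])  # new reading from the same camera
print(warp(pixel))                        # calibrated sRGB estimate
```

    The spline interpolates the chart points exactly while its degree-1 polynomial part captures any purely affine component of the device distortion.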

  8. Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite

    PubMed Central

    Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.

    2012-01-01

    Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance in obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347

  9. Optimal Hotspots of Dynamic Surfaced-Enhanced Raman Spectroscopy for Drugs Quantitative Detection.

    PubMed

    Yan, Xiunan; Li, Pan; Zhou, Binbin; Tang, Xianghu; Li, Xiaoyun; Weng, Shizhuang; Yang, Liangbao; Liu, Jinhuai

    2017-05-02

    Surface-enhanced Raman spectroscopy (SERS) is a powerful qualitative analysis method that has been widely applied in many fields. However, quantitative SERS analysis still faces several challenges, partly because of the absence of a stable and credible analytical strategy. Here, we demonstrate that the optimal hotspots created by dynamic surface-enhanced Raman spectroscopy (D-SERS) can be used for quantitative SERS measurements. In situ small-angle X-ray scattering was carried out to monitor, in real time, the formation of the optimal hotspots, the most efficient of which were generated as the monodisperse Au sol evaporated. Importantly, natural evaporation of the Au sol avoids the salt-induced instability of the nanoparticles, and the formation of ordered three-dimensional hotspots allows SERS detection with excellent reproducibility. To account for SERS signal variability during the D-SERS process, 4-mercaptopyridine (4-mpy) served as an internal standard to correct the signal, improving stability and reducing fluctuation. The strongest SERS spectra, obtained at the optimal hotspots of D-SERS, were extracted for statistical analysis. Using the SERS signal of 4-mpy as a stable internal calibration standard, the relative SERS intensity of the target molecules showed a linear response versus the negative logarithm of concentration at the point of strongest SERS signal, illustrating the great potential of the method for quantitative analysis. The drugs 3,4-methylenedioxymethamphetamine and α-methyltryptamine hydrochloride were analyzed precisely with this internal-standard D-SERS strategy. Consequently, there is good reason to believe that our approach can address quantitative problems in conventional SERS analysis.
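
    The internal-standard calibration described here amounts to fitting the analyte/4-mpy intensity ratio against the negative logarithm of concentration. A minimal sketch with hypothetical intensities:

```python
import numpy as np

# Hypothetical calibration series: analyte concentration (M) and SERS peak
# intensities of the analyte and the 4-mpy internal standard at the hotspot.
conc = np.array([1e-4, 1e-5, 1e-6, 1e-7, 1e-8])
i_analyte = np.array([5200.0, 4100.0, 3050.0, 1950.0, 900.0])
i_standard = np.array([1000.0, 980.0, 1010.0, 995.0, 1005.0])  # nearly constant

x = -np.log10(conc)               # negative logarithm of concentration
y = i_analyte / i_standard        # ratio to the internal standard corrects drift

slope, intercept = np.polyfit(x, y, 1)
r2 = np.corrcoef(x, y)[0, 1] ** 2
print(f"ratio = {slope:.3f}*(-log10 c) + {intercept:.3f}, R^2 = {r2:.3f}")
```

Dividing by the nearly constant internal-standard intensity removes scan-to-scan signal drift before the linear fit, which is the point of the 4-mpy correction.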

  10. Multivariate calibration in Laser-Induced Breakdown Spectroscopy quantitative analysis: The dangers of a 'black box' approach and how to avoid them

    NASA Astrophysics Data System (ADS)

    Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.

    2018-06-01

    The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement of LIBS analytical performance, since a multivariate approach allows one to exploit the redundant elemental information typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widespread commercial and open-source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The flip side of the availability and ease of use of such packages is the (perceived) difficulty of assessing the reliability of the results obtained, which often leads to multivariate algorithms being treated as 'black boxes' whose inner mechanism is supposed to remain hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis and show how to overcome them using the chemical-physical knowledge that underlies any LIBS quantitative analysis.

  11. Label-free quantitative proteomic analysis of human plasma-derived microvesicles to find protein signatures of abdominal aortic aneurysms.

    PubMed

    Martinez-Pinna, Roxana; Gonzalez de Peredo, Anne; Monsarrat, Bernard; Burlet-Schiltz, Odile; Martin-Ventura, Jose Luis

    2014-08-01

    To find potential biomarkers of abdominal aortic aneurysms (AAA), we performed a differential proteomic study based on human plasma-derived microvesicles. Exosomes and microparticles isolated from the plasma of AAA patients and control subjects (n = 10 per group) were analyzed by a label-free quantitative MS-based strategy. Homemade and publicly available software packages were used for MS data analysis. The application of two kinds of bioinformatic tools allowed us to find differential protein profiles from AAA patients. Some of the proteins found by both analysis methods belong to the main pathological mechanisms of AAA, such as oxidative stress, immune-inflammation, and thrombosis. Data analysis from label-free MS-based experiments requires sophisticated bioinformatic approaches to perform quantitative studies of complex protein mixtures. The application of two of these bioinformatic tools provided a preliminary list of differential proteins in plasma-derived microvesicles not previously associated with AAA, which could help us understand the pathological mechanisms related to this disease. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Technical advance: live-imaging analysis of human dendritic cell migrating behavior under the influence of immune-stimulating reagents in an organotypic model of lung.

    PubMed

    Nguyen Hoang, Anh Thu; Chen, Puran; Björnfot, Sofia; Högstrand, Kari; Lock, John G; Grandien, Alf; Coles, Mark; Svensson, Mattias

    2014-09-01

    This manuscript describes technical advances allowing manipulation and quantitative analyses of human DC migratory behavior in lung epithelial tissue. DCs are hematopoietic cells essential for the maintenance of tissue homeostasis and the induction of tissue-specific immune responses. Important functions include cytokine production and migration in response to infection for the induction of proper immune responses. To design appropriate strategies to exploit human DC functional properties in lung tissue for the purpose of clinical evaluation, e.g., candidate vaccination and immunotherapy strategies, we have developed a live-imaging assay based on our previously described organotypic model of the human lung. This assay allows provocations and subsequent quantitative investigations of DC functional properties under conditions mimicking morphological and functional features of the in vivo parental tissue. We present protocols to set up and prepare tissue models for 4D (x, y, z, time) fluorescence-imaging analysis that allow spatial and temporal studies of human DCs in live epithelial tissue, followed by flow cytometry analysis of DCs retrieved from digested tissue models. This model system can be useful for elucidating incompletely defined pathways controlling DC functional responses to infection and inflammation in lung epithelial tissue, as well as the efficacy of locally administered candidate interventions. © 2014 Society for Leukocyte Biology.

  13. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    NASA Astrophysics Data System (ADS)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of the images in the Ishihara and Rabkin color deficiency test books, using the tunable liquid-crystal filters built into the Nuance II analyzer. Multispectral analysis preserves both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account the cone sensitivity functions, and processed the charts with a cross-correlation technique to find the visibility of latent symbols in the color deficiency plates. In this way, a quantitative measure is obtained for each diagnostic plate for three types of color-deficient observers: protanopes, deuteranopes, and tritanopes. Multispectral color analysis also allows determination of the CIE xyz color coordinates of the pseudoisochromatic plate design elements and statistical analysis of these data to compare the color quality of available color deficiency test books.
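
    The retina-activity computation amounts to weighting each spectral band by a cone sensitivity function and summing over wavelength. A sketch on a simulated 31-band stack (420-720 nm, 10 nm step, as in the scan), with crude Gaussian stand-ins for the real tabulated cone fundamentals:

```python
import numpy as np

wavelengths = np.arange(420, 721, 10)                         # 31 bands
stack = np.random.default_rng(2).uniform(0, 1, (31, 64, 64))  # band, y, x

# Crude Gaussian stand-ins for L, M, S cone sensitivities (peaks near
# 565/535/445 nm); a real analysis would use tabulated cone fundamentals.
def gauss(peak, width=40.0):
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

cones = np.stack([gauss(565), gauss(535), gauss(445)])   # (3, 31)

# Cone-activation "charts": weighted sum over the spectral axis.
activity = np.tensordot(cones, stack, axes=([1], [0]))   # (3, 64, 64)
print(activity.shape)
```

Each of the three resulting charts is the per-pixel activation of one cone class; the latent-symbol visibility step would then cross-correlate a symbol template against these charts.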

  14. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  15. A Quantitative Approach to Analyzing Architectures in the Presence of Uncertainty

    DTIC Science & Technology

    2009-07-01

    ...hence requires appropriate tool support. 3.1 Architecture Modeling: To facilitate this form of modeling, the modeling language must allow the archi... can (a) capture the steady-state behavior of the model, (b) allow for the analysis of some property in the context of a specific state or condition

  16. Matching Microscopic and Macroscopic Responses in Glasses.

    PubMed

    Baity-Jesi, M; Calore, E; Cruz, A; Fernandez, L A; Gil-Narvion, J M; Gordillo-Guerrero, A; Iñiguez, D; Maiorano, A; Marinari, E; Martin-Mayor, V; Monforte-Garcia, J; Muñoz-Sudupe, A; Navarro, D; Parisi, G; Perez-Gaviro, S; Ricci-Tersenghi, F; Ruiz-Lorenzo, J J; Schifano, S F; Seoane, B; Tarancon, A; Tripiccione, R; Yllanes, D

    2017-04-14

    We first reproduce on the Janus and Janus II computers a milestone experiment that measures the spin-glass coherence length through the lowering of free-energy barriers induced by the Zeeman effect. Second, we determine the scaling behavior that allows a quantitative analysis of a new experiment reported in the companion Letter [S. Guchhait and R. Orbach, Phys. Rev. Lett. 118, 157203 (2017)]. The value of the coherence length estimated through the analysis of microscopic correlation functions turns out to be quantitatively consistent with its measurement through macroscopic response functions. Further, nonlinear susceptibilities, recently measured in glass-forming liquids, scale as powers of the same microscopic length.

  17. How to Combine ChIP with qPCR.

    PubMed

    Asp, Patrik

    2018-01-01

    Chromatin immunoprecipitation (ChIP) coupled with quantitative PCR (qPCR) has in the last 15 years become a basic mainstream tool in genomic research. Numerous commercially available ChIP kits, qPCR kits, and real-time PCR systems allow quick and easy analysis of virtually anything chromatin-related, as long as an antibody is available. However, the highly accurate quantitative dimension added by using qPCR to analyze ChIP samples significantly raises the bar in terms of experimental accuracy, appropriate controls, data analysis, and data presentation. This chapter addresses these potential pitfalls by providing protocols and procedures that deal with the difficulties inherent in ChIP-qPCR assays.
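
    A common way to express ChIP-qPCR results is percent input, computed from the ChIP Ct and a dilution-adjusted input Ct under an assumed ~100% amplification efficiency. A minimal sketch (Ct values hypothetical):

```python
import math

def percent_input(ct_chip, ct_input, input_fraction=0.01):
    """Percent-input for one ChIP-qPCR target.

    ct_input is measured on a diluted input (here 1%), so it is first
    adjusted to 100% input; assumes ~100% primer efficiency (2x per cycle).
    """
    ct_input_adj = ct_input - math.log2(1.0 / input_fraction)
    return 100.0 * 2.0 ** (ct_input_adj - ct_chip)

# Example: ChIP Ct of 27.0 against a 1% input Ct of 24.0.
print(percent_input(27.0, 24.0))   # -> 0.125 (percent of input recovered)
```

Percent input folds the input normalization and the efficiency assumption into one number, which is why deviations from 100% primer efficiency are one of the pitfalls the protocols address.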

  18. Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.

    PubMed

    Noll, R; Haas, C R; Weikl, B; Herziger, G

    1986-03-01

    Schlieren techniques are commonly used methods for quantitative analysis of cylindrical or spherical index of refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of index of refraction n(r,z). Assuming straight ray paths in the schlieren object we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.
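
    Under the straight-ray assumption used here, the beam deviation at each transverse offset is the path integral of the transverse index gradient. A toy numerical sketch for a Gaussian index perturbation (a stand-in, not the plasma-focus profile of the paper):

```python
import numpy as np

# Toy refractive-index field n(x, y): a Gaussian "schlieren object".
x = np.linspace(-1.0, 1.0, 200)
y = np.linspace(-1.0, 1.0, 200)
X, Y = np.meshgrid(x, y, indexing='ij')
n = 1.0 - 1e-4 * np.exp(-(X**2 + Y**2) / 0.1)

# Straight-ray approximation: a ray travelling along x at transverse offset y
# is deflected by the path integral of (1/n) * dn/dy along its path.
dn_dy = np.gradient(n, y, axis=1)
dx = x[1] - x[0]
deviation = (dn_dy / n).sum(axis=0) * dx   # one deflection angle per y offset

print(deviation.shape)   # (200,)
```

For this symmetric object the deviation profile is antisymmetric about the axis and vanishes at the center, which is the qualitative signature one compares against measured schlieren images.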

  19. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This activity is related to the inhibition of extracellular enzymes, which occurs through the complexation of peptides by tannins. Neither the nature of these interactions nor, more fundamentally, the structure of these heterogeneous polyphenolic molecules is completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins, based on tannin model compounds and employing in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. The ³¹P NMR analysis of ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundations for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  20. Quantitative profiling of immune repertoires for minor lymphocyte counts using unique molecular identifiers.

    PubMed

    Egorov, Evgeny S; Merzlyak, Ekaterina M; Shelenkov, Andrew A; Britanova, Olga V; Sharonov, George V; Staroverov, Dmitriy B; Bolotin, Dmitriy A; Davydov, Alexey N; Barsova, Ekaterina; Lebedev, Yuriy B; Shugay, Mikhail; Chudakov, Dmitriy M

    2015-06-15

    Emerging high-throughput sequencing methods for analyzing the complex structure of TCR and BCR repertoires have given a powerful impulse to adaptive immunity studies. However, essential technical obstacles to a truly quantitative analysis remain. Specifically, it remains challenging to obtain comprehensive information on the clonal composition of small lymphocyte populations, such as Ag-specific, functional, or tissue-resident cell subsets isolated by sorting, microdissection, or fine needle aspirates. In this study, we report a robust approach based on unique molecular identifiers that allows profiling of Ag receptors for several hundred to a few thousand lymphocytes while preserving qualitative and quantitative information on the clonal composition of the sample. We also describe several general features regarding data analysis with unique molecular identifiers that are critical for accurate counting of starting molecules in high-throughput sequencing applications. Copyright © 2015 by The American Association of Immunologists, Inc.
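
    The core of UMI-based molecule counting is simple: reads sharing a unique molecular identifier are collapsed to one starting molecule, so distinct UMIs per clonotype estimate the clonal composition. A toy sketch (sequences and UMIs hypothetical, and without the error-correction steps a real pipeline needs):

```python
from collections import defaultdict

# Hypothetical reads: (clonotype CDR3 sequence, UMI) pairs; PCR duplicates
# share a UMI, so counting distinct UMIs recovers starting-molecule counts.
reads = [
    ("CASSLGTDTQYF", "AACGT"), ("CASSLGTDTQYF", "AACGT"),  # same molecule
    ("CASSLGTDTQYF", "GGTCA"),
    ("CASRDNEQFF",   "TTAGC"), ("CASRDNEQFF",   "TTAGC"),
]

umis_per_clone = defaultdict(set)
for cdr3, umi in reads:
    umis_per_clone[cdr3].add(umi)

molecule_counts = {c: len(u) for c, u in umis_per_clone.items()}
print(molecule_counts)  # {'CASSLGTDTQYF': 2, 'CASRDNEQFF': 1}
```

Collapsing by UMI rather than by read count is what removes PCR amplification bias; sequencing errors inside UMIs, which can inflate these counts, are among the features the authors flag as critical for accurate counting.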

  1. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  2. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2017-12-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  3. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  4. Quantitative Clinical Diagnostic Analysis of Acetone in Human Blood by HPLC: A Metabolomic Search for Acetone as Indicator

    PubMed Central

    Akgul Kalkan, Esin; Sahiner, Mehtap; Ulker Cakir, Dilek; Alpaslan, Duygu; Yilmaz, Selehattin

    2016-01-01

    Using high-performance liquid chromatography (HPLC) and 2,4-dinitrophenylhydrazine (2,4-DNPH) as a derivatizing reagent, an analytical method was developed for the quantitative determination of acetone in human blood. The determination was carried out at 365 nm using an ultraviolet-visible (UV-Vis) diode array detector (DAD). For acetone as its 2,4-dinitrophenylhydrazone derivative, a good separation was achieved with a ThermoAcclaim C18 column (15 cm × 4.6 mm × 3 μm) at a retention time (tR) of 12.10 min and a flow rate of 1 mL min−1 using a (methanol/acetonitrile)-water elution gradient. The methodology is simple, rapid, sensitive, and of low cost, exhibits good reproducibility, and allows the analysis of acetone in biological fluids. A calibration curve was obtained for acetone using its standard solutions in acetonitrile, and quantitative analysis of acetone in human blood was successfully carried out using this calibration graph. The method was validated in terms of linearity, limits of detection and quantification, accuracy, and precision. We also present acetone as a useful tool for HPLC-based metabolomic investigation of endogenous metabolism and quantitative clinical diagnostic analysis. PMID:27298750
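
    The linearity and detection/quantification-limit validation mentioned here is commonly done ICH-style, from the calibration slope and the residual standard deviation of the fit. A sketch with hypothetical calibration data, not the study's values:

```python
import numpy as np

# Hypothetical acetone calibration: concentration (ug/mL) vs HPLC peak area.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([12.1, 24.3, 59.8, 121.0, 239.5])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)      # residual SD of the linear fit (n - 2 dof)

lod = 3.3 * sigma / slope          # limit of detection, ICH-style
loq = 10.0 * sigma / slope         # limit of quantification
print(f"slope = {slope:.2f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```

An unknown sample is then quantified by inverting the same line, (area − intercept) / slope, provided the result falls between the LOQ and the highest calibration point.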

  5. Platform-independent and label-free quantitation of proteomic data using MS1 extracted ion chromatograms in skyline: application to protein acetylation and phosphorylation.

    PubMed

    Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W

    2012-05-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.
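
    MS1 filtering rests on extracted ion chromatograms: for each full scan, intensities within a narrow m/z tolerance of the target are summed, and the resulting trace is integrated over the retention-time window. A toy sketch of that extraction (scans and m/z values hypothetical, and far simpler than Skyline's implementation):

```python
import numpy as np

# Toy MS1 scans: (retention time in min, m/z array, intensity array).
scans = [
    (10.0, np.array([500.000, 524.265, 700.000]), np.array([5.0, 100.0, 7.0])),
    (10.5, np.array([524.266, 600.000]),          np.array([250.0, 3.0])),
    (11.0, np.array([524.264, 524.900]),          np.array([120.0, 9.0])),
]

def xic(scans, target_mz, ppm=10.0):
    """Extracted ion chromatogram: per-scan intensity summed in a ppm window."""
    tol = target_mz * ppm / 1e6
    return [(rt, float(inten[np.abs(mz - target_mz) <= tol].sum()))
            for rt, mz, inten in scans]

chrom = xic(scans, 524.265)
peak_area = sum(i for _, i in chrom)    # crude integration over the RT window
print(chrom, peak_area)
```

The hard parts in practice are exactly what the text highlights: picking the right peak among interferences and choosing integration boundaries, which is why visual inspection and manual re-integration matter.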

  6. Platform-independent and Label-free Quantitation of Proteomic Data Using MS1 Extracted Ion Chromatograms in Skyline

    PubMed Central

    Schilling, Birgit; Rardin, Matthew J.; MacLean, Brendan X.; Zawadzka, Anna M.; Frewen, Barbara E.; Cusack, Michael P.; Sorensen, Dylan J.; Bereman, Michael S.; Jing, Enxuan; Wu, Christine C.; Verdin, Eric; Kahn, C. Ronald; MacCoss, Michael J.; Gibson, Bradford W.

    2012-01-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models. PMID:22454539

  7. Quantification of plaque area and characterization of plaque biochemical composition with atherosclerosis progression in ApoE/LDLR(-/-) mice by FT-IR imaging.

    PubMed

    Wrobel, Tomasz P; Mateuszuk, Lukasz; Kostogrys, Renata B; Chlopicki, Stefan; Baranska, Malgorzata

    2013-11-07

    In this work, the quantitative determination of atherosclerotic lesion area in ApoE/LDLR(-/-) mice by FT-IR imaging is presented and validated by comparison with lesion area determination by classic Oil Red O staining. Cluster analysis of FT-IR-based measurements in the 2800-3025 cm(-1) range allowed quantitative analysis of the atherosclerotic plaque area, the results of which were highly correlated with those of Oil Red O histological staining (R(2) = 0.935). Moreover, a specific class obtained from a second cluster analysis of the aortic cross-section samples at different stages of disease progression (3, 4 and 6 months of age) appeared to represent the macrophage (CD68) area within the atherosclerotic plaque.
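
    The cluster-analysis step can be sketched as k-means on per-pixel spectra in the CH-stretch region, reading the plaque area off as the fraction of pixels in the lipid-rich cluster. The data below are simulated, and scikit-learn's KMeans stands in for whatever clustering the authors used:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Toy image: 400 pixels, 30 spectral points in the 2800-3025 cm-1 region.
# The first 150 pixels carry a lipid-like band (higher absorbance).
spectra = rng.normal(0.1, 0.02, (400, 30))
spectra[:150] += 0.5                      # "plaque" pixels

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)

# Plaque area fraction = share of pixels in the higher-absorbance cluster.
means = [spectra[labels == k].mean() for k in (0, 1)]
plaque = int(np.argmax(means))
print((labels == plaque).mean())          # fraction classified as plaque
```

It is this pixel fraction (times pixel area) that would be correlated against the Oil Red O-stained area to validate the FT-IR segmentation.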

  8. Immediate drop on demand technology (I-DOT) coupled with mass spectrometry via an open port sampling interface.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos; Boeltz, Harry

    2017-11-01

    The aim of this work was to demonstrate and evaluate the analytical performance of coupling immediate drop on demand technology to a mass spectrometer via the recently introduced open port sampling interface and ESI. Methodology & results: A maximum sample-analysis throughput of 5 s per sample was demonstrated. Signal reproducibility was 10% or better, as demonstrated by the quantitative analysis of propranolol and its stable isotope-labeled internal standard propranolol-d7. The ability of the system to produce multiply charged ions and analyze macromolecules was demonstrated using the protein cytochrome c. This immediate drop on demand technology/open port sampling interface/ESI-MS combination allowed quantitative analysis of relatively low-mass analytes and was used for the identification of macromolecules such as proteins.

  9. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics.

    PubMed

    Lavallée-Adam, Mathieu; Yates, John R

    2016-03-24

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. © 2016 by John Wiley & Sons, Inc.

  10. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  11. Quantitative and Comparative Profiling of Protease Substrates through a Genetically Encoded Multifunctional Photocrosslinker.

    PubMed

    He, Dan; Xie, Xiao; Yang, Fan; Zhang, Heng; Su, Haomiao; Ge, Yun; Song, Haiping; Chen, Peng R

    2017-11-13

    A genetically encoded, multifunctional photocrosslinker was developed for quantitative and comparative proteomics. By bearing a bioorthogonal handle and a releasable linker in addition to its photoaffinity warhead, this probe enables the enrichment of transient and low-abundance prey proteins after intracellular photocrosslinking and prey-bait separation, which can be subject to stable isotope dimethyl labeling and mass spectrometry analysis. This quantitative strategy (termed isoCAPP) allowed a comparative proteomic approach to be adopted to identify the proteolytic substrates of an E. coli protease-chaperone dual machinery DegP. Two newly identified substrates were subsequently confirmed by proteolysis experiments. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Disrupting the Pipeline: Critical Analyses of Student Pathways through Postsecondary STEM Education

    ERIC Educational Resources Information Center

    Metcalf, Heather E.

    2014-01-01

    Critical mixed methods approaches allow us to reflect upon the ways in which we collect, measure, interpret, and analyze data, providing novel alternatives for quantitative analysis. For institutional researchers, whose work influences institutional policies, programs, and practices, the approach has the transformative ability to expose and create…

  13. Quantitation of mycotoxins using direct analysis in real time (DART)-mass spectrometry (MS)

    USDA-ARS?s Scientific Manuscript database

    Ambient ionization represents a new generation of mass spectrometry ion sources which is used for rapid ionization of small molecules under ambient conditions. The combination of ambient ionization and mass spectrometry allows analyzing multiple food samples with simple or no sample treatment, or in...

  14. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves the use of a stable-isotope-labeled standard as a surrogate analyte, allowing calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
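    The standard-addition idea mentioned above can be sketched numerically: known spike amounts are added to aliquots of the same sample, a line is fitted to the measured responses, and the endogenous concentration is read from the magnitude of the x-intercept. The numbers below are hypothetical, not from the paper.

```python
import numpy as np

# Known spike amounts added to aliquots of the same plasma sample (ng/mL)
spikes = np.array([0.0, 50.0, 100.0, 200.0])
# Measured instrument responses (peak areas, arbitrary units, hypothetical)
responses = np.array([120.0, 245.0, 370.0, 620.0])

# Fit response = slope * spike + intercept; extrapolating the line back to
# zero response gives the endogenous concentration: c0 = intercept / slope.
slope, intercept = np.polyfit(spikes, responses, 1)
endogenous = intercept / slope
```

    Parallelism checks amount to verifying that this slope matches the slope obtained in the surrogate matrix or with the surrogate analyte.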

  15. Quantitative laser speckle flowmetry of the in vivo microcirculation using sidestream dark field microscopy

    PubMed Central

    Nadort, Annemarie; Woolthuis, Rutger G.; van Leeuwen, Ton G.; Faber, Dirk J.

    2013-01-01

    We present integrated Laser Speckle Contrast Imaging (LSCI) and Sidestream Dark Field (SDF) flowmetry to provide real-time, non-invasive and quantitative measurements of speckle decorrelation times related to microcirculatory flow. Using a multi exposure acquisition scheme, precise speckle decorrelation times were obtained. Applying SDF-LSCI in vitro and in vivo allows direct comparison between speckle contrast decorrelation and flow velocities, while imaging the phantom and microcirculation architecture. This resulted in a novel analysis approach that distinguishes decorrelation due to flow from other additive decorrelation sources. PMID:24298399
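    The quantity underlying LSCI is the local speckle contrast, the ratio of intensity standard deviation to mean intensity within a small window; flow shortens the decorrelation time and lowers the contrast at a given exposure. A minimal sketch on synthetic data (hypothetical, not the authors' multi-exposure pipeline):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(intensity, window=7):
    """Local speckle contrast K = sigma / mean over sliding windows.
    Fully developed static speckle gives K near 1; motion blurs the
    pattern during the exposure and lowers K."""
    w = sliding_window_view(intensity, (window, window))
    means = w.mean(axis=(-1, -2))
    stds = w.std(axis=(-1, -2))
    return stds / means

rng = np.random.default_rng(0)
# Fully developed speckle intensity is exponentially distributed (K ~ 1)
static = rng.exponential(1.0, size=(64, 64))
K = speckle_contrast(static)
```

    A multi-exposure scheme, as in the paper, records K at several exposure times and fits a decorrelation model to separate flow from static scattering contributions.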

  16. Aggregation and Disaggregation of Senile Plaques in Alzheimer Disease

    NASA Astrophysics Data System (ADS)

    Cruz, L.; Urbanc, B.; Buldyrev, S. V.; Christie, R.; Gomez-Isla, T.; Havlin, S.; McNamara, M.; Stanley, H. E.; Hyman, B. T.

    1997-07-01

    We quantitatively analyzed, using laser scanning confocal microscopy, the three-dimensional structure of individual senile plaques in Alzheimer disease. We carried out the quantitative analysis using statistical methods to gain insights about the processes that govern Aβ peptide deposition. Our results show that plaques are complex porous structures with characteristic pore sizes. We interpret plaque morphology in the context of a new dynamical model based on competing aggregation and disaggregation processes in kinetic steady-state equilibrium with an additional diffusion process allowing Aβ deposits to diffuse over the surface of plaques.

  17. Nonlinear optical microscopy: use of second harmonic generation and two-photon microscopy for automated quantitative liver fibrosis studies.

    PubMed

    Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry

    2008-01-01

    Liver fibrosis is associated with an abnormal increase in an extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity, accuracy, and operator-dependent variations. The fibrillar collagen in sinusoids of normal livers could be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly revealed the hepatocyte morphology. We have systematically optimized the parameters for the quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract the information of collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and cell morphology are quantitatively characterized in SHG/TPEF images. By comparing to traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.

  18. QGene 4.0, an extensible Java QTL-analysis platform.

    PubMed

    Joehanes, Roby; Nelson, James C

    2008-12-01

    Of many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.

  19. Quantitative characterization of gold nanoparticles by size-exclusion and hydrodynamic chromatography, coupled to inductively coupled plasma mass spectrometry and quasi-elastic light scattering.

    PubMed

    Pitkänen, Leena; Montoro Bustos, Antonio R; Murphy, Karen E; Winchester, Michael R; Striegel, André M

    2017-08-18

    The physicochemical characterization of nanoparticles (NPs) is of paramount importance for tailoring and optimizing the properties of these materials as well as for evaluating the environmental fate and impact of the NPs. Characterizing the size and chemical identity of disperse NP sample populations can be accomplished by coupling size-based separation methods to physical and chemical detection methods. Informed decisions regarding the NPs can only be made, however, if the separations themselves are quantitative, i.e., if all or most of the analyte elutes from the column within the course of the experiment. We undertake here the size-exclusion chromatographic characterization of Au NPs spanning a six-fold range in mean size. The main problem which has plagued the size-exclusion chromatography (SEC) analysis of Au NPs, namely lack of quantitation accountability due to generally poor NP recovery from the columns, is overcome by carefully matching eluent formulation with the appropriate stationary phase chemistry, and by the use of on-line inductively coupled plasma mass spectrometry (ICP-MS) detection. Here, for the first time, we demonstrate the quantitative analysis of Au NPs by SEC/ICP-MS, including the analysis of a ternary NP blend. The SEC separations are contrasted to HDC/ICP-MS (HDC: hydrodynamic chromatography) separations employing the same stationary phase chemistry. Additionally, analysis of Au NPs by HDC with on-line quasi-elastic light scattering (QELS) allowed for continuous determination of NP size across the chromatographic profiles, circumventing issues related to the shedding of fines from the SEC columns. The use of chemically homogeneous reference materials with well-defined size range allowed for better assessment of the accuracy and precision of the analyses, and for a more direct interpretation of results, than would be possible employing less rigorously characterized analytes. Published by Elsevier B.V.

  20. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  1. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collection. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  2. Usage of "Powergraph" software at laboratory lessons of "general physics" department of MEPhI

    NASA Astrophysics Data System (ADS)

    Klyachin, N. A.; Matronchik, A. Yu.; Khangulyan, E. V.

    2017-01-01

    We consider the use of the "PowerGraph" software in the laboratory exercise "Study of sodium spectrum" in the physical experiment lessons. Together with the design of the experimental setup, we discuss the sodium spectra digitized with a computer audio chip. Using "PowerGraph" in the laboratory experiment "Study of sodium spectrum" allows efficient visualization of the sodium spectrum and analysis of its fine structure. In particular, it allows quantitative measurement of the wavelengths and relative line intensities.

  3. High-throughput analysis of spatio-temporal dynamics in Dictyostelium

    PubMed Central

    Sawai, Satoshi; Guan, Xiao-Juan; Kuspa, Adam; Cox, Edward C

    2007-01-01

    We demonstrate a time-lapse video approach that allows rapid examination of the spatio-temporal dynamics of Dictyostelium cell populations. Quantitative information was gathered by sampling life histories of more than 2,000 mutant clones from a large mutagenesis collection. Approximately 4% of the clonal lines showed a mutant phenotype at one stage. Many of these could be ordered by clustering into functional groups. The dataset allows one to search and retrieve movies on a gene-by-gene and phenotype-by-phenotype basis. PMID:17659086

  4. Frontally eluted components procedure with thin layer chromatography as a mode of sample preparation for high performance liquid chromatography quantitation of acetaminophen in biological matrix.

    PubMed

    Klimek-Turek, A; Sikora, M; Rybicki, M; Dzido, T H

    2016-03-04

    A new concept of using thin-layer chromatography for sample preparation prior to the quantitative determination of solute(s) by instrumental techniques is presented. Thin-layer chromatography (TLC) is used to completely separate acetaminophen and its internal standard from other components (matrix) and to concentrate them in a single spot/zone at the solvent front position (after the final stage of thin-layer chromatogram development). The location of the analytes and internal standard in the solvent front zone allows their easy extraction followed by quantitation by HPLC. The extraction procedure for the solute(s) and internal standard can proceed from the whole solute frontal zone or a part of it without loss of accuracy in the quantitative analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
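    Internal-standard quantitation of the kind described works by calibrating the analyte/internal-standard peak-area ratio against standards of known concentration, then back-calculating unknowns from their measured ratio. A hedged sketch with hypothetical calibration values (not data from the paper):

```python
import numpy as np

# Calibration: acetaminophen standards of known concentration (ug/mL),
# each spiked with a fixed amount of internal standard (values hypothetical).
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])
area_ratio = np.array([0.11, 0.52, 1.05, 2.60, 5.15])  # analyte/IS peak areas

# Linear calibration of the peak-area ratio against concentration
slope, intercept = np.polyfit(conc, area_ratio, 1)

def quantify(sample_ratio):
    """Back-calculate concentration from an analyte/IS peak-area ratio."""
    return (sample_ratio - intercept) / slope

c = quantify(1.30)
```

    Ratioing against the co-extracted internal standard is what makes the TLC extraction step forgiving: losses affect analyte and standard alike and cancel in the ratio.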

  5. Camera, Hand Lens, and Microscope Probe (CHAMP): An Instrument Proposed for the 2009 MSL Rover Mission

    NASA Technical Reports Server (NTRS)

    Mungas, Greg S.; Beegle, Luther W.; Boynton, John E.; Lee, Pascal; Shidemantle, Ritch; Fisher, Ted

    2004-01-01

    The Camera, Hand Lens, and Microscope Probe (CHAMP) will allow examination of martian surface features and materials (terrain, rocks, soils, samples) on spatial scales ranging from kilometers to micrometers, thus enabling both microscopy and context imaging with high operational flexibility. CHAMP is designed to allow the detailed and quantitative investigation of a wide range of geologic features and processes on Mars, leading to a better quantitative understanding of the evolution of the martian surface environment through time. In particular, CHAMP will provide key data that will help understand the local region explored by Mars Surface Laboratory (MSL) as a potential habitat for life. CHAMP will also support other anticipated MSL investigations, in particular by helping identify and select the highest priority targets for sample collection and analysis by the MSL's analytical suite.

  6. Application of solid-phase microextraction to the quantitative analysis of 1,8-cineole in blood and expired air in a Eucalyptus herbivore, the brushtail possum (Trichosurus vulpecula).

    PubMed

    Boyle, Rebecca R; McLean, Stuart; Brandon, Sue; Pass, Georgia J; Davies, Noel W

    2002-11-25

    We have developed two solid-phase microextraction (SPME) methods, coupled with gas chromatography, for quantitatively analysing the major Eucalyptus leaf terpene, 1,8-cineole, in both expired air and blood from the common brushtail possum (Trichosurus vulpecula). In-line SPME sampling (5 min at 20 degrees C room temperature) of excurrent air from an expiratory chamber containing a possum dosed orally with 1,8-cineole (50 mg/kg) allowed real-time semi-quantitative measurements reflecting 1,8-cineole blood concentrations. Headspace SPME using 50 microl whole blood collected from possums dosed orally with 1,8-cineole (30 mg/kg) resulted in excellent sensitivity (quantitation limit 1 ng/ml) and reproducibility. Blood concentrations ranged between 1 and 1380 ng/ml. Calibration curves were prepared for two concentration ranges (0.05-10 and 10-400 ng/50 microl) for the analysis of blood concentrations. Both calibration curves were linear (r(2)=0.999 and 0.994, respectively) and the equations for the two concentration ranges were consistent. Copyright 2002 Elsevier Science B.V.

  7. Analysing magnetism using scanning SQUID microscopy.

    PubMed

    Reith, P; Renshaw Wang, X; Hilgenkamp, H

    2017-12-01

    Scanning superconducting quantum interference device microscopy (SSM) is a scanning probe technique that images local magnetic flux, which allows for mapping of magnetic fields with high field and spatial accuracy. Many studies involving SSM have been published in the last few decades, using SSM to make qualitative statements about magnetism. However, quantitative analysis using SSM has received less attention. In this work, we discuss several aspects of interpreting SSM images and methods to improve quantitative analysis. First, we analyse the spatial resolution and how it depends on several factors. Second, we discuss the analysis of SSM scans and the information obtained from the SSM data. Using simulations, we show how signals evolve as a function of changing scan height, SQUID loop size, magnetization strength, and orientation. We also investigated 2-dimensional autocorrelation analysis to extract information about the size, shape, and symmetry of magnetic features. Finally, we provide an outlook on possible future applications and improvements.

  9. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). 
Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
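    The ROC analysis used to validate state-space velocity as an outcome predictor can be sketched with the rank-sum formulation of AUC: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The data below are hypothetical, not the study's measurements.

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC via the Mann-Whitney formulation: fraction of positive/negative
    pairs in which the positive case has the higher score (ties count half)."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, int)
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical: lower state-space velocity flags poor outcome, so we score
# by the negative of velocity and label poor outcome as the positive class.
velocity = np.array([2.1, 1.9, 3.5, 3.8, 1.2, 4.0, 1.5, 3.2])
poor_outcome = np.array([1, 1, 0, 0, 1, 0, 1, 0])
auc = roc_auc(-velocity, poor_outcome)
```

    Choosing an operating point on the ROC curve then trades sensitivity against the false-positive rate, as in the 70%/15% figures reported above.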

  10. Droplet microfluidics--a tool for single-cell analysis.

    PubMed

    Joensson, Haakan N; Andersson Svahn, Helene

    2012-12-03

    Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
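    The image-analysis step, distinguishing haze from non-haze sections of a pellet, amounts to measuring the fraction of pixels above an intensity threshold. A minimal sketch with a toy image (the authors' actual segmentation may be more elaborate):

```python
import numpy as np

def haze_rate(image, threshold):
    """Fraction of pixels classified as haze (mycelium) by simple global
    thresholding of a grayscale koji-pellet image."""
    image = np.asarray(image, float)
    return float((image >= threshold).mean())

# Toy 4x4 "cross-section": bright pixels (>= 128) represent white mycelium.
pellet = np.array([
    [200, 210,  40,  30],
    [190, 180,  50,  20],
    [ 60,  70, 160, 150],
    [ 10,  20, 140, 130],
])
rate = haze_rate(pellet, 128)
```

    Averaging this rate over a fixed number of pellets per batch gives the continuous, quantitative time course of haze formation described above.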

  12. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  13. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of a correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits characterizing these scales i.e. limited granularity and inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing to predict fall risk. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained a good accuracy (~82%) and especially a high sensitivity (~83%).
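    The classification step can be illustrated with a deliberately simple nearest-centroid classifier; the paper's actual classifier and Kinect-derived features are not specified here, so everything below is a hypothetical stand-in.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Per-class mean feature vectors (a minimal supervised classifier)."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(X, classes, centroids):
    """Assign each sample the label of its closest class centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Hypothetical balance features: [sway amplitude, exercise completion time]
X = np.array([[0.20, 5.0], [0.30, 6.0], [0.25, 5.5],    # low fall risk (0)
              [0.90, 12.0], [1.10, 14.0], [1.00, 13.0]])  # high fall risk (1)
y = np.array([0, 0, 0, 1, 1, 1])

classes, centroids = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(X, classes, centroids)
accuracy = float((pred == y).mean())
sensitivity = float((pred[y == 1] == 1).mean())  # true-positive rate
```

    Sensitivity is reported separately from accuracy because, for fall risk, missing a high-risk subject is the costly error.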

  14. Structure and Function of Iron-Loaded Synthetic Melanin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yiwen; Xie, Yijun; Wang, Zhao

    We describe a synthetic method for increasing and controlling the iron loading of synthetic melanin nanoparticles and use the resulting materials to perform a systematic quantitative investigation on their structure-property relationship. A comprehensive analysis by magnetometry, electron paramagnetic resonance, and nuclear magnetic relaxation dispersion reveals the complexities of their magnetic behavior and how these intraparticle magnetic interactions manifest in useful material properties such as their performance as MRI contrast agents. This analysis allows predictions of the optimal iron loading through a quantitative modeling of antiferromagnetic coupling that arises from proximal iron ions. This study provides a detailed understanding of this complex class of synthetic biomaterials and gives insight into interactions and structures prevalent in naturally occurring melanins.

  15. Cultural schemas for racial identity in Canadian television advertising.

    PubMed

    Baumann, Shyon; Ho, Loretta

    2014-05-01

    What meanings are attached to race in advertising? We analyze a sample of prime-time Canadian television advertising to identify cultural schemas for what it means to be White, Black, and East/Southeast Asian. Our empirical focus is on food and dining advertising. Through quantitative content analysis of associations between race and food subtypes, we show that there are systematic differences in the types of foods that groups are associated with. Through a qualitative content analysis of the commercials, we illuminate these quantitative patterns and discuss six cultural schemas for racial identity. The schemas allow for both diversity and privilege in the representation of Whites, and poignant contrasts regarding status and emotionality in the narrow representations of the other two groups.

  16. LOD significance thresholds for QTL analysis in experimental populations of diploid species

    PubMed

    Van Ooijen JW

    1999-11-01

    Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. Predicting the rate of false positives is hampered by the lack of an easy way to determine which regions of the genome can be designated as statistically significant for containing a gene affecting the quantitative trait of interest. In this paper four tables, obtained by large-scale simulations, are presented; together with a simple formula they give the false-positive rate for analyses of the standard types of experimental populations of diploid species with any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.
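
    The paper's thresholds come from large-scale simulations of the genome-wide maximum LOD score under the null hypothesis. A minimal Monte Carlo sketch of that idea (all parameters are illustrative, and real simulations model linked, correlated test positions rather than the independent ones assumed here):

```python
import random
import math

def simulate_null_lod():
    # Under H0 (no QTL), the LOD score at a single test position is
    # approximately chi2(df=1) / (2 * ln(10)) for a simple one-df test.
    z = random.gauss(0.0, 1.0)
    return (z * z) / (2.0 * math.log(10))

def genomewide_threshold(n_positions=500, n_sims=2000, alpha=0.05, seed=1):
    # Take the maximum LOD over all (here: independent) test positions
    # per simulated genome, then the (1 - alpha) quantile of the maxima.
    random.seed(seed)
    maxima = sorted(
        max(simulate_null_lod() for _ in range(n_positions))
        for _ in range(n_sims)
    )
    return maxima[int((1 - alpha) * n_sims)]

print(round(genomewide_threshold(), 2))
```

    With linked markers the effective number of independent tests drops, which is what the published tables capture per population type and genome size.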

  17. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method overcomes many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  18. Parallel labeling experiments for pathway elucidation and (13)C metabolic flux analysis.

    PubMed

    Antoniewicz, Maciek R

    2015-12-01

    Metabolic pathway models provide the foundation for quantitative studies of cellular physiology through the measurement of intracellular metabolic fluxes. For model organisms, metabolic models are well established, with many manually curated genome-scale model reconstructions, gene knockout studies and stable-isotope tracing studies. However, for non-model organisms a similar level of knowledge is often lacking. Compartmentation of cellular metabolism in eukaryotic systems also presents significant challenges for quantitative (13)C-metabolic flux analysis ((13)C-MFA). Recently, innovative (13)C-MFA approaches have been developed based on parallel labeling experiments, the use of multiple isotopic tracers and integrated data analysis, which allow more rigorous validation of pathway models and improved quantification of metabolic fluxes. Applications of these approaches open new research directions in metabolic engineering, biotechnology and medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Investing in Education: Analysis of the 1999 World Education Indicators. Education and Skills.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This Organisation for Economic Cooperation and Development report documents the growing demand for learning around the world. A quantitative description of the functioning of education systems allows for international comparisons and the identification of the strengths and weaknesses of various approaches to providing quality education. Chapter 1,…

  20. What Are the 50 Cent Euro Coins Made of?

    ERIC Educational Resources Information Center

    Peralta, Luis; Farinha, Ana Catarina; Rego, Florbela

    2008-01-01

    X-ray fluorescence is a non-destructive technique that allows elemental composition analysis. In this paper we describe a prescription to obtain the elemental composition of homogeneous coins, like 50 cent Euro coins, and how to get the quantitative proportions of each element with the help of Monte Carlo simulation. Undergraduate students can…

  1. Patterns of Subject Mix in Higher Education Institutions: A First Empirical Analysis Using the AQUAMETH Database

    ERIC Educational Resources Information Center

    Lepori, Benedetto; Baschung, Lukas; Probst, Carole

    2010-01-01

    Teaching and research are organised differently between subject domains: attempts to construct typologies of higher education institutions, however, often do not include quantitative indicators concerning subject mix which would allow systematic comparisons of large numbers of higher education institutions among different countries, as the…

  2. Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering

    DTIC Science & Technology

    2014-06-01

    There are many variants of SysML, such as the Unified Profile for DODAF/MODAF (UPDM) and Business Process Model & Notation (BPMN), that have origins in…

  3. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals, in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  4. Report on the analysis of common beverages spiked with gamma-hydroxybutyric acid (GHB) and gamma-butyrolactone (GBL) using NMR and the PURGE solvent-suppression technique.

    PubMed

    Lesar, Casey T; Decatur, John; Lukasiewicz, Elaan; Champeil, Elise

    2011-10-10

    In forensic evidence, the identification and quantitation of gamma-hydroxybutyric acid (GHB) in "spiked" beverages is challenging. In this report, we present the analysis of common alcoholic beverages found in clubs and bars spiked with gamma-hydroxybutyric acid (GHB) and gamma-butyrolactone (GBL). Our analysis of the spiked beverages consisted of using (1)H NMR with a water suppression method called Presaturation Utilizing Relaxation Gradients and Echoes (PURGE). The following beverages were analyzed: water, 10% ethanol in water, vodka-cranberry juice, rum and coke, gin and tonic, whisky and diet coke, white wine, red wine, and beer. The PURGE method allowed for the direct identification and quantitation of both compounds in all beverages except red and white wine where small interferences prevented accurate quantitation. The NMR method presented in this paper utilizes PURGE water suppression. Thanks to the use of a capillary internal standard, the method is fast, non-destructive, sensitive and requires no sample preparation which could disrupt the equilibrium between GHB and GBL. Published by Elsevier Ireland Ltd.
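
    Quantitation against a capillary internal standard reduces, in principle, to comparing per-proton integrals between the analyte and the standard. A minimal sketch (the peak areas, proton counts and concentrations below are illustrative, not values from the study):

```python
def quantify_by_internal_standard(area_analyte, n_h_analyte,
                                  area_standard, n_h_standard,
                                  conc_standard_mM):
    # In quantitative 1H NMR the integral per proton is proportional
    # to molar concentration, so the analyte concentration follows
    # from the ratio of per-proton integrals against the standard.
    per_h_analyte = area_analyte / n_h_analyte
    per_h_standard = area_standard / n_h_standard
    return conc_standard_mM * per_h_analyte / per_h_standard

# Illustrative: a 2-proton analyte multiplet vs. a 9-proton standard
print(quantify_by_internal_standard(2.0, 2, 4.5, 9, 10.0))  # -> 20.0
```

    Because nothing is added to or removed from the sample, this style of quantitation leaves the GHB/GBL equilibrium undisturbed, as the abstract emphasizes.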

  5. Functional Module Search in Protein Networks based on Semantic Similarity Improves the Analysis of Proteomics Data*

    PubMed Central

    Boyanova, Desislava; Nilla, Santosh; Klau, Gunnar W.; Dandekar, Thomas; Müller, Tobias; Dittrich, Marcus

    2014-01-01

    The continuously evolving field of proteomics produces increasing amounts of data while improving the quality of protein identifications. Although quantitative measurements are becoming more popular, many proteomic studies are still based on non-quantitative methods for protein identification. These studies result in potentially large sets of identified proteins whose biological interpretation can be challenging. Systems biology develops innovative network-based methods, which allow an integrated analysis of these data. Here we present a novel approach, which combines prior knowledge of protein-protein interactions (PPI) with proteomics data using functional similarity measurements of interacting proteins. This integrated network analysis identifies exactly those network modules with a maximal consistent functional similarity, reflecting biological processes of the investigated cells. We validated our approach on small (H9N2 virus-infected gastric cells) and large (blood constituents) proteomic data sets. Using this novel algorithm, we identified characteristic functional modules in virus-infected cells, comprising key signaling proteins (e.g. the stress-related kinase RAF1), and demonstrate that this method allows a module-based functional characterization of cell types. Analysis of a large proteome data set of blood constituents resulted in clear separation of blood cells according to their developmental origin. A detailed investigation of the T-cell proteome further illustrates how the algorithm partitions large networks into functional subnetworks, each representing specific cellular functions. These results demonstrate that the integrated network approach not only allows a detailed analysis of proteome networks but also yields a functional decomposition of complex proteomic data sets, thereby providing deeper insights into the underlying cellular processes of the investigated system. PMID:24807868

  6. RGB Color Calibration for Quantitative Image Analysis: The “3D Thin-Plate Spline” Warping Approach

    PubMed Central

    Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado

    2012-01-01

    In recent years the need to define color numerically by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines for quantitatively comparing samples' color during workflows involving many devices. Several software programmes are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colours in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches: the first based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Squares analysis. Moreover, to explore device variability and resolution, two different cameras were adopted and, for each sensor, three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach showed very high calibration efficiency, opening the way to routine in-field colour quantification not only in food sciences but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, the approach allows the use of low-cost instruments while still returning scientifically sound quantitative data. PMID:22969337
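
    The study's calibration is a 3D thin-plate-spline warp of RGB space. As a simpler baseline of the same idea (mapping measured chart-patch colors onto their reference values), here is a hedged least-squares affine calibration sketch with illustrative data, not the paper's method:

```python
import numpy as np

def fit_affine_color_map(measured_rgb, reference_rgb):
    # Least-squares affine map: reference ≈ [measured, 1] @ M
    X = np.hstack([measured_rgb, np.ones((len(measured_rgb), 1))])
    M, *_ = np.linalg.lstsq(X, reference_rgb, rcond=None)
    return M

def apply_color_map(M, rgb):
    X = np.hstack([rgb, np.ones((len(rgb), 1))])
    return X @ M

# Illustrative chart patches: the "camera" applies a gain and offset
# to the reference colors, which the affine fit should undo exactly.
reference = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                      [0., 0., 1.], [1., 1., 1.]])
measured = 0.8 * reference + 0.05
M = fit_affine_color_map(measured, reference)
corrected = apply_color_map(M, measured)
print(np.allclose(corrected, reference))  # -> True
```

    A thin-plate-spline warp generalizes this by adding smooth nonlinear terms, which is what lets it outperform purely linear corrections under uncontrolled lighting.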

  7. 3D Actin Network Centerline Extraction with Multiple Active Contours

    PubMed Central

    Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei

    2013-01-01

    Fluorescence microscopy is frequently used to study two and three dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and actin cables. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we propose a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D Total Internal Reflection Fluorescence Microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy. Quantitative evaluation of the method using synthetic images shows that for images with SNR above 5.0, the average vertex error measured by the distance between our result and ground truth is 1 voxel, and the average Hausdorff distance is below 10 voxels. PMID:24316442

  8. Quantitative characterization of material surface — Application to Ni + Mo electrolytic composite coatings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubisztal, J., E-mail: julian.kubisztal@us.edu.pl

    A new approach to numerical analysis of maps of material surfaces has been proposed and discussed in detail. It was concluded that the roughness factor RF and the root mean square roughness S{sub q} show a saturation effect with increasing size of the analysed maps, which allows determination of the optimal map dimension representative of the examined material. A quantitative method for determining the predominant direction of the surface texture, based on the power spectral density function, is also proposed and discussed. The elaborated method was applied in surface analysis of Ni + Mo composite coatings. It was shown that co-deposition of molybdenum particles in the nickel matrix leads to an increase in surface roughness. In addition, a decrease in the size of the embedded Mo particles causes an increase in both the surface roughness and the surface texture. It was also shown that the relation between the roughness factor and the double layer capacitance C{sub dl} of the studied coatings is linear and allows determination of the double layer capacitance of the smooth nickel electrode. - Highlights: •Optimization of the procedure for the scanning of the material surface •Quantitative determination of the surface roughness and texture intensity •Proposition of the parameter describing the privileged direction of the surface texture •Determination of the double layer capacitance of the smooth electrode.
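
    The two roughness descriptors named in the abstract can be computed directly from a height map. A minimal sketch (the grid spacing and the gradient-based area estimate are illustrative choices, not the authors' exact procedure):

```python
import numpy as np

def rms_roughness(z):
    # S_q: root mean square deviation of heights about the mean plane.
    return float(np.sqrt(np.mean((z - z.mean()) ** 2)))

def roughness_factor(z, dx=1.0, dy=1.0):
    # RF: true (developed) surface area over projected area, estimated
    # from local surface gradients on the height map.
    zy, zx = np.gradient(z, dy, dx)
    true_area = np.sum(np.sqrt(1.0 + zx ** 2 + zy ** 2)) * dx * dy
    projected_area = z.size * dx * dy
    return float(true_area / projected_area)

flat = np.zeros((64, 64))
print(rms_roughness(flat), roughness_factor(flat))  # -> 0.0 1.0
```

    The saturation effect the authors report corresponds to these two values stabilizing as the analysed map is enlarged.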

  9. Quantitative characterisation of sedimentary grains

    NASA Astrophysics Data System (ADS)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments in the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till in County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
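
    Of the shape parameters listed, circularity is the simplest to compute from a traced grain boundary. A minimal sketch for a polygonal outline (the vertex data are illustrative, and the package's other descriptors are not reproduced here):

```python
import math

def polygon_area_perimeter(points):
    # Shoelace area and summed edge lengths of a closed polygon
    # given as a list of (x, y) boundary vertices.
    area = 0.0
    perim = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perim

def circularity(points):
    # 4*pi*A / P**2: equals 1.0 for a circle, lower for irregular grains.
    area, perim = polygon_area_perimeter(points)
    return 4.0 * math.pi * area / perim ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(round(circularity(square), 3))  # -> 0.785
```

    A unit square scores pi/4, illustrating how the measure penalizes departure from a circular outline.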

  10. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    The use of fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during experimental immune responses in vivo, as it provides a very simple and useful tool for this purpose. The motivation comes from the need for robust and simple quantification and presentation of inflammation data based on vascular permeability. Measuring the change in fluorescence intensity as a function of time is a widely accepted method of assessing vascular permeability during inflammation related to the immune response. In the present study we propose to bring a new dimension by applying a more sophisticated approach to the analysis of the vascular reaction, using quantitative methods derived from astronomical observations: a space-time Fourier filtering analysis followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach allows determination of the regions of permeability and monitoring of both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with its extravasation. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.
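
    Separating fast and slow kinetics rests on frequency-domain filtering of per-pixel intensity curves. A minimal temporal low-pass sketch (the cutoff and the synthetic signal are illustrative; the study's actual space-time filtering and mode decomposition are more elaborate):

```python
import numpy as np

def lowpass_time_series(signal, keep_fraction=0.1):
    # Keep only the lowest-frequency Fourier components of a pixel's
    # intensity-versus-time curve; higher frequencies (noise, fast
    # pulsatile components) are zeroed before the inverse transform.
    spectrum = np.fft.rfft(signal)
    cutoff = max(1, int(len(spectrum) * keep_fraction))
    spectrum[cutoff:] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

t = np.linspace(0, 1, 256, endpoint=False)
slow = np.sin(2 * np.pi * 2 * t)                  # slow kinetics (2 Hz)
noisy = slow + 0.2 * np.sin(2 * np.pi * 60 * t)   # + fast component
smoothed = lowpass_time_series(noisy, keep_fraction=0.08)
print(np.allclose(smoothed, slow))  # -> True
```

    In the imaging context, the retained low-frequency component tracks slow extravasation while the discarded band carries the fast circulatory signal.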

  11. Quantification of the methylation status of the PWS/AS imprinted region: comparison of two approaches based on bisulfite sequencing and methylation-sensitive MLPA.

    PubMed

    Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes

    2007-06-01

    Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as a model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS), whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS is caused by mosaic imprinting defects, which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation and copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate and reliable, and less time-consuming.

  12. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  13. Quantitative allochem compositional analysis of Lochkovian-Pragian boundary sections in the Prague Basin (Czech Republic)

    NASA Astrophysics Data System (ADS)

    Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich

    2017-06-01

    Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate ramp environments of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) applied to point-counted thin section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed allochem percentages and their centred log-ratio (clr) coordinates were used. Both approaches allow the lowstand, transgressive and highstand systems tracts within the Praha Formation to be distinguished; these show a gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of the deep-water environment below the storm wave base. Quantitative compositional data also indicate progradative-retrogradative trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition than the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
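
    The centred log-ratio transform applied alongside the raw percentages is straightforward: each part is divided by the geometric mean of all parts before taking logs. A minimal sketch (the point-count values are illustrative):

```python
import numpy as np

def clr(composition):
    # Centred log-ratio: log of each part over the geometric mean.
    # Parts must be strictly positive; the vector is closed to sum 1,
    # although clr is invariant to that scaling anyway.
    x = np.asarray(composition, dtype=float)
    x = x / x.sum()
    logx = np.log(x)
    return logx - logx.mean()

# Illustrative allochem point counts from one thin section
# (e.g. crinoids, dacryoconarids, other):
coords = clr([55, 30, 15])
print(abs(coords.sum()) < 1e-10)  # -> True
```

    Because clr coordinates are unconstrained by the constant-sum closure, standard multivariate statistics can be applied to them directly, which is why the clr trends respond more sensitively than raw percentages.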

  14. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de

    2015-03-31

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- and air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  15. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    PubMed

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
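
    The core of Tau-U is pairwise nonoverlap between baseline and intervention phases. A minimal sketch of the uncorrected Tau component (the probe scores are illustrative, and the baseline-trend correction that distinguishes full Tau-U is not shown):

```python
def tau_nonoverlap(baseline, treatment):
    # Tau (A vs. B nonoverlap): compare every baseline point with every
    # treatment point; ties contribute zero to the numerator.
    pos = sum(b > a for a in baseline for b in treatment)
    neg = sum(b < a for a in baseline for b in treatment)
    return (pos - neg) / (len(baseline) * len(treatment))

# Illustrative writing-probe scores: 4 baseline, 5 intervention points.
print(tau_nonoverlap([2, 3, 2, 3], [4, 5, 4, 6, 5]))  # -> 1.0
```

    A value of 1.0 means every intervention point exceeds every baseline point; values near 0 indicate complete overlap, which is what visual analysis alone can struggle to adjudicate.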

  16. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volumes and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade

  17. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
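
    A ppm-based mass-extraction-window translates into m/z bounds for the extracted-ion-chromatogram as follows; a minimal sketch (the m/z and MEW values are illustrative, not the paper's platform-specific recommendations):

```python
def mass_extraction_window(mz, mew_ppm):
    # Symmetric extraction window around the theoretical m/z.
    # The absolute width in Da scales linearly with m/z for a
    # ppm-defined window: half_width = mz * ppm / 1e6.
    half_width = mz * mew_ppm / 1e6
    return mz - half_width, mz + half_width

lo, hi = mass_extraction_window(500.0, 10.0)   # +/- 10 ppm
print(round(lo, 4), round(hi, 4))  # -> 499.995 500.005
```

    Too narrow a window risks false negatives when mass accuracy drifts; too wide a window risks false positives from isobaric interferences, which is why the paper ties the MEW to measured mass-accuracy distributions.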

  18. Droplet Microfluidic and Magnetic Particles Platform for Cancer Typing.

    PubMed

    Ferraro, Davide; Champ, Jérôme; Teste, Bruno; Serra, M; Malaquin, Laurent; Descroix, Stéphanie; de Cremoux, Patricia; Viovy, Jean-Louis

    2017-01-01

    Analyses of nucleic acids are routinely performed in hospital laboratories to detect gene alterations for cancer diagnosis and treatment decisions. Among the different possible investigations, mRNA analysis provides information on abnormal levels of gene expression. Standard laboratory methods are still not well adapted to the isolation and quantitation of low mRNA amounts, and new techniques need to be developed, in particular for the analysis of rare subsets. By reducing the volumes involved, the processing time, and the contamination risks, droplet microfluidics provides numerous advantages for performing analysis down to the single-cell level. We report on a droplet microfluidic platform based on the manipulation of magnetic particles that allows the clinical analysis of tumor tissues. In particular, it allows the extraction of mRNA from the total-RNA sample, reverse transcription, and cDNA amplification, all in droplets.

  19. Critical considerations for the qualitative and quantitative determination of process-induced disorder in crystalline solids.

    PubMed

    Newman, Ann; Zografi, George

    2014-09-01

    Solid-state instabilities in crystalline solids arise during processing primarily because a certain level of structural disorder has been introduced into the crystal. Many physical instabilities appear to be associated with the recrystallization of molecules from these disordered regions, while chemical instabilities arise from sufficient molecular mobility to allow solid-state chemical reactivity. In this Commentary we discuss the various forms of structural disorder, the types of processing that can produce disorder, the quantitative analysis of process-induced disorder, and strategies to limit disorder and its effects. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  20. A mixed methods contribution to the study of health public policies: complementarities and difficulties

    PubMed Central

    2015-01-01

    The use of mixed methods (combining quantitative and qualitative data) is developing in a variety of forms, especially in the health field. Our own research has adopted this perspective from the outset. We have sought all along to innovate in various ways and especially to develop an equal partnership, in the sense of not allowing any single approach to dominate. After briefly describing mixed methods, in this article we explain and illustrate how we have exploited both qualitative and quantitative methods to answer our research questions, ending with a reflective analysis of our experiment. PMID:26559730

  1. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein−based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. PMID:25823586

  2. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  3. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
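A minimal sketch of the kind of conversion described, using one common parameterization of the Muhleman scattering law for Venus. The DN offset and dB-per-DN scale below are placeholder values, not the actual product scaling, which must be taken from the Magellan image product documentation.

```python
import math

def muhleman_sigma0(incidence_deg):
    """Muhleman-law backscatter coefficient for Venus at a given incidence
    angle (one common parameterization of the Muhleman 1964 law)."""
    th = math.radians(incidence_deg)
    return 0.0118 * math.cos(th) / (math.sin(th) + 0.111 * math.cos(th)) ** 3

# Placeholder scaling constants (hypothetical); the real offset/scale come
# from the Magellan product specification.
DN_OFFSET, DB_PER_DN = 101, 0.2

def dn_to_sigma0(dn, incidence_deg):
    """Convert an image DN to an absolute backscatter coefficient, assuming
    the DN encodes backscatter in dB relative to the Muhleman law at the
    pixel's incidence angle."""
    relative_db = (dn - DN_OFFSET) * DB_PER_DN
    return muhleman_sigma0(incidence_deg) * 10.0 ** (relative_db / 10.0)
```

Once each pixel is back in physical backscatter-coefficient units, unit-to-unit statistical comparisons (means, variances, distribution tests) become meaningful regardless of pixel latitude.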

  4. Quantitative 3D Analysis of Nuclear Morphology and Heterochromatin Organization from Whole-Mount Plant Tissue Using NucleusJ.

    PubMed

    Desset, Sophie; Poulet, Axel; Tatout, Christophe

    2018-01-01

    Image analysis is a classical way to study nuclear organization. While nuclear organization used to be investigated by colorimetric or fluorescent labeling of DNA or specific nuclear compartments, new methods in microscopy imaging now enable qualitative and quantitative analyses of chromatin pattern, and nuclear size and shape. Several procedures have been developed to prepare samples in order to collect 3D images for the analysis of spatial chromatin organization, but only a few preserve the positional information of the cell within its tissue context. Here, we describe a whole-mount tissue preparation procedure coupled to DNA staining using the PicoGreen® intercalating agent suitable for image analysis of the nucleus in living and fixed tissues. 3D image analysis is then performed using NucleusJ, an open source ImageJ plugin, which allows for quantifying variations in nuclear morphology such as nuclear volume, sphericity, elongation, and flatness as well as in heterochromatin content and position with respect to the nuclear periphery.
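One of the shape descriptors mentioned, sphericity, can be computed directly from a nucleus's segmented volume and surface area. A minimal sketch using the standard definition (not NucleusJ's actual code):

```python
import math

def sphericity(volume, surface_area):
    """Sphericity: surface area of a sphere with the same volume divided
    by the object's actual surface area; 1.0 for a perfect sphere, smaller
    for irregular shapes."""
    return math.pi ** (1 / 3) * (6 * volume) ** (2 / 3) / surface_area

# Sanity check: a sphere of radius r has sphericity 1 by construction.
r = 3.0
v = 4 / 3 * math.pi * r ** 3
a = 4 * math.pi * r ** 2
```

Elongation and flatness are analogous ratios built from the lengths of the fitted ellipsoid's principal axes.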

  5. Evolution, Energy Landscapes and the Paradoxes of Protein Folding

    PubMed Central

    Wolynes, Peter G.

    2014-01-01

    Protein folding has been viewed as a difficult problem of molecular self-organization. The search problem involved in folding however has been simplified through the evolution of folding energy landscapes that are funneled. The funnel hypothesis can be quantified using energy landscape theory based on the minimal frustration principle. Strong quantitative predictions that follow from energy landscape theory have been widely confirmed both through laboratory folding experiments and from detailed simulations. Energy landscape ideas also have allowed successful protein structure prediction algorithms to be developed. The selection constraint of having funneled folding landscapes has left its imprint on the sequences of existing protein structural families. Quantitative analysis of co-evolution patterns allows us to infer the statistical characteristics of the folding landscape. These turn out to be consistent with what has been obtained from laboratory physicochemical folding experiments signalling a beautiful confluence of genomics and chemical physics. PMID:25530262

  6. [Emission of volatile components in carpenter's kumylotox into air].

    PubMed

    Jodynis-Liebert, J; Kiejnowska, M

    1991-01-01

    A study was made of Carpenter's Kumylotox, a fungicidal preparation containing: p-cumylphenol, dibutyl phthalate, machine oil, chloroparaffin, a 15% solution of ker-1500 rubber in painter's naphtha, and petrol for pastes. The preparation was applied onto boards placed in an experimental chamber at 1-week intervals. In the air of the chamber, dibutyl phthalate and p-cumylphenol were determined quantitatively by gas chromatography. The presence of hydrocarbons was recorded by the same method, without quantitative determination. Analyses were continued until the disappearance of the investigated compounds from the air. It was found that after only 2 weeks the p-cumylphenol level dropped below the allowable concentration of 0.015 mg/dm3. The dibutyl phthalate level decreased to the allowable concentration (0.05 mg/m3) only after 9 weeks of board ageing. According to analysis by the GC-MS method, aromatic hydrocarbons disappeared from the chamber's air after 5 weeks, and the remaining hydrocarbons after 9 weeks.

  7. Accurate ECG diagnosis of atrial tachyarrhythmias using quantitative analysis: a prospective diagnostic and cost-effectiveness study.

    PubMed

    Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M

    2010-11-01

    Quantitative ECG Analysis. Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity, and were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost-savings of $1,303 and 0.007 quality-adjusted-life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.

  8. Quantitative analysis of eyes and other optical systems in linear optics.

    PubMed

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis, although only the latter define an inner-product space, and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogeneous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.
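As a small illustration of quantitative analysis in symmetric dioptric power space, the three-dimensional inner-product space mentioned above, the sketch below builds 2 × 2 power matrices from sphero-cylindrical prescriptions and measures distances with the Frobenius norm. The matrix construction is the standard one from linear optics; this is illustrative, not code from the paper.

```python
import numpy as np

def power_matrix(sphere, cyl, axis_deg):
    """Symmetric 2x2 dioptric power matrix F for a sphero-cylindrical
    prescription (standard construction in linear optics)."""
    a = np.radians(axis_deg)
    s, c = np.sin(a), np.cos(a)
    return np.array([[sphere + cyl * s * s, -cyl * s * c],
                     [-cyl * s * c, sphere + cyl * c * c]])

def frobenius_distance(f1, f2):
    """Distance between two powers in the inner-product space of
    symmetric dioptric power matrices."""
    return np.linalg.norm(f1 - f2)

# Arithmetic means are well defined in a vector space of matrices.
f_mean = (power_matrix(-2.0, -1.0, 90.0) + power_matrix(-1.0, 0.0, 0.0)) / 2
```

The paper's point is that the same style of calculation (means, distances, angles) becomes available for whole optical systems once a suitable inner-product space of characteristics is identified.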

  9. Rapid, quantitative analysis of ppm/ppb nicotine using surface-enhanced Raman scattering from polymer-encapsulated Ag nanoparticles (gel-colls).

    PubMed

    Bell, Steven E J; Sirimuthu, Narayana M S

    2004-11-01

    Rapid, quantitative SERS analysis of nicotine at ppm/ppb levels has been carried out using stable and inexpensive polymer-encapsulated Ag nanoparticles (gel-colls). The strongest nicotine band (1030 cm(-1)) was measured against a d(5)-pyridine internal standard (974 cm(-1)) which was introduced during preparation of the stock gel-colls. Calibration plots of I(nic)/I(pyr) against the concentration of nicotine were non-linear, but plotting I(nic)/I(pyr) against [nicotine](x) (x = 0.6-0.75, depending on the exact experimental conditions) gave linear calibrations over the range 0.1-10 ppm with R(2) typically ca. 0.998. The RMS prediction error was found to be 0.10 ppm when the gel-colls were used for quantitative determination of unknown nicotine samples at the 1-5 ppm level. The main advantages of the method are that the gel-colls constitute a highly stable and reproducible SERS medium that allows high throughput (50 samples h(-1)) measurements.
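A power-law calibration of the kind described, I(nic)/I(pyr) linear in [nicotine]^x, can be fitted by log-log regression. A sketch with synthetic data; the prefactor and exponent values here are hypothetical (the exponent is chosen inside the 0.6-0.75 range reported):

```python
import numpy as np

# Synthetic calibration data following ratio = k * C**x.
k_true, x_true = 0.8, 0.7          # hypothetical values
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])   # ppm
ratio = k_true * conc ** x_true    # noise-free for illustration

# Log-log linear regression recovers the exponent and prefactor.
slope, intercept = np.polyfit(np.log(conc), np.log(ratio), 1)
x_est, k_est = slope, np.exp(intercept)

def predict_concentration(r):
    """Invert the fitted calibration to quantify an unknown sample."""
    return (r / k_est) ** (1.0 / x_est)
```

With real, noisy intensity ratios the same fit yields the RMS prediction error quoted in the abstract when applied to held-out samples.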

  10. Ratiometric spectral imaging for fast tumor detection and chemotherapy monitoring in vivo

    PubMed Central

    Hwang, Jae Youn; Gross, Zeev; Gray, Harry B.; Medina-Kauwe, Lali K.; Farkas, Daniel L.

    2011-01-01

    We report a novel in vivo spectral imaging approach to cancer detection and chemotherapy assessment. We describe and characterize a ratiometric spectral imaging and analysis method and evaluate its performance for tumor detection and delineation by quantitatively monitoring the specific accumulation of targeted gallium corrole (HerGa) into HER2-positive (HER2+) breast tumors. HerGa temporal accumulation in nude mice bearing HER2+ breast tumors was monitored comparatively by (a) this new ratiometric imaging and analysis method; (b) established (reflectance and fluorescence) spectral imaging; and (c) the more commonly used fluorescence intensity imaging. We also tested the feasibility of HerGa imaging in vivo using the ratiometric spectral imaging method for tumor detection and delineation. Our results show that the new method not only provides better quantitative information than typical spectral imaging, but also better specificity than standard fluorescence intensity imaging, thus allowing enhanced in vivo outlining of tumors and dynamic, quantitative monitoring of targeted chemotherapy agent accumulation into them. PMID:21721808
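The ratiometric step itself reduces to a pixel-wise band ratio. A minimal sketch, illustrative rather than the paper's full analysis pipeline, showing why the ratio is robust: multiplicative intensity variations (illumination, optical path) that affect both bands equally cancel out.

```python
import numpy as np

def ratiometric_map(band_signal, band_reference, eps=1e-6):
    """Pixel-wise ratio of a signal band to a reference band; `eps`
    guards against division by zero in dark pixels."""
    return band_signal / (band_reference + eps)
```

The same tissue imaged under brighter illumination produces (nearly) the same ratio map, which is what makes the quantity comparable across time points and animals.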

  11. X-ray vision of fuel sprays.

    PubMed

    Wang, Jin

    2005-03-01

    With brilliant synchrotron X-ray sources, microsecond time-resolved synchrotron X-ray radiography and tomography have been used to elucidate the detailed three-dimensional structure and dynamics of high-pressure high-speed fuel sprays in the near-nozzle region. The measurement allows quantitative determination of the fuel distribution in the optically impenetrable region owing to the multiple scattering of visible light by small atomized fuel droplets surrounding the jet. X-radiographs of the jet-induced shock waves prove that the fuel jets become supersonic under appropriate injection conditions and that the quantitative analysis of the thermodynamic properties of the shock waves can also be derived from the most direct measurement. In other situations where extremely axial-asymmetric sprays are encountered, mass deconvolution and cross-sectional fuel distribution models can be computed based on the monochromatic and time-resolved X-radiographic images collected from various rotational orientations of the sprays. Such quantitative analysis reveals the never-before-reported characteristics and most detailed near-nozzle mass distribution of highly transient fuel sprays.
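Quantitative X-radiography of this kind typically recovers path-integrated fuel mass from attenuation via the Beer-Lambert law. A minimal sketch with a hypothetical mass attenuation coefficient; the paper's mass deconvolution and cross-sectional distribution models are more involved.

```python
import numpy as np

def projected_fuel_mass(transmission, mass_atten_cm2_per_g):
    """Path-integrated fuel mass per unit area (g/cm^2) from measured
    X-ray transmission I/I0, via Beer-Lambert: I/I0 = exp(-mu_m * M)."""
    return -np.log(transmission) / mass_atten_cm2_per_g

# Hypothetical mass attenuation coefficient for the fuel at the probe
# X-ray energy (value for illustration only).
MU_M = 1.9  # cm^2/g
m = projected_fuel_mass(np.array([0.95, 0.90]), MU_M)
```

Combining such projected-mass maps from multiple rotational orientations is what allows tomographic reconstruction of the cross-sectional fuel distribution in axially asymmetric sprays.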

  12. MaGelLAn 1.0: a software to facilitate quantitative and population genetic analysis of maternal inheritance by combination of molecular and pedigree information.

    PubMed

    Ristov, Strahil; Brajkovic, Vladimir; Cubric-Curik, Vlatka; Michieli, Ivan; Curik, Ino

    2016-09-10

    Identification of genes or even nucleotides that are responsible for quantitative and adaptive trait variation is a difficult task due to the complex interdependence between a large number of genetic and environmental factors. The polymorphism of the mitogenome is one of the factors that can contribute to quantitative trait variation. However, the effects of the mitogenome have not been comprehensively studied, since large numbers of mitogenome sequences and recorded phenotypes are required to reach adequate power of analysis. Current research in our group focuses on acquiring the necessary mitochondrial sequence information and analysing its influence on the phenotype of a quantitative trait. To facilitate these tasks we have produced software for processing pedigrees that is optimised for maternal lineage analysis. We present MaGelLAn 1.0 (maternal genealogy lineage analyser), a suite of four Python scripts (modules) that is designed to facilitate the analysis of the impact of mitogenome polymorphism on quantitative trait variation by combining molecular and pedigree information. MaGelLAn 1.0 is primarily used to: (1) optimise the sampling strategy for molecular analyses; (2) identify and correct pedigree inconsistencies; and (3) identify maternal lineages and assign the corresponding mitogenome sequences to all individuals in the pedigree, this information being used as input to any of the standard software for quantitative genetic (association) analysis. In addition, MaGelLAn 1.0 allows computing the mitogenome (maternal) effective population sizes and the probability of mitogenome (maternal) identity that are useful for conservation management of small populations. MaGelLAn is the first tool for pedigree analysis that focuses on quantitative genetic analyses of mitogenome data. It was conceived to significantly reduce the effort of handling and preparing large pedigrees for processing the information linked to maternal lines.
The software source code, along with the manual and the example files can be downloaded at http://lissp.irb.hr/software/magellan-1-0/ and https://github.com/sristov/magellan .
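The core pedigree operation described in point (3), tracing maternal lines and propagating founder mitogenome haplotypes to every individual, can be sketched as follows. The data structures are illustrative, not MaGelLAn's actual code.

```python
def maternal_lineage(individual, dam_of):
    """Trace an individual's maternal line back to its founder dam.
    `dam_of` maps each individual to its dam (None for founders)."""
    line = [individual]
    while dam_of.get(line[-1]) is not None:
        line.append(dam_of[line[-1]])
    return line

def assign_mitogenomes(dam_of, founder_haplotype):
    """Assign each pedigree member the mitogenome sequence of its
    maternal-line founder, since mitogenomes are maternally inherited."""
    return {ind: founder_haplotype[maternal_lineage(ind, dam_of)[-1]]
            for ind in dam_of}

# Toy pedigree: two founder dams (F1, F2) with sequenced mitogenomes.
dams = {"F1": None, "F2": None, "d1": "F1", "d2": "d1", "d3": "F2"}
haps = {"F1": "mtA", "F2": "mtB"}
```

Because every member of a maternal line shares one mitogenome, sequencing only one individual per line (the sampling-strategy optimisation in point (1)) suffices to genotype the entire pedigree.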

  13. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis, including in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  14. UPLC-MS/MS quantitative analysis and structural fragmentation study of five Parmotrema lichens from the Eastern Ghats.

    PubMed

    Kumar, K; Siva, Bandi; Sarma, V U M; Mohabe, Satish; Reddy, A Madhusudana; Boustie, Joel; Tiwari, Ashok K; Rao, N Rama; Babu, K Suresh

    2018-07-15

    Comparative phytochemical analysis of five lichen species [Parmotrema tinctorum (Delise ex Nyl.) Hale, P. andinum (Mull. Arg.) Hale, P. praesorediosum (Nyl.) Hale, P. grayanum (Hue) Hale, P. austrosinense (Zahlbr.) Hale] of the Parmotrema genus was performed using two complementary UPLC-MS systems. The first system consisted of a high-resolution UPLC-QToF-MS/MS spectrometer, and the second of a UPLC-MS/MS in Multiple Reaction Monitoring (MRM) mode for quantitative analysis of major constituents in the selected lichen species. The individual compounds (47 compounds) were identified using Q-ToF-MS/MS via comparison of the exact molecular masses from their MS/MS spectra, comparison with literature data, and retention times matched to those of standard compounds isolated from the crude extract of the abundant lichen P. tinctorum. The analysis also allowed us to identify unknown peaks/compounds, which were further characterized by mass fragmentation studies. The quantitative MRM analysis was useful for better discrimination of species according to their chemical profiles. Moreover, determination of antioxidant activity (ABTS+ inhibition) and Advanced Glycation End-products (AGEs) inhibition carried out for the crude extracts revealed potential antiglycemic activity of P. austrosinense that remains to be confirmed. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Analysis of human serum lipoprotein lipid composition using MALDI-TOF mass spectrometry.

    PubMed

    Hidaka, Hiroya; Hanyu, Noboru; Sugano, Mitsutoshi; Kawasaki, Kenji; Yamauchi, Kazuyoshi; Katsuyama, Tsutomu

    2007-01-01

    This study used matrix-assisted laser desorption and ionization time-of-flight mass spectrometry (MALDI-TOF MS) to identify all lipid classes in human serum lipoproteins. After the major lipoprotein classes were isolated from serum by ultracentrifugation, the lipids were extracted and mixed with 2,5-dihydroxybenzoic acid (2,5-DHB) dissolved in Folch's solution (chloroform/methanol 2:1, v/v). MALDI-TOF MS analysis of the samples identified phospholipids (PLs), lysophospholipids (lysoPLs), sphingolipids (SLs), triglycerides (TGs), cholesteryl esters (CEs), and free cholesterol; it also showed the characteristics of individual fatty acid chains in serum lipids. MALDI-TOF MS allowed analysis of strongly hydrophobic and non-polar molecules such as CEs and TGs as well as hydrophilic molecules such as phospholipids. Direct analysis of fatty acids was not possible. The concentrations of lipids were not consistent with the ion peak intensities, since the extent of polarity affected the ionization characteristics of the molecules. However, lipid molecules with similar molecular structures but various fatty acid chains, such as phosphatidylcholines (PCs), were analyzed quantitatively by MALDI-TOF MS. Quantitative measurement of cholesterol was possible with the use of an internal standard. This study shows that MALDI-TOF MS can be used for direct investigation and quantitative analysis of the phospholipid composition of serum lipoproteins.

  16. NNAlign: A Web-Based Prediction Method Allowing Non-Expert End-User Discovery of Sequence Motifs in Quantitative Peptide Data

    PubMed Central

    Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten

    2011-01-01

    Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new “omics”-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign. PMID:22073191

  17. NNAlign: a web-based prediction method allowing non-expert end-user discovery of sequence motifs in quantitative peptide data.

    PubMed

    Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten

    2011-01-01

    Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new "omics"-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign.

  18. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    NASA Technical Reports Server (NTRS)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full field flow patterns applicable in various geometries in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV) and intrusive systems such as Hot Wire/Film Anemometry. The method makes the flow patterns obtainable and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. The method uses a computer based digitizing system attached through an imaging board to a low luminosity camera. A customized optical train allows the system to become a long distance microscope (LDM), allowing magnifications of areas of interest ranging up to 100 times. Presented in addition to the method itself are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air breathing aircraft engine using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.
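The displacement estimation underlying PIV is commonly done by cross-correlating interrogation windows from consecutive frames and locating the correlation peak. A minimal FFT-based sketch with integer-pixel accuracy; this is the generic technique, not the thesis's actual implementation.

```python
import numpy as np

def piv_displacement(frame_a, frame_b):
    """Estimate the integer-pixel displacement between two interrogation
    windows from the peak of their circular cross-correlation."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    # Cross-correlation via the FFT: DFT(corr) = conj(DFT(a)) * DFT(b).
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak indices to signed shifts.
    ny, nx = corr.shape
    return (dy if dy <= ny // 2 else dy - ny,
            dx if dx <= nx // 2 else dx - nx)

# Synthetic test: shift a random particle image by a known amount.
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, (3, -2), axis=(0, 1))
```

Velocity then follows as displacement times pixel size divided by the interframe interval; production PIV codes add sub-pixel peak fitting and window overlap.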

  19. Engineering cell-compatible paper chips for cell culturing, drug screening, and mass spectrometric sensing.

    PubMed

    Chen, Qiushui; He, Ziyi; Liu, Wu; Lin, Xuexia; Wu, Jing; Li, Haifang; Lin, Jin-Ming

    2015-10-28

    Paper-supported cell culture is an unprecedented development for advanced bioassays. This study reports a strategy for in vitro engineering of cell-compatible paper chips that allow for adherent cell culture, quantitative assessment of drug efficiency, and label-free sensing of intracellular molecules via paper spray mass spectrometry. The polycarbonate paper is employed as an excellent alternative bioscaffold for cell distribution, adhesion, and growth, as well as allowing for fluorescence imaging without light scattering. The cell-cultured paper chips are thus amenable to fabricate 3D tissue construction and cocultures by flexible deformation, stacks and assembly by layers of cells. As a result, the successful development of cell-compatible paper chips subsequently offers a uniquely flexible approach for in situ sensing of live cell components by paper spray mass spectrometry, allowing profiling the cellular lipids and quantitative measurement of drug metabolism with minimum sample pretreatment. Consequently, the developed paper chips for adherent cell culture are inexpensive for one-time use, compatible with high throughputs, and amenable to label-free and rapid analysis. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. An approach to quantitative sustainability assessment in the early stages of process design.

    PubMed

    Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio

    2008-06-15

    A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
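The normalization-and-aggregation scheme described can be sketched as follows, with hypothetical impact categories, reference-area burdens, and policy weights (the paper's actual indicator set and aggregation rules are more elaborate):

```python
def normalized_indices(impacts, reference_burden):
    """Normalize each impact of a process alternative against the
    existing burden of the reference area, giving dimensionless indices
    that are directly comparable across categories."""
    return {k: impacts[k] / reference_burden[k] for k in impacts}

def aggregate(indices, weights):
    """Weighted aggregation into a single overall sustainability
    performance indicator; the weights encode the site-specific
    sustainability policy."""
    return sum(indices[k] * weights[k] for k in indices)

# Hypothetical alternative: annual impacts vs. reference-area burdens.
proc = {"co2": 120.0, "water": 3.0}
ref = {"co2": 1.2e4, "water": 600.0}
w = {"co2": 0.7, "water": 0.3}
```

The vector of normalized indices is the "impact fingerprint" of an alternative, and the weighted sum is the single figure used for screening in early-stage design.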

  1. Direct quantitative evaluation of disease symptoms on living plant leaves growing under natural light.

    PubMed

    Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki

    2017-06-01

    Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images by digital cameras allows analysis and long-term storage of the acquired images. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but require detached plant materials to capture images, and the destructive processes involved often induce deformation of plant materials (curled leaves, faded colors, etc.). In this study, by using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and separately quantified two disease symptoms (brown lesions and yellow halos) using freely available image processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.
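    The pixel-classification step described above can be sketched with simple HSV thresholds. All threshold values and the function interface below are illustrative assumptions; the study used freely available image-processing software rather than this code:

```python
import numpy as np

def symptom_fractions(hue, sat, val):
    """Classify leaf pixels into brown-lesion and yellow-halo classes
    by simple HSV thresholds and return the area fraction of each.

    hue is in degrees (0-360); sat and val are in [0, 1].
    The thresholds are illustrative, not the study's values.
    """
    hue = np.asarray(hue, dtype=float)
    sat = np.asarray(sat, dtype=float)
    val = np.asarray(val, dtype=float)
    # Brown lesions: orange-brown hue with low brightness.
    brown = (hue >= 10) & (hue < 40) & (val < 0.5)
    # Yellow halos: yellow hue, well saturated and bright.
    yellow = (hue >= 40) & (hue < 70) & (sat > 0.3) & (val >= 0.5)
    n = hue.size
    return brown.sum() / n, yellow.sum() / n
```

Applied to a scanned leaf, the two returned fractions play the role of the study's two symptom measurements.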

  2. Acoustic Facies Analysis of Side-Scan Sonar Data

    NASA Astrophysics Data System (ADS)

    Dwan, Fa Shu

    Acoustic facies analysis methods have allowed the generation of system-independent values for the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU)² side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area contribution, and grazing angle effects. In the sonar equation, backscattering strength is the sonar parameter which is directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor signal returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU)² data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values have been compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB. A modified Lambert relationship, which consists of both intercept and slope terms, appears to represent the BSS versus grazing angle profiles better based on χ² testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing angle span of the BSS profiles.
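    Fitting the modified Lambert relationship described above, BSS = A + B·10·log10(sin²θ), reduces to ordinary least squares in the transformed angle variable. A minimal sketch (the function name and interface are assumptions, not from the dissertation):

```python
import numpy as np

def fit_modified_lambert(grazing_deg, bss_db):
    """Least-squares fit of a modified Lambert relationship
    BSS = A + B * 10*log10(sin^2(theta)) to backscattering-strength
    data. Classic Lambert's law corresponds to slope B = 1, with the
    intercept A being the Lambertian constant in dB.
    """
    theta = np.radians(np.asarray(grazing_deg, dtype=float))
    x = 10.0 * np.log10(np.sin(theta) ** 2)  # transformed grazing angle
    slope, intercept = np.polyfit(x, np.asarray(bss_db, dtype=float), 1)
    return intercept, slope
```

On synthetic data obeying classic Lambert's law with a -30 dB constant, the fit recovers A = -30 and B = 1.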

  3. Prospective elementary teachers' perceptions of the processes of modeling: A case study

    NASA Astrophysics Data System (ADS)

    Fazio, Claudio; di Paola, Benedetto; Guastella, Ivan

    2012-06-01

    In this paper we discuss a study on the approaches to modeling of students of the 4-year elementary school teacher program at the University of Palermo, Italy. The answers to a specially designed questionnaire are analyzed on the basis of an a priori analysis made using a general scheme of reference on the epistemology of mathematics and physics. The study is performed by using quantitative data analysis methods, i.e., factorial correspondence analysis and implicative analysis. A qualitative analysis of key words and terms used by students during interviews is also used to examine some aspects that emerged from the quantitative analysis. The study’s conclusions are consistent with previous research, but the use of quantitative data analysis allowed us to classify the students into three “profiles” related to different epistemological approaches to knowledge construction, and to show the implications of the different conceptual strategies used to answer the questionnaire, giving an estimation of the classification or implication “strength.” Some hints on how a course for elementary school physics and mathematics education can be planned to orient the future teachers to the construction of models of explanation are reported.

  4. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. 
To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690

  5. Quantitative comparison of the absorption spectra of the gas mixtures in analogy to the criterion of Pearson

    NASA Astrophysics Data System (ADS)

    Kistenev, Yu. V.; Kuzmin, D. A.; Sandykova, E. A.; Shapovalov, A. V.

    2015-11-01

    An approach to the reduction of the space of absorption spectra, based on an original criterion for profile analysis of the spectra, was proposed. The criterion derives from Pearson's well-known chi-square statistical test and allows the differences between spectral curves to be quantified.
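    A Pearson-style chi-square comparison of two spectral profiles can be sketched as follows. The unit-area normalization and the symmetric form below are common conventions and are assumptions here, not necessarily the paper's exact criterion:

```python
import numpy as np

def chi_square_distance(spectrum_a, spectrum_b):
    """Pearson-style chi-square distance between two absorption
    spectra sampled on the same wavenumber grid.

    Each spectrum is normalized to unit area first, so the measure
    compares profile shape rather than absolute intensity. Identical
    profiles give 0; completely disjoint profiles give 1.
    """
    a = np.asarray(spectrum_a, dtype=float)
    b = np.asarray(spectrum_b, dtype=float)
    a = a / a.sum()
    b = b / b.sum()
    denom = a + b
    mask = denom > 0  # avoid division by zero in empty channels
    return 0.5 * np.sum((a[mask] - b[mask]) ** 2 / denom[mask])
```

A distance of this kind turns each pair of gas-mixture spectra into a single number, which is what makes the reduction of the spectral space quantitative.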

  6. Time-Lapse Videos for Physics Education: Specific Examples

    ERIC Educational Resources Information Center

    Vollmer, Michael; Möllmann, Klaus-Peter

    2018-01-01

    There are many physics experiments with time scales so long that they are usually shown neither in the physics classroom nor in student labs. However, they can easily be recorded with time-lapse cameras, and the resulting time-lapse videos allow qualitative and/or quantitative analysis of the underlying physics. Here, we present some examples…

  7. Explicating Metatheory for Mixed Methods Research in Educational Leadership: An Application of Habermas's "Theory of Communicative Action"

    ERIC Educational Resources Information Center

    Whiteman, Rodney S.

    2015-01-01

    Purpose: Mixed methods research can provide a fruitful line of inquiry for educational leadership, program evaluation, and policy analysis; however, mixed methods research requires a metatheory that allows for mixing what have traditionally been considered incompatible qualitative and quantitative inquiry. The purpose of this paper is to apply…

  8. What Do Secondary Students Really Learn during Investigations with Living Animals? Parameters for Effective Learning with Social Insects

    ERIC Educational Resources Information Center

    Sammet, Rebecca; Dreesmann, Daniel

    2017-01-01

    Exemplary for social insects, "Temnothorax" ants allow for various hands-on investigations in biology classes. The aim of this study was to provide a quantitative and qualitative analysis of secondary school students' learning achievement after teaching units with ants lasting between one and six weeks. The questionnaires included…

  9. Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.

    PubMed

    Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke

    2018-01-01

    With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.

  10. Quantitative Analysis of Human Pluripotency and Neural Specification by In-Depth (Phospho)Proteomic Profiling.

    PubMed

    Singec, Ilyas; Crain, Andrew M; Hou, Junjie; Tobe, Brian T D; Talantova, Maria; Winquist, Alicia A; Doctor, Kutbuddin S; Choy, Jennifer; Huang, Xiayu; La Monaca, Esther; Horn, David M; Wolf, Dieter A; Lipton, Stuart A; Gutierrez, Gustavo J; Brill, Laurence M; Snyder, Evan Y

    2016-09-13

    Controlled differentiation of human embryonic stem cells (hESCs) can be utilized for precise analysis of cell type identities during early development. We established a highly efficient neural induction strategy and an improved analytical platform, and determined proteomic and phosphoproteomic profiles of hESCs and their specified multipotent neural stem cell derivatives (hNSCs). This quantitative dataset (nearly 13,000 proteins and 60,000 phosphorylation sites) provides unique molecular insights into pluripotency and neural lineage entry. Systems-level comparative analysis of proteins (e.g., transcription factors, epigenetic regulators, kinase families), phosphorylation sites, and numerous biological pathways allowed the identification of distinct signatures in pluripotent and multipotent cells. Furthermore, as predicted by the dataset, we functionally validated an autocrine/paracrine mechanism by demonstrating that the secreted protein midkine is a regulator of neural specification. This resource is freely available to the scientific community, including a searchable website, PluriProt. Published by Elsevier Inc.

  11. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well documented in the literature: DDA is typically applied for deep discovery, while DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  12. Quantitative analysis of sesquiterpene lactones in extract of Arnica montana L. by 1H NMR spectroscopy.

    PubMed

    Staneva, Jordanka; Denkova, Pavletta; Todorova, Milka; Evstatieva, Ljuba

    2011-01-05

    (1)H NMR spectroscopy was used as a method for quantitative analysis of sesquiterpene lactones present in a crude lactone fraction isolated from Arnica montana. Eight main components - tigloyl-, methacryloyl-, isobutyryl- and 2-methylbutyryl-esters of helenalin (H) and 11α,13-dihydrohelenalin (DH) - were identified in the studied sample. The method allows determination of the total amount of sesquiterpene lactones as well as separate quantities for the helenalin-type and 11α,13-dihydrohelenalin-type esters. Furthermore, 6-O-tigloylhelenalin (HT, 1), 6-O-methacryloylhelenalin (HM, 2), 6-O-tigloyl-11α,13-dihydrohelenalin (DHT, 5), and 6-O-methacryloyl-11α,13-dihydrohelenalin (DHM, 6) were quantified as individual components. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Continuous EEG monitoring in the intensive care unit.

    PubMed

    Scheuer, Mark L

    2002-01-01

    Continuous EEG (CEEG) monitoring allows uninterrupted assessment of cerebral cortical activity with good spatial resolution and excellent temporal resolution. Thus, this procedure provides a means of constantly assessing brain function in critically ill obtunded and comatose patients. Recent advances in digital EEG acquisition, storage, quantitative analysis, and transmission have made CEEG monitoring in the intensive care unit (ICU) technically feasible and useful. This article summarizes the indications and methodology of CEEG monitoring in the ICU, and discusses the role of some quantitative EEG analysis techniques in near real-time remote observation of CEEG recordings. Clinical examples of CEEG use, including monitoring of status epilepticus, assessment of ongoing therapy for treatment of seizures in critically ill patients, and monitoring for cerebral ischemia, are presented. Areas requiring further development of CEEG monitoring techniques and indications are discussed.
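    One of the simplest quantitative EEG measures used for near real-time trend displays of this kind is relative band power computed from the periodogram. A minimal sketch (a generic illustration, not a specific clinical algorithm):

```python
import numpy as np

def band_power(signal, fs, band):
    """Relative power of an EEG epoch in a frequency band.

    Computes the periodogram |FFT|^2 of the epoch and returns the
    fraction of total power falling in [band[0], band[1]) Hz. Trends
    of such band-power ratios over hours of CEEG are one common way
    to flag seizures or ischemia for remote review.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    in_band = (freqs >= band[0]) & (freqs < band[1])
    return psd[in_band].sum() / psd.sum()
```

For example, a pure 10 Hz oscillation puts essentially all of its power in the 8-13 Hz alpha band.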

  14. Python for Information Theoretic Analysis of Neural Data

    PubMed Central

    Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano

    2008-01-01

    Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, enabling analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
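    The basic quantity behind these analyses, the mutual information between stimulus and response, can be estimated with the plug-in (maximum likelihood) estimator; its upward bias at limited sample sizes is exactly the problem the authors discuss. A minimal sketch in plain NumPy (not the authors' library):

```python
import numpy as np

def mutual_information_plugin(stimuli, responses):
    """Plug-in estimate of mutual information I(S;R) in bits from
    paired discrete observations.

    Uses the empirical joint and marginal probabilities directly.
    Note: this estimator is biased upward for limited samples, which
    is why bias-correction techniques are needed in practice.
    """
    stimuli = np.asarray(stimuli)
    responses = np.asarray(responses)
    n = len(stimuli)
    # Empirical joint counts and marginal probabilities.
    joint = {}
    for s, r in zip(stimuli, responses):
        joint[(s, r)] = joint.get((s, r), 0) + 1
    ps = {s: np.mean(stimuli == s) for s in set(stimuli.tolist())}
    pr = {r: np.mean(responses == r) for r in set(responses.tolist())}
    info = 0.0
    for (s, r), count in joint.items():
        p_sr = count / n
        info += p_sr * np.log2(p_sr / (ps[s] * pr[r]))
    return info
```

A perfectly informative binary response carries 1 bit about a binary stimulus; an independent response carries 0 bits.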

  15. The computer treatment of remotely sensed data: An introduction to techniques which have geologic applications. [image enhancement and thematic classification in Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Paradella, W. R.; Vitorello, I.

    1982-01-01

    Several aspects of computer-assisted analysis techniques for image enhancement and thematic classification, by which LANDSAT MSS imagery may be treated quantitatively, are explained. For geological applications, computer processing of digital data arguably allows the fullest use of LANDSAT data, by displaying enhanced and corrected data for visual analysis and by evaluating and assigning each spectral pixel information to a given class.

  16. Analysis of small carbohydrates in several bioactive botanicals by gas chromatography with mass spectrometry and liquid chromatography with tandem mass spectrometry.

    PubMed

    Moldoveanu, Serban; Scott, Wayne; Zhu, Jeff

    2015-11-01

    Bioactive botanicals contain natural compounds with specific biological activity, such as antibacterial, antioxidant, immune-stimulating, and taste-improving properties. A full characterization of the chemical composition of these botanicals is frequently necessary. A study of small carbohydrates from the plant materials of 18 bioactive botanicals is described. The study presents the identification of the carbohydrates using a gas chromatographic-mass spectrometric analysis that allows detection of molecules as large as maltotetraose, after converting them into trimethylsilyl derivatives. A number of carbohydrates in the plants (fructose, glucose, mannose, sucrose, maltose, xylose, sorbitol, and myo-, chiro-, and scyllo-inositols) were quantitated using a novel liquid chromatography with tandem mass spectrometry technique. Both techniques involved new method development. The gas chromatography with mass spectrometry analysis involved derivatization and separation on a Rxi®-5Sil MS column with H2 as a carrier gas. The liquid chromatographic separation was obtained using a hydrophilic interaction type column, YMC-PAC Polyamine II. The tandem mass spectrometer used an electrospray ionization source in multiple reaction monitoring positive ion mode with detection of the adducts of the carbohydrates with Cs+ ions. The validated quantitative procedure showed excellent precision and accuracy, allowing analysis over a wide range of analyte concentrations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. The Separation and Quantitation of Peptides with and without Oxidation of Methionine and Deamidation of Asparagine Using Hydrophilic Interaction Liquid Chromatography with Mass Spectrometry (HILIC-MS)

    NASA Astrophysics Data System (ADS)

    Badgett, Majors J.; Boyes, Barry; Orlando, Ron

    2017-05-01

    Peptides with deamidated asparagine residues and oxidized methionine residues are often not resolved sufficiently to allow quantitation of their native and modified forms using reversed phase (RP) chromatography. The accurate quantitation of these modifications is vital in protein biotherapeutic analysis because they can affect a protein's function, activity, and stability. We demonstrate here that hydrophilic interaction liquid chromatography (HILIC) adequately and predictably separates peptides with these modifications from their native counterparts. Furthermore, coefficients describing the extent of the hydrophilicity of these modifications have been derived and were incorporated into a previously made peptide retention prediction model that is capable of predicting the retention times of peptides with and without these modifications.

  18. Quantitation of the phosphoproteome using the library-assisted extracted ion chromatogram (LAXIC) strategy.

    PubMed

    Arrington, Justine V; Xue, Liang; Tao, W Andy

    2014-01-01

    Phosphorylation is a key posttranslational modification that regulates many signaling pathways, but quantifying changes in phosphorylation between samples can be challenging due to its low stoichiometry within cells. We have introduced a mass spectrometry-based label-free quantitation strategy termed LAXIC for the analysis of the phosphoproteome. This method uses a spiked-in synthetic peptide library designed to elute across the entire chromatogram for local normalization of phosphopeptides within complex samples. Normalization of phosphopeptides by library peptides that co-elute within a small time frame accounts for fluctuating ion suppression effects, allowing more accurate quantitation even when LC-MS performance varies. Here we explain the premise of LAXIC, the design of a suitable peptide library, and how the LAXIC algorithm can be implemented with software developed in-house.
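    The local-normalization idea can be sketched as follows. The window width, the use of a median, and the function interface are illustrative assumptions; the published LAXIC algorithm differs in detail:

```python
import numpy as np

def laxic_normalize(phospho_rt, phospho_intensity,
                    library_rt, library_intensity, window=1.0):
    """Sketch of LAXIC-style local normalization.

    Each phosphopeptide intensity is divided by the median intensity
    of spiked-in library peptides eluting within +/- `window` minutes
    of it, compensating for ion-suppression effects that vary along
    the chromatogram.
    """
    library_rt = np.asarray(library_rt, dtype=float)
    library_intensity = np.asarray(library_intensity, dtype=float)
    normalized = []
    for rt, inten in zip(phospho_rt, phospho_intensity):
        nearby = np.abs(library_rt - rt) <= window
        if not nearby.any():
            normalized.append(np.nan)  # no co-eluting library peptides
            continue
        normalized.append(inten / np.median(library_intensity[nearby]))
    return np.array(normalized)
```

Because each phosphopeptide is scaled only by library peptides that co-elute with it, run-to-run fluctuations in LC-MS performance at that point in the gradient cancel out, which is the premise of the method.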

  19. The Separation and Quantitation of Peptides with and without Oxidation of Methionine and Deamidation of Asparagine Using Hydrophilic Interaction Liquid Chromatography with Mass Spectrometry (HILIC-MS).

    PubMed

    Badgett, Majors J; Boyes, Barry; Orlando, Ron

    2017-05-01

    Peptides with deamidated asparagine residues and oxidized methionine residues are often not resolved sufficiently to allow quantitation of their native and modified forms using reversed phase (RP) chromatography. The accurate quantitation of these modifications is vital in protein biotherapeutic analysis because they can affect a protein's function, activity, and stability. We demonstrate here that hydrophilic interaction liquid chromatography (HILIC) adequately and predictably separates peptides with these modifications from their native counterparts. Furthermore, coefficients describing the extent of the hydrophilicity of these modifications have been derived and were incorporated into a previously made peptide retention prediction model that is capable of predicting the retention times of peptides with and without these modifications.
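    An additive retention model of the kind described can be sketched as below. Every coefficient value here is illustrative, not one of the published coefficients:

```python
def predict_retention(tokens, coefficients, intercept=0.0):
    """Sketch of an additive HILIC retention model: predicted retention
    is an intercept plus the sum of a hydrophilicity coefficient for
    each residue token. Modified residues (e.g. 'N-deam' for deamidated
    asparagine, 'M-ox' for oxidized methionine) carry their own
    coefficients, which is how a model of this kind can predict the
    separation of native and modified forms.
    """
    return intercept + sum(coefficients[t] for t in tokens)

# Illustrative coefficients (larger = more hydrophilic = later HILIC elution).
coeffs = {"G": 0.9, "N": 1.8, "N-deam": 2.4, "M": 0.5, "M-ox": 1.3}
native = predict_retention(["G", "N", "M"], coeffs)
oxidized = predict_retention(["G", "N", "M-ox"], coeffs)
```

Because oxidation and deamidation both increase hydrophilicity, the modified peptide is predicted to elute later than its native counterpart, mirroring the separation the abstract reports.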

  20. Onset dynamics of action potentials in rat neocortical neurons and identified snail neurons: quantification of the difference.

    PubMed

    Volgushev, Maxim; Malyshev, Aleksey; Balaban, Pavel; Chistiakova, Marina; Volgushev, Stanislav; Wolf, Fred

    2008-04-09

    The generation of action potentials (APs) is a key process in the operation of nerve cells and the communication between neurons. Action potentials in mammalian central neurons are characterized by an exceptionally fast onset dynamics, which differs from the typically slow and gradual onset dynamics seen in identified snail neurons. Here we describe a novel method of analysis which provides a quantitative measure of the onset dynamics of action potentials. This method captures the difference between the fast, step-like onset of APs in rat neocortical neurons and the gradual, exponential-like AP onset in identified snail neurons. The quantitative measure of the AP onset dynamics, provided by the method, allows us to perform quantitative analyses of factors influencing the dynamics.

  1. Onset Dynamics of Action Potentials in Rat Neocortical Neurons and Identified Snail Neurons: Quantification of the Difference

    PubMed Central

    Volgushev, Maxim; Malyshev, Aleksey; Balaban, Pavel; Chistiakova, Marina; Volgushev, Stanislav; Wolf, Fred

    2008-01-01

    The generation of action potentials (APs) is a key process in the operation of nerve cells and the communication between neurons. Action potentials in mammalian central neurons are characterized by an exceptionally fast onset dynamics, which differs from the typically slow and gradual onset dynamics seen in identified snail neurons. Here we describe a novel method of analysis which provides a quantitative measure of the onset dynamics of action potentials. This method captures the difference between the fast, step-like onset of APs in rat neocortical neurons and the gradual, exponential-like AP onset in identified snail neurons. The quantitative measure of the AP onset dynamics, provided by the method, allows us to perform quantitative analyses of factors influencing the dynamics. PMID:18398478

  2. A gradient method for the quantitative analysis of cell movement and tissue flow and its application to the analysis of multicellular Dictyostelium development.

    PubMed

    Siegert, F; Weijer, C J; Nomura, A; Miike, H

    1994-01-01

    We describe the application of a novel image processing method, which allows quantitative analysis of cell and tissue movement in a series of digitized video images. The result is a vector velocity field showing average direction and velocity of movement for every pixel in the frame. We apply this method to the analysis of cell movement during different stages of the Dictyostelium developmental cycle. We analysed time-lapse video recordings of cell movement in single cells, mounds and slugs. The program can correctly assess the speed and direction of movement of either unlabelled or labelled cells in a time series of video images depending on the illumination conditions. Our analysis of cell movement during multicellular development shows that the entire morphogenesis of Dictyostelium is characterized by rotational cell movement. The analysis of cell and tissue movement by the velocity field method should be applicable to the analysis of morphogenetic processes in other systems such as gastrulation and neurulation in vertebrate embryos.
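    A gradient method of this general kind estimates, for each pixel, the velocity that best explains the spatial and temporal image gradients over a small window. The following is a Lucas-Kanade-style least-squares sketch, not the authors' exact algorithm:

```python
import numpy as np

def velocity_field(frame1, frame2, half_window=2):
    """Gradient-based per-pixel velocity field from two consecutive
    frames, assuming brightness constancy: Ix*vx + Iy*vy + It = 0.

    For each pixel, the constraint is solved in the least-squares
    sense over a small surrounding window, yielding average direction
    and speed of movement for every pixel.
    """
    f1 = np.asarray(frame1, dtype=float)
    f2 = np.asarray(frame2, dtype=float)
    Iy, Ix = np.gradient(f1)   # spatial gradients (rows = y, cols = x)
    It = f2 - f1               # temporal gradient
    h, w = f1.shape
    vx = np.zeros((h, w))
    vy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - half_window), min(h, i + half_window + 1)
            j0, j1 = max(0, j - half_window), min(w, j + half_window + 1)
            ix = Ix[i0:i1, j0:j1].ravel()
            iy = Iy[i0:i1, j0:j1].ravel()
            it = It[i0:i1, j0:j1].ravel()
            A = np.stack([ix, iy], axis=1)
            ATA = A.T @ A
            if np.linalg.det(ATA) < 1e-9:  # flat region: motion unrecoverable
                continue
            v = np.linalg.solve(ATA, -A.T @ it)
            vx[i, j], vy[i, j] = v
    return vx, vy
```

Shifting a smooth intensity pattern one pixel to the right produces a velocity field close to (1, 0) wherever the image has enough texture, illustrating how such a method turns a video time series into the vector field described above.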

  3. Photogrammetry of the Human Brain: A Novel Method for Three-Dimensional Quantitative Exploration of the Structural Connectivity in Neurosurgery and Neurosciences.

    PubMed

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons to select the most effective approaches for brain resections. Although standard microdissection is a validated technique to investigate the different white matter (WM) pathways and to verify the results of tractography, the possibility of interactive exploration of the specimens and reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing accurate metrology on highly detailed three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique for supporting 3D exploration and quantitative analysis of cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle, were exposed using the Klingler technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen. All the acquisitions were coregistered into an open-source model. We analyzed 5 steps, including the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicle, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point cloud models was highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during the postdissection analysis. These results open many new promising neuroscientific and educational perspectives and can also improve the quality of neurosurgical treatment. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Development of an agricultural job-exposure matrix for British Columbia, Canada.

    PubMed

    Wood, David; Astrakianakis, George; Lang, Barbara; Le, Nhu; Bert, Joel

    2002-09-01

    Farmers in British Columbia (BC), Canada have been shown to have unexplained elevated proportional mortality rates for several cancers. Because agricultural exposures have never been documented systematically in BC, a quantitative agricultural job-exposure matrix (JEM) was developed containing exposure assessments from 1950 to 1998. This JEM was developed to document historical exposures and to facilitate future epidemiological studies. Available information regarding BC farming practices was compiled and checklists of potential exposures were produced for each crop. Exposures identified included chemical, biological, and physical agents. Interviews with farmers and agricultural experts were conducted using the checklists as a starting point. This allowed the creation of an initial or 'potential' JEM based on three axes: exposure agent, 'type of work' and time. The 'type of work' axis was determined by combining several variables: region, crop, job title and task. This allowed for a complete description of exposures. Exposure assessments were made quantitatively, where data allowed, or by a dichotomous variable (exposed/unexposed). Quantitative calculations were divided into re-entry and application scenarios. 'Re-entry' exposures were quantified using a standard exposure model with some modification, while application exposure estimates were derived using data from the North American Pesticide Handlers Exposure Database (PHED). As expected, exposures differed between crops and job titles both quantitatively and qualitatively. Of the 290 agents included in the exposure axis, 180 were pesticides. Over 3000 estimates of exposure were conducted; 50% of these were quantitative. Each quantitative estimate was at the daily absorbed dose level. Exposure estimates were then rated as high, medium, or low based on comparing them with their respective oral chemical reference dose (RfD) or Acceptable Daily Intake (ADI). 
These data were obtained mainly from the US Environmental Protection Agency (EPA) Integrated Risk Information System database. Of the quantitative estimates, 74% were rated as low (<100% of the reference value) and only 10% were rated as high (>500%). The JEM resulting from this study fills a void concerning exposures for BC farmers and farm workers. While only limited validation of the assessments was possible, this JEM can serve as a benchmark for future studies. Preliminary analysis at the BC Cancer Agency (BCCA) using the JEM with prostate cancer records from a large cancer and occupation study/survey has already shown promising results. Development of this JEM provides a useful model for developing historical quantitative exposure estimates where there is very little documented information available.
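    The high/medium/low rating step described above can be sketched as a small helper. The function name and argument units are hypothetical; the cut-offs are the ones quoted in the abstract (low below 100% of the reference value, high above 500%):

```python
def rate_exposure(daily_dose, reference_dose):
    """Rate a daily absorbed dose against its oral RfD or ADI.

    Cut-offs follow the abstract: low < 100% of the reference value,
    high > 500%, medium otherwise. (Function name and argument units
    are illustrative, not from the paper.)
    """
    pct = 100.0 * daily_dose / reference_dose
    if pct < 100.0:
        return "low"
    if pct > 500.0:
        return "high"
    return "medium"
```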

  5. Dual reporter transgene driven by 2.3Col1a1 promoter is active in differentiated osteoblasts

    NASA Technical Reports Server (NTRS)

    Marijanovic, Inga; Jiang, Xi; Kronenberg, Mark S.; Stover, Mary Louise; Erceg, Ivana; Lichtler, Alexander C.; Rowe, David W.

    2003-01-01

    AIM: As quantitative and spatial analyses of promoter reporter constructs are not easily performed in intact bone, we designed a reporter gene specific to bone, which could be analyzed both visually and quantitatively by using chloramphenicol acetyltransferase (CAT) and a cyan version of green fluorescent protein (GFPcyan), driven by a 2.3-kb fragment of the rat collagen promoter (Col2.3). METHODS: The construct Col2.3CATiresGFPcyan was used for generating transgenic mice. Quantitative measurement of promoter activity was performed by CAT analysis of different tissues derived from transgenic animals; localization was performed by visualizing GFP in frozen bone sections. To assess transgene expression during in vitro differentiation, marrow stromal cell and neonatal calvarial osteoblast cultures were analyzed for CAT and GFP activity. RESULTS: In mice, CAT activity was detected in the calvaria, long bone, teeth, and tendon, whereas histology showed that GFP expression was limited to osteoblasts and osteocytes. In cell culture, increased activity of CAT correlated with increased differentiation, and GFP activity was restricted to mineralized nodules. CONCLUSION: The concept of a dual reporter allows simultaneous visual and quantitative analysis of transgene activity in bone.

  6. Clinical significance of quantitative analysis of facial nerve enhancement on MRI in Bell's palsy.

    PubMed

    Song, Mee Hyun; Kim, Jinna; Jeon, Ju Hyun; Cho, Chang Il; Yoo, Eun Hye; Lee, Won-Sang; Lee, Ho-Ki

    2008-11-01

    Quantitative analysis of the facial nerve on the lesion side as well as the normal side, which allowed for more accurate measurement of facial nerve enhancement in patients with facial palsy, showed statistically significant correlation with the initial severity of facial nerve inflammation, although little prognostic significance was shown. This study investigated the clinical significance of quantitative measurement of facial nerve enhancement in patients with Bell's palsy by analyzing the enhancement pattern and correlating MRI findings with the initial severity of facial palsy and clinical outcome. Facial nerve enhancement was measured quantitatively by using regions of interest on pre- and postcontrast T1-weighted images in 44 patients diagnosed with Bell's palsy. The signal intensity increase on the lesion side was first compared with that of the contralateral side and then correlated with the initial degree of facial palsy and prognosis. The lesion side showed a significantly higher signal intensity increase compared with the normal side in all of the segments except for the mastoid segment. Signal intensity increase at the internal auditory canal and labyrinthine segments showed correlation with the initial degree of facial palsy, but no significant difference was found between different prognostic groups.
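    The region-of-interest comparison described here amounts to a percent signal-intensity increase computed from mean ROI values on pre- and postcontrast images. A minimal sketch; the exact formulation and the sample numbers are illustrative, not from the study:

```python
import numpy as np

def signal_increase(pre_roi, post_roi):
    """Percent signal-intensity increase of an ROI between pre- and
    post-contrast images: 100 * (post - pre) / pre, using ROI means.
    (This exact formulation is an assumption, not quoted from the study.)"""
    pre = float(np.mean(pre_roi))
    post = float(np.mean(post_roi))
    return 100.0 * (post - pre) / pre

# Toy pixel values for the same segment on the lesion and normal sides
lesion = signal_increase([100, 102, 98], [140, 150, 145])
normal = signal_increase([100, 101, 99], [112, 110, 108])
```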

  7. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms, and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and neurodegenerative diseases, including Parkinson’s disease and Alzheimer’s disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH), and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF).
While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge. Nevertheless, the quality and depth of the more recent quantitative proteomics studies are beginning to shed light on a number of aspects of neuroscience relating to normal brain function, as well as to the changes in protein expression and regulation that occur in neuropsychiatric and neurodegenerative disorders. PMID:23623823

  8. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms, and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and neurodegenerative diseases, including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH), and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF).
While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge. Nevertheless, the quality and depth of the more recent quantitative proteomics studies are beginning to shed light on a number of aspects of neuroscience relating to normal brain function, as well as to the changes in protein expression and regulation that occur in neuropsychiatric and neurodegenerative disorders. Copyright © 2013. Published by Elsevier Inc.

  9. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy.

    PubMed

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-11-19

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformational changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency-domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on the non-FRET channels, i.e. the donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method by quantifying three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.

  10. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy

    PubMed Central

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-01-01

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformational changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency-domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on the non-FRET channels, i.e. the donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method by quantifying three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535

  11. Analysis of the time structure of synchronization in multidimensional chaotic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarenko, A. V., E-mail: avm.science@mail.ru

    2015-05-15

    A new approach is proposed to the integrated analysis of the time structure of synchronization of multidimensional chaotic systems. The method allows one to diagnose and quantitatively evaluate the intermittency characteristics during synchronization of chaotic oscillations in the T-synchronization mode. A system of two identical logistic mappings with unidirectional coupling that operate in the developed chaos regime is analyzed. It is shown that the widely used approach, in which only synchronization patterns are subjected to analysis while desynchronization areas are considered as a background signal and removed from analysis, should be regarded as methodologically incomplete.
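    A minimal numerical sketch of the kind of system analyzed, two identical logistic maps with unidirectional (drive-response) coupling, is below. The diffusive coupling form and parameter values are common textbook choices, assumed here rather than taken from the paper:

```python
import numpy as np

def coupled_logistic(r=4.0, eps=0.3, n=2000, seed=1):
    """Drive x and response y: identical logistic maps in the developed-chaos
    regime (r = 4), with y pulled toward the drive by coupling strength eps
    (0 = uncoupled, 1 = complete replacement). Returns |x - y| per step."""
    rng = np.random.default_rng(seed)
    x, y = rng.random(), rng.random()
    err = np.empty(n)
    for i in range(n):
        fx, fy = r * x * (1.0 - x), r * y * (1.0 - y)
        x, y = fx, (1.0 - eps) * fy + eps * fx
        err[i] = abs(x - y)
    return err

# Intermittency diagnostic: fraction of steps inside a synchronization band
err = coupled_logistic(eps=0.3)
sync_fraction = np.mean(err < 1e-3)
```

    Analyzing the time structure of `err` (lengths of synchronized laminar phases vs. desynchronization bursts), rather than discarding the bursts, is the point the abstract makes.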

  12. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths, with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs.
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular, and can be extended to other helminthic diseases, which together afflict a large part of humankind.
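    A core ingredient of such time-series comparison is the similarity measure. A minimal dynamic-time-warping (DTW) distance, one of the standard measures typically evaluated in this setting, can be sketched as follows; this is a generic textbook implementation, not the authors' code:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D time series.
    DTW aligns the series elastically in time, so phenotypic responses with
    similar shape but different timing still score as similar."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

    The resulting pairwise distance matrix can then be fed to any standard clustering routine to group parasites by the shape of their response.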

  13. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths, with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs.
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular, and can be extended to other helminthic diseases, which together afflict a large part of humankind. PMID:22369037

  14. Systematic Evaluation of Non-Uniform Sampling Parameters in the Targeted Analysis of Urine Metabolites by 1H,1H 2D NMR Spectroscopy.

    PubMed

    Schlippenbach, Trixi von; Oefner, Peter J; Gronwald, Wolfram

    2018-03-09

    Non-uniform sampling (NUS) allows the accelerated acquisition of multidimensional NMR spectra. The aim of this contribution was the systematic evaluation of the impact of various quantitative NUS parameters on the accuracy and precision of 2D NMR measurements of urinary metabolites. Urine aliquots spiked with varying concentrations (15.6-500.0 µM) of tryptophan, tyrosine, glutamine, glutamic acid, lactic acid, and threonine, which can only be resolved fully by 2D NMR, were used to assess the influence of the sampling scheme, reconstruction algorithm, amount of omitted data points, and seed value on the quantitative performance of NUS in 1H,1H-TOCSY and 1H,1H-COSY45 NMR spectroscopy. Sinusoidal Poisson-gap sampling and a compressed sensing approach employing the iterative re-weighted least squares method for spectral reconstruction allowed a 50% reduction in measurement time while maintaining sufficient quantitative accuracy and precision for both types of homonuclear 2D NMR spectroscopy. Together with other advances in instrument design, such as state-of-the-art cryogenic probes, use of 2D NMR spectroscopy in large biomedical cohort studies seems feasible.
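    The sinusoidal Poisson-gap scheme mentioned above (after Hyberts and Wagner) draws the gaps between sampled increments from a Poisson distribution whose rate follows a sine ramp, so early, signal-rich increments are sampled densely. A rough sketch; the simple retry-and-shrink rate adjustment is an assumption of this sketch, not the published implementation:

```python
import numpy as np

def poisson_gap_schedule(total, keep, seed=0):
    """Pick `keep` of `total` indirect-dimension increments.

    Gaps follow a Poisson distribution whose mean grows along a
    quarter-period sine ramp (dense sampling early, sparse late).
    The rate is shrunk and the draw retried until exactly `keep`
    points fit -- a simple heuristic, not the published algorithm."""
    lam = float(total) / keep - 1.0   # initial mean gap
    for attempt in range(500):
        rng = np.random.default_rng(seed + attempt)
        points, i = [], 0
        for k in range(keep):
            if i >= total:
                break
            points.append(i)
            i += 1 + rng.poisson(lam * np.sin((k + 0.5) / keep * np.pi / 2))
        if len(points) == keep:
            return np.array(points)
        lam *= 0.95
    raise RuntimeError("no valid schedule found")

sched = poisson_gap_schedule(128, 64)   # 50% sampling density, as in the study
```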

  15. Global Analysis of River Planform Change using the Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Bryk, A.; Dietrich, W. E.; Gorelick, N.; Sargent, R.; Braudrick, C. A.

    2014-12-01

    Geomorphologists have historically tracked river dynamics using a combination of maps, aerial photographs, and the stratigraphic record. Although stratigraphic records can extend into deep time, maps and aerial photographs often confine our record of change to sparse measurements over the last ~80 years, and in some cases much less. For the first time, Google's Earth Engine (GEE) cloud-based platform gives researchers the means to quantitatively analyze the pattern and pace of river channel change over the last 30 years with high temporal resolution across the entire planet. The GEE provides an application programming interface (API) that enables quantitative analysis of various data sets, including the entire Landsat L1T archive. This allows change detection for channels wider than about 150 m over 30 years of successive, georeferenced imagery. Qualitatively, it becomes immediately evident that the pace of channel morphodynamics for similar planforms varies by orders of magnitude across the planet and downstream along individual rivers. To quantify these rates of change and to explore their controls, we have developed methods for differentiating channels from floodplain along large alluvial rivers. We introduce a new metric of morphodynamics: the ratio of eroded area to channel area per unit time, referred to as "M". We also keep track of depositional areas resulting from channel shifting. To date our quantitative analysis has focused on rivers in the Andean foreland. Our analysis shows that the channel bank erosion rate, M, varies by orders of magnitude for these rivers, from 0 to ~0.25 yr⁻¹, yet these rivers have essentially identical curvature and sinuosity and are visually indistinguishable. By tracking both bank paths in time, we find that, for some meandering rivers, a significant fraction of new floodplain is produced through outer-bank accretion rather than point bar deposition.
This process is perhaps more important in generating floodplain stratigraphy than previously recognized. These initial findings indicate a new set of quantitative observations will emerge to further test and advance morphodynamic theory. The Google Earth Engine offers the opportunity to explore river morphodynamics on an unprecedented scale and provides a powerful tool for addressing fundamental questions in river morphodynamics.
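    The "M" metric defined above reduces to a simple ratio once channel masks have been extracted from two images. A sketch, with the mask-extraction step itself (the hard part) assumed done upstream:

```python
import numpy as np

def erosion_metric(channel_t0, channel_t1, dt_years):
    """M = (area eroded into the floodplain) / (channel area) per unit time,
    following the definition of "M" in the abstract. Inputs are boolean
    channel masks at two times; pixel area cancels in the ratio."""
    c0 = np.asarray(channel_t0, bool)
    c1 = np.asarray(channel_t1, bool)
    eroded = (~c0 & c1).sum()        # floodplain at t0 that is channel at t1
    return eroded / c0.sum() / dt_years

# Toy example: a 2-row-wide channel shifting down one row over 2 years
c0 = np.zeros((4, 10), bool); c0[1:3, :] = True
c1 = np.zeros((4, 10), bool); c1[2:4, :] = True
M = erosion_metric(c0, c1, dt_years=2.0)
```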

  16. Quantitative fluorescence loss in photobleaching for analysis of protein transport and aggregation

    PubMed Central

    2012-01-01

    Background Fluorescence loss in photobleaching (FLIP) is a widely used imaging technique, which provides information about protein dynamics in various cellular regions. In FLIP, a small cellular region is repeatedly illuminated by an intense laser pulse, while images are taken with reduced laser power with a time lag between the bleaches. Despite its popularity, tools are lacking for quantitative analysis of FLIP experiments. Typically, the user defines regions of interest (ROIs) for further analysis, which is subjective and does not allow for comparing different cells and experimental settings. Results We present two complementary methods to detect and quantify protein transport and aggregation in living cells from FLIP image series. In the first approach, a stretched exponential (StrExp) function is fitted to the fluorescence loss (FL) inside and outside the bleached region. We show by reaction-diffusion simulations that the StrExp function can describe both binding/barrier-limited and diffusion-limited FL kinetics. By pixel-wise regression of that function to FL kinetics of enhanced green fluorescent protein (eGFP), we determined in a user-unbiased manner from which cellular regions eGFP can be replenished in the bleached area. Spatial variation in the parameters calculated from the StrExp function allows for detecting diffusion barriers for eGFP in the nucleus and cytoplasm of living cells. Polyglutamine (polyQ) disease proteins like mutant huntingtin (mtHtt) can form large aggregates called inclusion bodies (IBs). The second method combines single particle tracking with multi-compartment modelling of FL kinetics in moving IBs to determine exchange rates of eGFP-tagged mtHtt protein (eGFP-mtHtt) between aggregates and the cytoplasm. This method is self-calibrating since it relates the FL inside and outside the bleached regions. It therefore makes it possible to compare release kinetics of eGFP-mtHtt between different cells and experiments.
Conclusions We present two complementary methods for quantitative analysis of FLIP experiments in living cells. They provide spatial maps of exchange dynamics and absolute binding parameters of fluorescent molecules to moving intracellular entities, respectively. Our methods should be of great value for quantitative studies of intracellular transport. PMID:23148417
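    For a single decay curve, fitting the stretched exponential I(t) = I0·exp(-(t/τ)^h) can be reduced to a linear fit after a double-log transform. A numpy-only sketch on synthetic noiseless data; the paper performs this pixel-wise with proper nonlinear regression, so this only illustrates the functional form:

```python
import numpy as np

# Synthetic fluorescence-loss curve: I(t) = I0 * exp(-(t/tau)^h)
I0, tau, h = 1.0, 5.0, 0.7
t = np.linspace(0.5, 30.0, 60)
I = I0 * np.exp(-(t / tau) ** h)

# Double-log transform makes the fit linear:
#   ln(-ln(I/I0)) = h*ln(t) - h*ln(tau)
y = np.log(-np.log(I / I0))
slope, intercept = np.polyfit(np.log(t), y, 1)
h_fit = slope                          # stretching exponent
tau_fit = np.exp(-intercept / slope)   # characteristic time
```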

  17. Single Fluorescence Channel-based Multiplex Detection of Avian Influenza Virus by Quantitative PCR with Intercalating Dye

    PubMed Central

    Ahberg, Christian D.; Manz, Andreas; Neuzil, Pavel

    2015-01-01

    Since its invention in 1985, the polymerase chain reaction (PCR) has become a well-established method for amplification and detection of segments of double-stranded DNA. Incorporation of a fluorogenic probe or DNA intercalating dyes (such as SYBR Green) into the PCR mixture allowed real-time reaction monitoring and extraction of quantitative information (qPCR). Probes with different excitation spectra enable multiplex qPCR of several DNA segments using multi-channel optical detection systems. Here we show multiplex qPCR using an economical EvaGreen-based system with single optical channel detection. Previously reported non-quantitative multiplex real-time PCR techniques based on intercalating dyes rely on melting curve analysis (MCA) performed once the PCR is completed. The technique presented in this paper is both qualitative and quantitative, as it provides information about the presence of multiple DNA strands as well as the number of starting copies in the tested sample. Besides providing an important internal control, multiplex qPCR also allows detecting concentrations of more than one DNA strand within the same sample. Detection of the avian influenza virus H7N9 by PCR is a well-established method. Multiplex qPCR greatly enhances its specificity, as it is capable of distinguishing both the haemagglutinin (HA) and neuraminidase (NA) genes as well as their ratio. PMID:26088868

  18. Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.

    PubMed

    Hamlet, Stephen M

    2010-01-01

    The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years, however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibody with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10² (using rtPCR) to 10⁴ (using ELISA) periodontopathogens in dental plaque samples.
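    Quantitation by rtPCR typically works off a standard curve relating threshold cycle (Ct) to the log of starting copies. A minimal sketch; the dilution series and Ct values below are invented to be perfectly log-linear, not measured data:

```python
import numpy as np

# Standard curve: Ct is linear in log10(starting copies). These values
# have slope -3.3 Ct per decade, i.e. ~100% amplification efficiency.
copies_std = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
ct_std = np.array([30.1, 26.8, 23.5, 20.2, 16.9])

slope, intercept = np.polyfit(np.log10(copies_std), ct_std, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0    # ~1.0 means doubling per cycle

def quantify(ct):
    """Back-calculate starting copies from an observed Ct."""
    return 10.0 ** ((ct - intercept) / slope)
```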

  19. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.

    PubMed

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors, such as signal-to-noise ratio and the number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), as validated in silico and in vivo. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
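    D-optimal selection picks the time points that maximize det(SᵀS), where S is the model's sensitivity matrix. A greedy sketch against a bi-exponential decay; all model parameters below are illustrative, and greedy selection is a common heuristic rather than necessarily the paper's search strategy:

```python
import numpy as np

# Bi-exponential FLIM-FRET decay, I(t) = a*exp(-t/tq) + (1-a)*exp(-t/tu),
# with theta = (a, tq, tu): quenched fraction and the two lifetimes (ns).
# All numbers here are illustrative, not the paper's values.
a, tq, tu = 0.4, 0.5, 2.5
t = np.linspace(0.1, 10.0, 90)          # the "complete" 90-point gate set

# Sensitivity matrix S: one column per parameter, dI/dtheta_j at each gate.
S = np.column_stack([
    np.exp(-t / tq) - np.exp(-t / tu),          # dI/da
    a * t / tq**2 * np.exp(-t / tq),            # dI/dtq
    (1 - a) * t / tu**2 * np.exp(-t / tu),      # dI/dtu
])

def greedy_d_optimal(S, k):
    """Greedily add the row that maximizes det(S_sel^T S_sel)."""
    chosen = []
    for _ in range(k):
        best_i, best_det = None, -1.0
        for i in range(len(S)):
            if i in chosen:
                continue
            M = S[chosen + [i]]
            d = np.linalg.det(M.T @ M)
            if d > best_det:
                best_i, best_det = i, d
        chosen.append(best_i)
    return sorted(chosen)

gates = greedy_d_optimal(S, 10)          # keep 10 of 90, as in the abstract
```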

  20. An assay for lateral line regeneration in adult zebrafish.

    PubMed

    Pisano, Gina C; Mason, Samantha M; Dhliwayo, Nyembezi; Intine, Robert V; Sarras, Michael P

    2014-04-08

    Due to the clinical importance of hearing and balance disorders in man, model organisms such as the zebrafish have been used to study lateral line development and regeneration. The zebrafish is particularly attractive for such studies because of its rapid development time and its high regenerative capacity. To date, zebrafish studies of lateral line regeneration have mainly utilized fish of the embryonic and larval stages because of the lower number of neuromasts at these stages. This has made quantitative analysis of lateral line regeneration and/or development easier in the earlier developmental stages. Because many zebrafish models of neurological and non-neurological diseases are studied in the adult fish and not in the embryo/larvae, we focused on developing a quantitative lateral line regeneration assay in adult zebrafish so that an assay was available that could be applied to current adult zebrafish disease models. Building on previous studies by Van Trump et al. that described procedures for ablation of hair cells in adult Mexican blind cave fish and zebrafish (Danio rerio), our assay was designed to allow quantitative comparison between control and experimental groups. This was accomplished by developing a regenerative neuromast standard curve based on the percent of neuromast reappearance over a 24 hr time period following gentamicin-induced necrosis of hair cells in a defined region of the lateral line. The assay was also designed to allow extension of the analysis to the individual hair cell level when a higher level of resolution is required.
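    The quantitative core of the assay, percent neuromast reappearance relative to a pre-ablation baseline over the 24 hr window, is a simple normalization. A sketch with made-up counts:

```python
def percent_reappearance(counts_over_time, baseline_count):
    """Percent of neuromasts reappeared at each post-ablation time point,
    relative to the pre-ablation (baseline) count in the scored region.
    (Counts and time points here are illustrative, not from the paper.)"""
    return [100.0 * c / baseline_count for c in counts_over_time]

# Hypothetical counts at successive time points after gentamicin ablation
curve = percent_reappearance([0, 3, 7, 12, 18, 20], baseline_count=20)
```

    A treatment group's value at a given time point can then be read against this control standard curve.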

  1. Multiple Time-of-Flight/Time-of-Flight Events in a Single Laser Shot for Improved Matrix-Assisted Laser Desorption/Ionization Tandem Mass Spectrometry Quantification.

    PubMed

    Prentice, Boone M; Chumbley, Chad W; Hachey, Brian C; Norris, Jeremy L; Caprioli, Richard M

    2016-10-04

    Quantitative matrix-assisted laser desorption/ionization time-of-flight (MALDI TOF) approaches have historically suffered from poor accuracy and precision mainly due to the nonuniform distribution of matrix and analyte across the target surface, matrix interferences, and ionization suppression. Tandem mass spectrometry (MS/MS) can be used to ensure chemical specificity as well as improve signal-to-noise ratios by eliminating interferences from chemical noise, alleviating some concerns about dynamic range. However, conventional MALDI TOF/TOF modalities typically only scan for a single MS/MS event per laser shot, and multiplex assays require sequential analyses. We describe here new methodology that allows for multiple TOF/TOF fragmentation events to be performed in a single laser shot. This technology allows the reference of analyte intensity to that of the internal standard in each laser shot, even when the analyte and internal standard are quite disparate in m/z, thereby improving quantification while maintaining chemical specificity and duty cycle. In the quantitative analysis of the drug enalapril in pooled human plasma with ramipril as an internal standard, a greater than 4-fold improvement in relative standard deviation (<10%) was observed as well as improved coefficients of determination (R²) and accuracy (>85% quality controls). Using this approach we have also performed simultaneous quantitative analysis of three drugs (promethazine, enalapril, and verapamil) using deuterated analogues of these drugs as internal standards.
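    The benefit of per-shot internal-standard referencing can be sketched numerically: the analyte-to-internal-standard intensity ratio cancels shot-to-shot ionization variability that dominates the raw signal. All values below are synthetic and invented for illustration.

    ```python
    import numpy as np

    # Simulate 100 laser shots with a shared per-shot ionization efficiency
    # (the dominant MALDI variability) plus small independent detection noise.
    rng = np.random.default_rng(0)
    shot_factor = rng.uniform(0.5, 1.5, 100)            # per-shot efficiency
    analyte = 50.0 * shot_factor * rng.normal(1.0, 0.02, 100)
    istd    = 100.0 * shot_factor * rng.normal(1.0, 0.02, 100)

    # Relative standard deviation of the raw analyte signal vs. the ratio:
    # referencing to the internal standard cancels the shared shot factor.
    raw_rsd   = np.std(analyte) / np.mean(analyte) * 100
    ratio     = analyte / istd
    ratio_rsd = np.std(ratio) / np.mean(ratio) * 100
    ```
    
    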

  2. Integration of Metabolic and Quorum Sensing Signals Governing the Decision to Cooperate in a Bacterial Social Trait

    PubMed Central

    Boyle, Kerry E.; Monaco, Hilary; van Ditmarsch, Dave; Deforet, Maxime; Xavier, Joao B.

    2015-01-01

    Many unicellular organisms live in multicellular communities that rely on cooperation between cells. However, cooperative traits are vulnerable to exploitation by non-cooperators (cheaters). We expand our understanding of the molecular mechanisms that allow multicellular systems to remain robust in the face of cheating by dissecting the dynamic regulation of cooperative rhamnolipids required for swarming in Pseudomonas aeruginosa. We combine mathematical modeling and experiments to quantitatively characterize the integration of metabolic and population density signals (quorum sensing) governing expression of the rhamnolipid synthesis operon rhlAB. The combined computational/experimental analysis reveals that when nutrients are abundant, rhlAB promoter activity increases gradually in a density-dependent way. When growth slows down due to nutrient limitation, rhlAB promoter activity can stop abruptly, decrease gradually or even increase depending on whether the growth-limiting nutrient is the carbon source, nitrogen source or iron. Starvation by specific nutrients drives growth on intracellular nutrient pools as well as the qualitative rhlAB promoter response, which itself is modulated by quorum sensing. Our quantitative analysis suggests a supply-driven activation that integrates metabolic prudence with quorum sensing in a non-digital manner and allows P. aeruginosa cells to invest in cooperation only when the population size is large enough (quorum sensing) and individual cells have enough metabolic resources to do so (metabolic prudence). Thus, the quantitative description of rhlAB regulatory dynamics brings a greater understanding of the regulation required to make swarming cooperation stable. PMID:26102206

  3. Quantitative evaluation of contrast-enhanced ultrasound after intravenous administration of a microbubble contrast agent for differentiation of benign and malignant thyroid nodules: assessment of diagnostic accuracy.

    PubMed

    Nemec, Ursula; Nemec, Stefan F; Novotny, Clemens; Weber, Michael; Czerny, Christian; Krestan, Christian R

    2012-06-01

    To investigate the diagnostic accuracy, through quantitative analysis, of contrast-enhanced ultrasound (CEUS), using a microbubble contrast agent, in the differentiation of thyroid nodules. This prospective study enrolled 46 patients with solitary, scintigraphically non-functional thyroid nodules. These patients were scheduled for surgery and underwent preoperative CEUS with pulse-inversion harmonic imaging after intravenous microbubble contrast medium administration. Using histology as a standard of reference, time-intensity curves of benign and malignant nodules were compared by means of peak enhancement and wash-out enhancement relative to the baseline intensity using a mixed model ANOVA. ROC analysis was performed to assess the diagnostic accuracy in the differentiation of benign and malignant nodules on CEUS. The complete CEUS data of 42 patients (31/42 [73.8%] benign and 11/42 [26.2%] malignant nodules) revealed a significant difference (P < 0.001) in enhancement between benign and malignant nodules. Furthermore, based on ROC analysis, CEUS demonstrated sensitivity of 76.9%, specificity of 84.8% and accuracy of 82.6%. Quantitative analysis of CEUS using a microbubble contrast agent allows the differentiation of benign and malignant thyroid nodules and may potentially serve, in addition to grey-scale and Doppler ultrasound, as an adjunctive tool in the assessment of patients with thyroid nodules. • Contrast-enhanced ultrasound (CEUS) helps differentiate between benign and malignant thyroid nodules. • Quantitative CEUS analysis yields sensitivity of 76.9% and specificity of 84.8%. • CEUS may be a potentially useful adjunct in assessing thyroid nodules.
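    The diagnostic-accuracy metrics reported above follow directly from a confusion matrix at a chosen threshold. The sketch below uses a small invented data set, not the study's measurements; labels come from histology (1 = malignant) and scores stand in for a CEUS enhancement parameter.

    ```python
    import numpy as np

    # Synthetic example: histology labels and a continuous CEUS-derived score.
    labels = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
    scores = np.array([0.9, 0.8, 0.75, 0.4, 0.35, 0.3, 0.55, 0.2, 0.1, 0.15])

    # Classify as malignant above a threshold, then tally the confusion matrix.
    threshold = 0.5
    pred = scores > threshold
    tp = np.sum(pred & (labels == 1)); fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0)); fp = np.sum(pred & (labels == 0))

    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / labels.size
    ```

    Sweeping `threshold` over all observed scores and plotting sensitivity against 1 − specificity yields the ROC curve used in the study.
    
    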

  4. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis which, for example, gives new insight into signalling of cellular water permeability and cell morphology changes due to toxins and infections. Also the analysis of dissected tissues quantitative DHM phase contrast prospects application fields by stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of the sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and common staining agents. As in DHM the reconstruction is performed numerically, multi-focus imaging is achieved from a single digital hologram. Thus, we evaluated the automated refocussing feature of DHM for application on different types of dissected tissues and revealed that on moderately stained samples highly reproducible holographic autofocussing can be achieved. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter that is related of different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the usage of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.

  5. Simulation of realistic abnormal SPECT brain perfusion images: application in semi-quantitative analysis

    NASA Astrophysics Data System (ADS)

    Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.

    2005-11-01

    Simulation is useful in the validation of functional image analysis methods, particularly when considering the number of analysis techniques currently available lacking thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results making it problematic to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF), and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows for the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects that are to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to system resolution over the slightly larger kernel used routinely. Significant correlation was found between effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median over the region mean. The method and dataset provide an efficient methodology for use in the comparison and cross validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
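    The core simulation step, convolving a defined abnormality with the system PSF and removing it from a normal image, can be sketched in one dimension. The paper works in 3D with a measured PSF and a stereotactic atlas; here a Gaussian PSF and invented intensity profiles stand in.

    ```python
    import numpy as np

    # Flat "normal" perfusion profile and a defined 30% deficit region
    # (all values assumed for illustration).
    x = np.arange(128, dtype=float)
    normal = np.full(128, 100.0)
    abnormality = np.where((x > 60) & (x < 70), 30.0, 0.0)

    # Gaussian stand-in for the measured point spread function.
    sigma = 3.0
    k = np.arange(-10, 11, dtype=float)
    psf = np.exp(-k**2 / (2 * sigma**2))
    psf /= psf.sum()

    # Convolve the abnormality with the PSF, then remove it from the normal
    # profile, giving a realistic resolution-matched simulated deficit.
    blurred_defect = np.convolve(abnormality, psf, mode="same")
    simulated = normal - blurred_defect
    ```
    
    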

  6. Application of factor analysis of infrared spectra for quantitative determination of beta-tricalcium phosphate in calcium hydroxylapatite.

    PubMed

    Arsenyev, P A; Trezvov, V V; Saratovskaya, N V

    1997-01-01

    This work presents a method that allows the phase composition of calcium hydroxylapatite to be determined from its infrared spectrum. The method applies factor analysis to the spectral data of a calibration set of samples to determine the minimal number of factors required to reproduce the spectra within experimental error. Multiple linear regression is then applied to establish a correlation between the factor scores of the calibration standards and their properties. The resulting regression equations can be used to predict the property value of an unknown sample. A regression model was built for determination of the beta-tricalcium phosphate content in hydroxylapatite, and the quality of the model was estimated statistically. Applying factor analysis to the spectral data increases the accuracy of beta-tricalcium phosphate determination and extends the determination range toward lower concentrations, while reproducibility of the results is retained.
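    The factor-analysis-plus-regression pipeline can be sketched as principal component regression: decompose the calibration spectra into a few factors (here via SVD), regress the factor scores against the known beta-TCP content, then predict an unknown. The band shapes, contents, and noise level below are invented, not the paper's data.

    ```python
    import numpy as np

    # Synthetic two-component IR spectra: hydroxylapatite and beta-TCP bands.
    rng = np.random.default_rng(1)
    wn = np.linspace(400, 1200, 200)
    band_hap = np.exp(-(wn - 600) ** 2 / 500.0)
    band_tcp = np.exp(-(wn - 950) ** 2 / 500.0)

    tcp_content = np.array([0., 5., 10., 15., 20., 25.])   # wt% standards
    spectra = (np.outer(100 - tcp_content, band_hap)
               + np.outer(tcp_content, band_tcp)
               + rng.normal(0, 0.05, (6, 200)))            # measurement noise

    # Factor scores: project mean-centered spectra onto leading singular vectors.
    mean_spec = spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(spectra - mean_spec, full_matrices=False)
    n_factors = 2
    scores = (spectra - mean_spec) @ Vt[:n_factors].T

    # Multiple linear regression of content on factor scores (plus intercept).
    X = np.column_stack([scores, np.ones(len(tcp_content))])
    coef, *_ = np.linalg.lstsq(X, tcp_content, rcond=None)

    def predict(spectrum):
        sc = (spectrum - mean_spec) @ Vt[:n_factors].T
        return float(np.append(sc, 1.0) @ coef)

    unknown = 88 * band_hap + 12 * band_tcp    # sample with 12 wt% beta-TCP
    ```
    
    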

  7. High-throughput SISCAPA quantitation of peptides from human plasma digests by ultrafast, liquid chromatography-free mass spectrometry.

    PubMed

    Razavi, Morteza; Frick, Lauren E; LaMarr, William A; Pope, Matthew E; Miller, Christine A; Anderson, N Leigh; Pearson, Terry W

    2012-12-07

    We investigated the utility of an SPE-MS/MS platform in combination with a modified SISCAPA workflow for chromatography-free MRM analysis of proteotypic peptides in digested human plasma. This combination of SISCAPA and SPE-MS/MS technology allows sensitive, MRM-based quantification of peptides from plasma digests with a sample cycle time of ∼7 s, a 300-fold improvement over typical MRM analyses with analysis times of 30-40 min that use liquid chromatography upstream of MS. The optimized system includes capture and enrichment to near purity of target proteotypic peptides using rigorously selected, high affinity, antipeptide monoclonal antibodies and reduction of background peptides using a novel treatment of magnetic bead immunoadsorbents. Using this method, we have successfully quantitated LPS-binding protein and mesothelin (concentrations of ∼5000 ng/mL and ∼10 ng/mL, respectively) in human plasma. The method eliminates the need for upstream liquid-chromatography and can be multiplexed, thus facilitating quantitative analysis of proteins, including biomarkers, in large sample sets. The method is ideal for high-throughput biomarker validation after affinity enrichment and has the potential for applications in clinical laboratories.

  8. Proflavine Hemisulfate as a Fluorescent Contrast Agent for Point-of-Care Cytology

    PubMed Central

    Prieto, Sandra P.; Powless, Amy J.; Boice, Jackson W.; Sharma, Shree G.; Muldoon, Timothy J.

    2015-01-01

    Proflavine hemisulfate, an acridine-derived fluorescent dye, can be used as a rapid stain for cytologic examination of biological specimens. Proflavine fluorescently stains cell nuclei and cytoplasmic structures, owing to its small amphipathic structure and ability to intercalate DNA. In this manuscript, we demonstrated the use of proflavine as a rapid cytologic dye on a number of specimens, including normal exfoliated oral squamous cells, cultured human oral squamous carcinoma cells, and leukocytes derived from whole blood specimens using a custom-built, portable, LED-illuminated fluorescence microscope. No incubation time was needed after suspending cells in 0.01% (w/v) proflavine diluted in saline. Images of proflavine stained oral cells had clearly visible nuclei as well as granular cytoplasm, while stained leukocytes exhibited bright nuclei, and highlighted the multilobar nature of nuclei in neutrophils. We also demonstrated the utility of quantitative analysis of digital images of proflavine stained cells, which can be used to detect significant morphological differences between different cell types. Proflavine stained oral cells have well-defined nuclei and cell membranes which allowed for quantitative analysis of nuclear to cytoplasmic ratios, as well as image texture analysis to extract quantitative image features. PMID:25962131

  9. Proflavine Hemisulfate as a Fluorescent Contrast Agent for Point-of-Care Cytology.

    PubMed

    Prieto, Sandra P; Powless, Amy J; Boice, Jackson W; Sharma, Shree G; Muldoon, Timothy J

    2015-01-01

    Proflavine hemisulfate, an acridine-derived fluorescent dye, can be used as a rapid stain for cytologic examination of biological specimens. Proflavine fluorescently stains cell nuclei and cytoplasmic structures, owing to its small amphipathic structure and ability to intercalate DNA. In this manuscript, we demonstrated the use of proflavine as a rapid cytologic dye on a number of specimens, including normal exfoliated oral squamous cells, cultured human oral squamous carcinoma cells, and leukocytes derived from whole blood specimens using a custom-built, portable, LED-illuminated fluorescence microscope. No incubation time was needed after suspending cells in 0.01% (w/v) proflavine diluted in saline. Images of proflavine stained oral cells had clearly visible nuclei as well as granular cytoplasm, while stained leukocytes exhibited bright nuclei, and highlighted the multilobar nature of nuclei in neutrophils. We also demonstrated the utility of quantitative analysis of digital images of proflavine stained cells, which can be used to detect significant morphological differences between different cell types. Proflavine stained oral cells have well-defined nuclei and cell membranes which allowed for quantitative analysis of nuclear to cytoplasmic ratios, as well as image texture analysis to extract quantitative image features.
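    The nuclear-to-cytoplasmic ratio measurement reduces to area counting once nucleus and cell masks are segmented. The sketch below uses synthetic circular masks; a real pipeline would segment the proflavine fluorescence image first.

    ```python
    import numpy as np

    # Toy cell: concentric circles stand in for segmented cell and nucleus masks.
    h = w = 64
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = (yy - 32) ** 2 + (xx - 32) ** 2
    cell_mask = r2 <= 28 ** 2            # whole cell
    nuc_mask = r2 <= 10 ** 2             # nucleus
    cyto_mask = cell_mask & ~nuc_mask    # cytoplasm = cell minus nucleus

    # N/C ratio: nuclear area over cytoplasmic area (pixel counts).
    nc_ratio = nuc_mask.sum() / cyto_mask.sum()
    ```
    
    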

  10. Authentication of pineapple (Ananas comosus [L.] Merr.) fruit maturity stages by quantitative analysis of γ- and δ-lactones using headspace solid-phase microextraction and chirospecific gas chromatography-selected ion monitoring mass spectrometry (HS-SPME-GC-SIM-MS).

    PubMed

    Steingass, Christof B; Langen, Johannes; Carle, Reinhold; Schmarr, Hans-Georg

    2015-02-01

    Headspace solid phase microextraction and chirospecific gas chromatography-mass spectrometry in selected ion monitoring mode (HS-SPME-GC-SIM-MS) allowed quantitative determination of δ-lactones (δ-C8, δ-C10) and γ-lactones (γ-C6, γ-C8, γ-C10). A stable isotope dilution assay (SIDA) with d7-γ-decalactone as internal standard was used for quantitative analysis of pineapple lactones that was performed at three progressing post-harvest stages of fully ripe air-freighted and green-ripe sea-freighted fruits, covering the relevant shelf-life of the fruits. Fresh pineapples harvested at full maturity were characterised by γ-C6 of high enantiomeric purity remaining stable during the whole post-harvest period. In contrast, the enantiomeric purity of γ-C6 significantly decreased during post-harvest storage of sea-freighted pineapples. The biogenetical background and the potential of chirospecific analysis of lactones for authentication and quality evaluation of fresh pineapple fruits are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Quantitative Analysis of the Trends Exhibited by the Three Interdisciplinary Biological Sciences: Biophysics, Bioinformatics, and Systems Biology.

    PubMed

    Kang, Jonghoon; Park, Seyeon; Venkat, Aarya; Gopinath, Adarsh

    2015-12-01

    New interdisciplinary biological sciences like bioinformatics, biophysics, and systems biology have become increasingly relevant in modern science. Many papers have suggested the importance of adding these subjects, particularly bioinformatics, to an undergraduate curriculum; however, most of their assertions have relied on qualitative arguments. In this paper, we will show our metadata analysis of a scientific literature database (PubMed) that quantitatively describes the importance of the subjects of bioinformatics, systems biology, and biophysics as compared with a well-established interdisciplinary subject, biochemistry. Specifically, we found that the development of each subject assessed by its publication volume was well described by a set of simple nonlinear equations, allowing us to characterize them quantitatively. Bioinformatics, which had the highest ratio of publications produced, was predicted to grow between 77% and 93% by 2025 according to the model. Due to the large number of publications produced in bioinformatics, which nearly matches the number published in biochemistry, it can be inferred that bioinformatics is almost equal in significance to biochemistry. Based on our analysis, we suggest that bioinformatics be added to the standard biology undergraduate curriculum. Adding this course to an undergraduate curriculum will better prepare students for future research in biology.
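    Fitting a simple nonlinear growth model to yearly publication counts, and projecting it forward, can be sketched with a logistic curve and a coarse grid search. The counts and parameters below are synthetic, not the paper's PubMed metadata, and the logistic form is one plausible choice of "simple nonlinear equation".

    ```python
    import numpy as np

    # Synthetic yearly publication counts generated from a known logistic curve.
    years = np.arange(2000, 2016, dtype=float)
    true_K, true_r, true_t0 = 5000.0, 0.4, 2010.0
    counts = true_K / (1 + np.exp(-true_r * (years - true_t0)))

    def sse(K, r, t0):
        # Sum of squared errors between the model and the observed counts.
        model = K / (1 + np.exp(-r * (years - t0)))
        return np.sum((model - counts) ** 2)

    # Coarse grid search over (K, r, t0); a real fit would refine with
    # nonlinear least squares.
    grid = [(K, r, t0)
            for K in np.linspace(3000, 7000, 21)
            for r in np.linspace(0.2, 0.6, 21)
            for t0 in np.linspace(2006, 2014, 17)]
    K_hat, r_hat, t0_hat = min(grid, key=lambda p: sse(*p))

    # Project the fitted curve forward, as the paper does for 2025.
    projection_2025 = K_hat / (1 + np.exp(-r_hat * (2025 - t0_hat)))
    ```
    
    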

  12. Differential reliability : probabilistic engineering applied to wood members in bending-tension

    Treesearch

    Stanley K. Suddarth; Frank E. Woeste; William L. Galligan

    1978-01-01

    Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...

  13. How can my research paper be useful for future meta-analyses on forest restoration practices?

    Treesearch

    Enrique Andivia; Pedro Villar‑Salvador; Juan A. Oliet; Jaime Puertolas; R. Kasten Dumroese

    2018-01-01

    Statistical meta-analysis is a powerful and useful tool to quantitatively synthesize the information conveyed in published studies on a particular topic. It allows identifying and quantifying overall patterns and exploring causes of variation. The inclusion of published works in meta-analyses requires, however, a minimum quality standard of the reported data and...

  14. Cost Effective Paper-Based Colorimetric Microfluidic Devices and Mobile Phone Camera Readers for the Classroom

    ERIC Educational Resources Information Center

    Koesdjojo, Myra T.; Pengpumkiat, Sumate; Wu, Yuanyuan; Boonloed, Anukul; Huynh, Daniel; Remcho, Thomas P.; Remcho, Vincent T.

    2015-01-01

    We have developed a simple and direct method to fabricate paper-based microfluidic devices that can be used for a wide range of colorimetric assay applications. With these devices, assays can be performed within minutes to allow for quantitative colorimetric analysis by use of a widely accessible iPhone camera and an RGB color reader application…

  15. Quantitative and Qualitative Analysis of Bacteria in Er(III) Solution by Thin-Film Magnetopheresis

    PubMed Central

    Zborowski, Maciej; Tada, Yoko; Malchesky, Paul S.; Hall, Geraldine S.

    1993-01-01

    Magnetic deposition, quantitation, and identification of bacteria reacting with the paramagnetic trivalent lanthanide ion, Er3+, was evaluated. The magnetic deposition method was dubbed thin-film magnetopheresis. The optimization of the magnetic deposition protocol was accomplished with Escherichia coli as a model organism in 150 mM NaCl and 5 mM ErCl3 solution. Three gram-positive bacteria, Staphylococcus epidermidis, Staphylococcus saprophyticus, and Enterococcus faecalis, and four gram-negative bacteria, E. coli, Pseudomonas aeruginosa, Proteus mirabilis, and Klebsiella pneumoniae, were subsequently investigated. Quantitative analysis consisted of the microscopic cell count and a scattered-light scanning of the magnetically deposited material aided by the computer data acquisition system. Qualitative analysis consisted of Gram stain differentiation and fluorescein isothiocyanate staining in combination with selected antisera against specific types of bacteria on the solid substrate. The magnetic deposition protocol allowed quantitative detection of E. coli down to the concentration of 105 CFU ml-1, significant in clinical diagnosis applications such as urinary tract infections. Er3+ did not interfere with the typical appearance of the Gram-stained bacteria nor with the antigen recognition by the antibody in the immunohistological evaluations. Indirect antiserum-fluorescein isothiocyanate labelling correctly revealed the presence of E. faecalis and P. aeruginosa in the magnetically deposited material obtained from the mixture of these two bacterial species. On average, the reaction of gram-positive organisms was significantly stronger to the magnetic field in the presence of Er3+ than the reaction of gram-negative organisms. Thin-film magnetopheresis offers promise as a rapid method for quantitative and qualitative analysis of bacteria in solutions such as urine or environmental water. PMID:16348916

  16. Comparison of longitudinal excursion of a nerve-phantom model using quantitative ultrasound imaging and motion analysis system methods: A convergent validity study.

    PubMed

    Paquette, Philippe; El Khamlichi, Youssef; Lamontagne, Martin; Higgins, Johanne; Gagnon, Dany H

    2017-08-01

    Quantitative ultrasound imaging is gaining popularity in research and clinical settings to measure the neuromechanical properties of the peripheral nerves such as their capability to glide in response to body segment movement. Increasing evidence suggests that impaired median nerve longitudinal excursion is associated with carpal tunnel syndrome. To date, psychometric properties of longitudinal nerve excursion measurements using quantitative ultrasound imaging have not been extensively investigated. This study investigates the convergent validity of the longitudinal nerve excursion by comparing measures obtained using quantitative ultrasound imaging with those determined with a motion analysis system. A 38-cm long rigid nerve-phantom model was used to assess the longitudinal excursion in a laboratory environment. The nerve-phantom model, immersed in a 20-cm deep container filled with a gelatin-based solution, was moved 20 times using a linear forward and backward motion. Three light-emitting diodes were used to record nerve-phantom excursion with a motion analysis system, while a 5-cm linear transducer allowed simultaneous recording via ultrasound imaging. Both measurement techniques yielded excellent association (r = 0.99) and agreement (mean absolute difference between methods = 0.85 mm; mean relative difference between methods = 7.48%). Small discrepancies were largely found when larger excursions (i.e. > 10 mm) were performed, revealing slight underestimation of the excursion by the ultrasound imaging analysis software. Quantitative ultrasound imaging is an accurate method to assess the longitudinal excursion of an in vitro nerve-phantom model and appears relevant for future research protocols investigating the neuromechanical properties of the peripheral nerves.
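    The convergent-validity metrics used here, Pearson correlation plus mean absolute and mean relative difference between paired measurements, are straightforward to compute. The paired excursion values below are invented for illustration, not the study's recordings.

    ```python
    import numpy as np

    # Invented paired measurements of the same trials (mm): motion capture
    # as reference, ultrasound imaging as the method under validation.
    motion_mm     = np.array([2.1, 4.0, 6.2, 8.1, 10.3, 12.0, 14.2, 16.1])
    ultrasound_mm = np.array([2.0, 3.8, 6.0, 7.7,  9.6, 11.2, 13.1, 14.9])

    # Association: Pearson correlation between the two methods.
    r = np.corrcoef(motion_mm, ultrasound_mm)[0, 1]

    # Agreement: mean absolute and mean relative difference between methods.
    mean_abs_diff = np.mean(np.abs(motion_mm - ultrasound_mm))
    mean_rel_diff = np.mean(np.abs(motion_mm - ultrasound_mm) / motion_mm) * 100
    ```
    
    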

  17. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    Antibiotic Lock technique maintains catheters' sterility in high-risk patients with long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. To ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115cm(-1) and 208-224nm), vancomycin (1222-1240cm(-1) and 276-280nm), and teicoplanin (1226-1230cm(-1) and 278-282nm). Gentamicin was quantified with FTIR only (1045-1169cm(-1) and 2715-2850cm(-1)) due to interferences in the UV domain from parabens, preservatives present in the commercial brand used to prepare locks. For all antibiotic locks, the method was linear (R(2)=0.996 to 0.999), accurate, repeatable (intraday RSD%: from 2.9 to 7.1% and inter-days RSD%: 2.9 to 5.1%) and precise. The FTIR/UV method correlated tightly with the reference methods (Pearson factor: 97.4 to 99.9%) and showed no significant difference in recovery determinations. We developed a new, simple, and reliable analysis technique for antibiotic quantitation in locks using an original association of FTIR and UV analysis, allowing rapid identification and quantification of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
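    The linearity and quantitation steps reported above rest on a standard calibration line: fit absorbance against concentration, check R², then back-calculate an unknown from its absorbance. The standards and absorbances below are invented illustration values, not the study's data.

    ```python
    import numpy as np

    # Invented calibration standards for one antibiotic (mg/mL) and their
    # measured absorbances at the selected band.
    conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
    absorbance = np.array([0.11, 0.20, 0.41, 0.59, 0.82])

    # Least-squares calibration line and its coefficient of determination.
    slope, intercept = np.polyfit(conc, absorbance, 1)
    fitted = slope * conc + intercept
    r2 = 1 - (np.sum((absorbance - fitted) ** 2)
              / np.sum((absorbance - absorbance.mean()) ** 2))

    def quantify(a):
        # Inverse prediction: concentration from a measured absorbance.
        return (a - intercept) / slope

    unknown_conc = quantify(0.50)
    ```
    
    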

  18. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE PAGES

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    2017-01-18

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here, we demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.
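    For background, the baseline the paper improves upon is the ideal planar-diffusion case, where the Randles-Sevcik equation makes the voltammetric peak current strictly linear in concentration. The sketch below evaluates that textbook relation with assumed, molten-salt-like inputs; it is not the authors' corrected method, which additionally accounts for uncompensated resistance and cylindrical diffusion.

    ```python
    import math

    def peak_current(n, A_cm2, D_cm2_s, scan_rate_V_s, conc_mol_cm3, T=773.0):
        """Randles-Sevcik peak current (A) for a reversible planar-diffusion
        process: i_p = 0.4463 n F A C sqrt(n F D v / (R T))."""
        R, F = 8.314, 96485.0
        return 0.4463 * n * F * A_cm2 * conc_mol_cm3 * math.sqrt(
            n * F * D_cm2_s * scan_rate_V_s / (R * T))

    # Assumed illustrative inputs (3-electron couple, 0.5 cm^2 electrode).
    i1 = peak_current(3, 0.5, 1e-5, 0.1, 1e-4)
    i2 = peak_current(3, 0.5, 1e-5, 0.1, 2e-4)   # doubled concentration
    ```

    Doubling the concentration doubles the predicted peak current; the paper's point is that this ideal linearity breaks down above roughly 2 wt% unless the additional effects are simulated.
    
    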

  19. The A-Like Faker Assay for Measuring Yeast Chromosome III Stability.

    PubMed

    Novoa, Carolina A; Ang, J Sidney; Stirling, Peter C

    2018-01-01

    The ability to rapidly assess chromosome instability (CIN) has enabled profiling of most yeast genes for potential effects on genome stability. The A-like faker (ALF) assay is one of several qualitative and quantitative marker loss assays that indirectly measure loss or conversion of genetic material using a counterselection step. The ALF assay relies on the ability to count spurious mating events that occur upon loss of the MATα locus of haploid Saccharomyces cerevisiae strains. Here, we describe the deployment of the ALF assay for both rapid and simple qualitative, and more in-depth quantitative analysis allowing determination of absolute ALF frequencies.

  20. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here, we demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.

  1. Falcon: A Temporal Visual Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.

    2016-09-05

    Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.

  2. Ultrafast Screening and Quantitation of Pesticides in Food and Environmental Matrices by Solid-Phase Microextraction-Transmission Mode (SPME-TM) and Direct Analysis in Real Time (DART).

    PubMed

    Gómez-Ríos, Germán Augusto; Gionfriddo, Emanuela; Poole, Justen; Pawliszyn, Janusz

    2017-07-05

    The direct interface of microextraction technologies to mass spectrometry (MS) has unquestionably revolutionized the speed and efficacy at which complex matrices are analyzed. Solid Phase Micro Extraction-Transmission Mode (SPME-TM) is a technology conceived as an effective synergy between sample preparation and ambient ionization. Succinctly, the device consists of a mesh coated with polymeric particles that extracts analytes of interest present in a given sample matrix. This coated mesh acts as a transmission-mode substrate for Direct Analysis in Real Time (DART), allowing for rapid and efficient thermal desorption/ionization of analytes previously concentrated on the coating, and dramatically lowering the limits of detection attained by sole DART analysis. In this study, we present SPME-TM as a novel tool for the ultrafast enrichment of pesticides present in food and environmental matrices and their quantitative determination by MS via DART ionization. Limits of quantitation in the subnanogram per milliliter range can be attained, while total analysis time does not exceed 2 min per sample. In addition to target information obtained via tandem MS, retrospective studies of the same sample via high-resolution mass spectrometry (HRMS) were accomplished by thermally desorbing a different segment of the microextraction device.

  3. Investigating the quality of mental models deployed by undergraduate engineering students in creating explanations: The case of thermally activated phenomena

    NASA Astrophysics Data System (ADS)

    Fazio, Claudio; Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2013-12-01

    This paper describes a method aimed at pointing out the quality of the mental models undergraduate engineering students deploy when asked to create explanations for phenomena or processes and/or use a given model in the same context. Student responses to a specially designed written questionnaire are quantitatively analyzed using researcher-generated categories of reasoning, based on the physics education research literature on student understanding of the relevant physics content. The use of statistical implicative analysis tools allows us to successfully identify clusters of students with respect to the similarity to the reasoning categories, defined as “practical or everyday,” “descriptive,” or “explicative.” Through the use of similarity and implication indexes our method also enables us to study the consistency in students’ deployment of mental models. A qualitative analysis of interviews conducted with students after they had completed the questionnaire is used to clarify some aspects which emerged from the quantitative analysis and validate the results obtained. Some implications of this joint use of quantitative and qualitative analysis for the design of a learning environment focused on the understanding of some aspects of the world at the level of causation and mechanisms of functioning are discussed.

  4. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified and treated in a reliable and reproducible way.
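
    A Monte Carlo sketch is one common way such uncertainty propagation is carried out: draw the uncertain inputs from assumed distributions, push each draw through the vulnerability model, and summarize the spread of the output. The toy model and distributions below are invented for illustration and are not taken from the review.

```python
import math
import random
import statistics

random.seed(0)  # reproducible draws

def vulnerability(intensity, resistance):
    # Toy degree-of-loss model: saturates toward 1 as intensity/resistance grows.
    return 1.0 - math.exp(-intensity / resistance)

# Uncertain inputs: hazard intensity ~ N(5, 1), element resistance ~ N(10, 2).
draws = [vulnerability(random.gauss(5.0, 1.0), random.gauss(10.0, 2.0))
         for _ in range(10_000)]

mean_v = statistics.mean(draws)   # central estimate of the degree of loss
spread = statistics.stdev(draws)  # propagated uncertainty
```

    The output distribution, rather than a single "degree of loss" number, is what supports explicit uncertainty statements in the risk analysis.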

  5. Automated quantitative gait analysis during overground locomotion in the rat: its application to spinal cord contusion and transection injuries.

    PubMed

    Hamers, F P; Lankhorst, A J; van Laar, T J; Veldhuis, W B; Gispen, W H

    2001-02-01

    Analysis of locomotion is an important tool in the study of peripheral and central nervous system damage. Most locomotor scoring systems in rodents are based either upon open field locomotion assessment (for example, the BBB score) or upon footprint analysis. The former yields a semiquantitative description of locomotion as a whole, whereas the latter generates quantitative data on several selected gait parameters. In this paper, we describe the use of a newly developed gait analysis method that allows easy quantitation of a large number of locomotion parameters during walkway crossing. We were able to extract data on interlimb coordination, swing duration, paw print areas (total over stance, and at 20-msec time resolution), stride length, and base of support. Similar data cannot be gathered by any single previously described method. We compare changes in gait parameters induced by two different models of spinal cord injury in rats, transection of the dorsal half of the spinal cord and spinal cord contusion injury induced by the NYU or MASCIS device. Although we applied this method to rats with spinal cord injury, the usefulness of this method is not limited to rats or to the investigation of spinal cord injuries alone.

  6. Analysis of yohimbine alkaloid from Pausinystalia yohimbe by non-aqueous capillary electrophoresis and gas chromatography-mass spectrometry.

    PubMed

    Chen, Qinhua; Li, Peng; Zhang, Zhuo; Li, Kaijun; Liu, Jia; Li, Qiang

    2008-07-01

    In the present work, the qualitative and quantitative analysis of Pausinystalia yohimbe-type alkaloids in the barks of Rubiaceae species is presented using different analytical approaches. Extracts of P. yohimbe were first examined by GC-MS and the major alkaloids were identified. The quantitation of yohimbine was then accomplished by non-aqueous CE (NACE) with diode array detection. This approach was selected in order to use a running buffer fully compatible with samples in organic solvent. In particular, a mixture of methanol containing ammonium acetate (20 mM) and glacial acetic acid was used as a BGE. The same analytical sample was subjected to GC-MS and NACE analysis; the different selectivity displayed by these techniques yielded different separation profiles that can be useful in phytochemical characterization of the extracts. The linear calibration range was 10-1000 microg/mL for yohimbine by both GC-MS and NACE analysis. The recovery of yohimbine was 91.2-94.0% with RSD 1.4-4.3%. The LODs for yohimbine were 0.6 microg/mL by GC-MS and 1.0 microg/mL by NACE. The GC-MS and NACE methods were successfully validated and applied to the quantitation of yohimbine.
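
    Quantitation over a linear calibration range like the 10-1000 microg/mL one quoted above amounts to a least-squares line through the standards and a back-calculation for the unknown. The sketch below uses synthetic detector responses, not the paper's data.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

conc = [10, 50, 100, 500, 1000]    # standard concentrations, ug/mL
area = [21, 101, 201, 1001, 2001]  # synthetic peak areas for illustration

slope, intercept = fit_line(conc, area)
unknown = (850 - intercept) / slope  # back-calculate an unknown from area 850
```

    In a validated method the same fit also supplies the residual standard deviation used for LOD/LOQ estimates.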

  7. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, as compared to standard principal component analysis (PCA), and used it in conjunction with Hotelling T2 statistical analysis to compare, qualify, and detect faults in the tested systems.
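
    The Hotelling T2 step can be illustrated on a toy scale: treat each scan as a small vector of image-quality metrics, estimate the mean and covariance from a baseline set, and flag scans whose T2 distance is large. Two invented metrics and invented numbers are used below; the real framework operates on the full metric set.

```python
def hotelling_t2(sample, data):
    """T2 = d' S^-1 d for a 2-metric sample against baseline data."""
    n = len(data)
    mx = sum(r[0] for r in data) / n
    my = sum(r[1] for r in data) / n
    # Sample covariance entries (2x2), written out explicitly.
    cxx = sum((r[0] - mx) ** 2 for r in data) / (n - 1)
    cyy = sum((r[1] - my) ** 2 for r in data) / (n - 1)
    cxy = sum((r[0] - mx) * (r[1] - my) for r in data) / (n - 1)
    det = cxx * cyy - cxy * cxy
    dx, dy = sample[0] - mx, sample[1] - my
    # Quadratic form with the 2x2 inverse expanded.
    return (dx * (cyy * dx - cxy * dy) + dy * (cxx * dy - cxy * dx)) / det

# Baseline scans: (toy MTF score, toy noise metric) per scan.
baseline = [(1.0, 0.50), (1.1, 0.49), (0.9, 0.52), (1.05, 0.48), (0.95, 0.51)]
ok = hotelling_t2((1.0, 0.50), baseline)      # near the baseline mean
faulty = hotelling_t2((1.6, 0.30), baseline)  # far outside the baseline cloud
```

    A threshold from the F distribution would normally convert T2 into a pass/fail decision; here the contrast between the two values carries the idea.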

  8. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
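
    One form of the "optimal scaling" idea is simple to sketch: when the data are only relative (arbitrary units), solve for the single scale factor that best maps model predictions onto the measurements, then score the residual. The closed-form least-squares scale is used below; all numbers are invented for illustration.

```python
def scaled_sse(model, data):
    """Optimal scale s* = sum(m*d)/sum(m*m), and the resulting sum of squares."""
    s = sum(m * d for m, d in zip(model, data)) / sum(m * m for m in model)
    return s, sum((s * m - d) ** 2 for m, d in zip(model, data))

model_pred = [1.0, 2.0, 3.0, 4.0]  # model output in model units
rel_data = [2.1, 3.9, 6.0, 8.1]    # measurements in relative (unitless) form

scale, sse = scaled_sse(model_pred, rel_data)
```

    Because the scale is fit rather than assumed, the residual compares only the shape of the prediction to the data, which is what non-quantitative measurements can support.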

  9. Analysis of DNA interactions using single-molecule force spectroscopy.

    PubMed

    Ritzefeld, Markus; Walhorn, Volker; Anselmetti, Dario; Sewald, Norbert

    2013-06-01

    Protein-DNA interactions are involved in many biochemical pathways and determine the fate of the corresponding cell. Qualitative and quantitative investigations on these recognition and binding processes are of key importance for an improved understanding of biochemical processes and also for systems biology. This review article focuses on atomic force microscopy (AFM)-based single-molecule force spectroscopy and its application to the quantification of forces and binding mechanisms that lead to the formation of protein-DNA complexes. AFM and dynamic force spectroscopy are exciting tools that allow for quantitative analysis of biomolecular interactions. Besides an overview on the method and the most important immobilization approaches, the physical basics of the data evaluation are described. Recent applications of AFM-based force spectroscopy to investigate DNA intercalation, complexes involving DNA aptamers, and peptide- and protein-DNA interactions are given.
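
    Dynamic force spectroscopy data are commonly evaluated with the Bell-Evans model, in which the most probable rupture force grows logarithmically with the loading rate. The sketch below uses illustrative parameters (thermal energy, barrier distance, zero-force off-rate), not values for any specific DNA-protein complex from the review.

```python
import math

kBT = 4.11e-21   # thermal energy at ~298 K, in joules
x_beta = 0.5e-9  # distance to the transition state, in meters (assumed)
k_off = 1.0      # zero-force dissociation rate, 1/s (assumed)

def rupture_force(r):
    """Bell-Evans most probable rupture force for loading rate r (N/s)."""
    return (kBT / x_beta) * math.log(r * x_beta / (k_off * kBT))

f_slow = rupture_force(1e-9)  # 1 nN/s
f_fast = rupture_force(1e-7)  # 100 nN/s: higher rate, higher rupture force
```

    Fitting measured rupture forces against log loading rate in this way is what yields the barrier distance and off-rate reported in such studies.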

  10. Application of microchip CGE for the analysis of PEG-modified recombinant human granulocyte-colony stimulating factors.

    PubMed

    Park, Eun Ji; Lee, Kyung Soo; Lee, Kang Choon; Na, Dong Hee

    2010-11-01

    The purpose of this study was to evaluate the microchip CGE (MCGE) for the analysis of PEG-modified granulocyte-colony stimulating factor (PEG-G-CSF) prepared with PEG-aldehydes. The unmodified and PEG-modified G-CSFs were analyzed by Protein 80 and 230 Labchips on the Agilent 2100 Bioanalyzer. The MCGE allowed size-based separation and quantitation of PEG-G-CSF. The Protein 80 Labchip was useful for PEG-5K-G-CSF, while the Protein 230 Labchip was more suitable for PEG-20K-G-CSF. The MCGE was also used to monitor a search for optimal PEG-modification (PEGylation) conditions to produce mono-PEG-G-CSF. This study demonstrates the usefulness of MCGE for monitoring and optimizing the PEGylation of G-CSF with the advantages of speed, minimal sample consumption, and automatic quantitation.

  11. Quantitative analysis and feature recognition in 3-D microstructural data sets

    NASA Astrophysics Data System (ADS)

    Lewis, A. C.; Suh, C.; Stukowski, M.; Geltmacher, A. B.; Spanos, G.; Rajan, K.

    2006-12-01

    A three-dimensional (3-D) reconstruction of an austenitic stainless-steel microstructure was used as input for an image-based finite-element model to simulate the anisotropic elastic mechanical response of the microstructure. The quantitative data-mining and data-warehousing techniques used to correlate regions of high stress with critical microstructural features are discussed. Initial analysis of elastic stresses near grain boundaries due to mechanical loading revealed low overall correlation with their location in the microstructure. However, the use of data-mining and feature-tracking techniques to identify high-stress outliers revealed that many of these high-stress points are generated near grain boundaries and grain edges (triple junctions). These techniques also allowed for the differentiation between high stresses due to boundary conditions of the finite volume reconstructed, and those due to 3-D microstructural features.
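
    The outlier-tracking idea can be sketched compactly: flag stress values in the upper tail of the distribution, then ask which flagged points lie close to a grain boundary. The data, the two-sigma cutoff, and the distance threshold below are all invented for illustration; the study's data-mining pipeline operates on full 3-D finite-element fields.

```python
import statistics

def high_stress_outliers(stresses, k=2.0):
    """Indices of values more than k standard deviations above the mean."""
    mu, sd = statistics.mean(stresses), statistics.stdev(stresses)
    return [i for i, s in enumerate(stresses) if s > mu + k * sd]

# Toy elastic stresses (MPa) at sampled points, and each point's distance
# (microns) to the nearest grain boundary.
stresses = [100, 105, 98, 102, 99, 101, 103, 180, 97, 100]
boundary_dist = [5.0, 4.2, 6.1, 3.3, 7.0, 2.8, 5.5, 0.4, 6.6, 4.9]

outliers = high_stress_outliers(stresses)
near_boundary = [i for i in outliers if boundary_dist[i] < 1.0]
```

    Correlating the flagged subset with microstructural features, rather than the whole field, is what revealed the grain-boundary and triple-junction association in the study.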

  12. Quantitative Monitoring of Microbial Species during Bioleaching of a Copper Concentrate.

    PubMed

    Hedrich, Sabrina; Guézennec, Anne-Gwenaëlle; Charron, Mickaël; Schippers, Axel; Joulian, Catherine

    2016-01-01

    Monitoring of the microbial community in bioleaching processes is essential in order to control process parameters and enhance the leaching efficiency. Suitable methods are, however, limited as they are usually not adapted to bioleaching samples and often no taxon-specific assays are available in the literature for these types of consortia. Therefore, our study focused on the development of novel quantitative real-time PCR (qPCR) assays for the quantification of Acidithiobacillus caldus, Leptospirillum ferriphilum, Sulfobacillus thermosulfidooxidans, and Sulfobacillus benefaciens and comparison of the results with data from other common molecular monitoring methods in order to evaluate their accuracy and specificity. Stirred tank bioreactors for the leaching of copper concentrate, housing a consortium of acidophilic, moderately thermophilic bacteria, relevant in several bioleaching operations, served as a model system. The microbial community analysis via qPCR allowed a precise monitoring of the evolution of total biomass as well as abundance of specific species. Data achieved by the standard fingerprinting methods, terminal restriction fragment length polymorphism (T-RFLP) and capillary electrophoresis single strand conformation polymorphism (CE-SSCP) on the same samples followed the same trend as qPCR data. The main added value of qPCR was, however, to provide quantitative data for each species whereas only relative abundance could be deduced from T-RFLP and CE-SSCP profiles. Additional value was obtained by applying two further quantitative methods which do not require nucleic acid extraction, total cell counting after SYBR Green staining and metal sulfide oxidation activity measurements via microcalorimetry. Overall, these complementary methods allow for an efficient quantitative microbial community monitoring in various bioleaching operations.
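
    The quantitative readout of a qPCR assay comes from a standard curve relating the quantification cycle (Cq) to log10 gene copies. The sketch below shows the back-calculation with textbook parameters (a slope of -3.32 corresponds to 100% amplification efficiency); the intercept and measured Cq are invented for illustration, not taken from the study's assays.

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """Back-calculate gene copies from Cq via Cq = intercept + slope*log10(copies)."""
    return 10 ** ((cq - intercept) / slope)

# A sample crossing threshold at cycle 24.72 on this (hypothetical) curve.
copies = copies_from_cq(24.72)
```

    Running species-specific primer sets against their own standard curves is what lets qPCR report absolute abundance per species, which fingerprinting methods such as T-RFLP cannot provide.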

  14. Sensitive method for the quantitation of droloxifene in plasma and serum by high-performance liquid chromatography employing fluorimetric detection.

    PubMed

    Tess, D A; Cole, R O; Toler, S M

    1995-12-15

    A simple and highly sensitive reversed-phase fluorimetric HPLC method for the quantitation of droloxifene from rat, monkey, and human plasma as well as human serum is described. This assay employs solid-phase extraction and has a dynamic range of 25 to 10,000 pg/ml. Sample extraction (efficiencies > 86%) was accomplished using a benzenesulfonic acid (SCX) column with water and methanol rinses. Droloxifene and internal standard were eluted with 1 ml of 3.5% (v/v) ammonium hydroxide (30%) in methanol. Samples were quantitated using post-column UV-photochemical cyclization coupled with fluorimetric detection with excitation and emission wavelengths of 260 nm and 375 nm, respectively. Relative ease of sample extraction and short run times allow for the analysis of approximately 100 samples per day.

  15. Photometric Determination of Ammonium and Phosphate in Seawater Medium Using a Microplate Reader.

    PubMed

    Ruppersberg, Hanna S; Goebel, Maren R; Kleinert, Svea I; Wünsch, Daniel; Trautwein, Kathleen; Rabus, Ralf

    2017-01-01

    To more efficiently process the large sample numbers for quantitative determination of ammonium (NH4+) and phosphate (orthophosphate, PO43-) generated during comprehensive growth experiments with the marine Roseobacter group member Phaeobacter inhibens DSM 17395, specific colorimetric assays employing a microplate reader (MPR) were established. The NH4+ assay is based on the reaction of NH4+ with hypochlorite and salicylate, yielding a limit of detection of 14 µM, a limit of quantitation of 36 µM, and a linear range for quantitative determination up to 200 µM. The PO43- assay is based on the complex formation of PO43- with ammonium molybdate in the presence of ascorbate and zinc acetate, yielding a limit of detection of 13 µM, a limit of quantitation of 50 µM, and a linear range for quantitative determination up to 1 mM. Both MPR-based assays allowed for fast (well under 1 h) analysis of 21 samples plus standards for calibration (all measured in triplicates) and showed only low variation across a large collection of biological samples. © 2017 S. Karger AG, Basel.
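
    Detection and quantitation limits like those quoted above are commonly derived from the standard deviation of blank replicates and the calibration slope (LOD = 3*sd/slope, LOQ = 10*sd/slope). The blank readings and slope below are invented for illustration, not the paper's raw data.

```python
import statistics

blank_absorbance = [0.010, 0.012, 0.009, 0.011, 0.013]  # replicate blanks
slope = 0.0021  # calibration slope, absorbance per uM (assumed)

sd = statistics.stdev(blank_absorbance)
lod = 3 * sd / slope   # limit of detection, uM
loq = 10 * sd / slope  # limit of quantitation, uM
```

    The 3-sigma/10-sigma convention fixes the LOQ/LOD ratio at 10/3 regardless of the assay, which is why reported LOQ values always sit a few-fold above the LOD.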

  16. SPICE Module for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Coggi, John; Carnright, Robert; Hildebrand, Claude

    2008-01-01

    A SPICE module for the Satellite Orbit Analysis Program (SOAP) precisely represents complex motion and maneuvers in an interactive, 3D animated environment with support for user-defined quantitative outputs. (SPICE stands for Spacecraft, Planet, Instrument, Camera-matrix, and Events). This module enables the SOAP software to exploit NASA mission ephemeris represented in the JPL Navigation and Ancillary Information Facility (NAIF) SPICE formats. Ephemeris types supported include position, velocity, and orientation for spacecraft and planetary bodies including the Sun, planets, natural satellites, comets, and asteroids. Entire missions can now be imported into SOAP for 3D visualization, playback, and analysis. The SOAP analysis and display features can now leverage detailed mission files to offer the analyst both a numerically correct and aesthetically pleasing combination of results that can be varied to study many hypothetical scenarios. The software provides a modeling and simulation environment that can encompass a broad variety of problems using orbital prediction. Examples include ground coverage analysis, communications analysis, power and thermal analysis, and 3D visualizations that provide the user with insight into complex geometric relations. The SOAP SPICE module allows distributed science and engineering teams to share common mission models of known pedigree, which greatly reduces duplication of effort and the potential for error. The use of the software spans all phases of the space system lifecycle, from the study of future concepts to operations and anomaly analysis. It allows SOAP software to correctly position and orient all of the principal bodies of the Solar System within a single simulation session along with multiple spacecraft trajectories and the orientation of mission payloads. In addition to the 3D visualization, the user can define numeric variables and x-y plots to quantitatively assess metrics of interest.

  17. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    PubMed

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, the neutral atmospheric gas and the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software, which allows fast and precise quantitative data analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and several measurement campaigns. The quantitative analysis of isotopes, using automatic data analysis, yields results with an accuracy of isotope ratios up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
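
    The peak-find-then-integrate step can be sketched minimally: locate local maxima above a threshold in the spectrum, sum intensities over a window around each, and form isotope ratios from the integrated areas. The synthetic spectrum and window choice below are illustrative only, not the authors' algorithm.

```python
def find_peaks(spectrum, threshold):
    """Indices of local maxima whose intensity exceeds the threshold."""
    return [i for i in range(1, len(spectrum) - 1)
            if spectrum[i] > threshold
            and spectrum[i] >= spectrum[i - 1]
            and spectrum[i] > spectrum[i + 1]]

def integrate(spectrum, center, half_width=2):
    """Sum intensities in a fixed window around a peak center."""
    lo = max(0, center - half_width)
    hi = min(len(spectrum), center + half_width + 1)
    return sum(spectrum[lo:hi])

# Toy time-of-flight spectrum with two isotope peaks.
spec = [0, 1, 0, 2, 10, 2, 0, 1, 5, 1, 0]
peaks = find_peaks(spec, threshold=3)
ratio = integrate(spec, peaks[1]) / integrate(spec, peaks[0])
```

    In practice the integration bounds, baseline subtraction, and noise handling dominate the achievable accuracy, which is why the paper's SNR^-1 scaling matters.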

  18. Quantitative analysis of cellular proteome alterations in human influenza A virus-infected mammalian cell lines.

    PubMed

    Vester, Diana; Rapp, Erdmann; Gade, Dörte; Genzel, Yvonne; Reichl, Udo

    2009-06-01

    Over the last years virus-host cell interactions were investigated in numerous studies. Viral strategies for evasion of innate immune response, inhibition of cellular protein synthesis and permission of viral RNA and protein production were disclosed. With quantitative proteome technology, comprehensive studies concerning the impact of viruses on the cellular machinery of their host cells at protein level are possible. Therefore, 2-D DIGE and nanoHPLC-nanoESI-MS/MS analysis were used to qualitatively and quantitatively determine the dynamic cellular proteome responses of two mammalian cell lines to human influenza A virus infection. A cell line used for vaccine production (MDCK) was compared with a human lung carcinoma cell line (A549) as a reference model. Analyzing 2-D gels of the proteomes of uninfected and influenza-infected host cells, 16 quantitatively altered protein spots (at least +/-1.7-fold change in relative abundance, p<0.001) were identified for both cell lines. Most significant changes were found for keratins, major components of the cytoskeleton system, and for Mx proteins, interferon-induced key components of the host cell defense. Time series analysis of infection processes allowed the identification of further proteins that are described to be involved in protein synthesis, signal transduction and apoptosis events. Most likely, these proteins are required for supporting functions during influenza viral life cycle or host cell stress response. Quantitative proteome-wide profiling of virus infection can provide insights into complexity and dynamics of virus-host cell interactions and may accelerate antiviral research and support optimization of vaccine manufacturing processes.

  19. An analytical approach based on ESI-MS, LC-MS and PCA for the quali-quantitative analysis of cycloartane derivatives in Astragalus spp.

    PubMed

    Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia

    2013-11-01

    Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS followed by PCA of ESI-MS data, was carried out. Successively, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of LC-ESI-MS data were performed. This approach made it possible to promptly highlight metabolite similarities and differences among the various Astragalus spp. PCA results from LC-ESI-MS data of Astragalus samples were in reasonable agreement with both PCA results of ESI-MS data and quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    PubMed Central

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-01-01

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Taking the most recent approaches used with non-traditional sources as a reference allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine. PMID:28248241

  2. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  3. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We revealed that calculation of both DF and Δn in the region of interest (fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46+/-0.21 vs 0.09+/-0.04 for DF; (4.7+/-1.0)×10^-4 vs (2.5+/-0.7)×10^-4 for Δn; p<0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaque, revealing different average isotropy indexes of collagen and elastin fibers for stable and unstable plaques (0.30 +/- 0.10 vs 0.70 +/- 0.08; p<0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.
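
    The depolarization factor described above is, in essence, a per-pixel ratio of cross- to co-polarization signal averaged over the region of interest. The sketch below uses invented signal values and a simple mean-of-ratios estimator; the paper's exact estimator may differ.

```python
def depolarization_factor(cross, co):
    """Mean per-pixel ratio of cross- to co-polarization OCT signal."""
    return sum(x / c for x, c in zip(cross, co)) / len(cross)

# Toy signal levels over three ROI pixels (arbitrary units).
cross_roi = [4.0, 4.0, 6.0]
co_roi = [10.0, 8.0, 12.0]

df = depolarization_factor(cross_roi, co_roi)
```

    Higher DF in the fibrous cap (as in the stable-plaque group above) reflects stronger cross-scattering by organized collagen and elastin fibers.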

  4. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  5. T1, diffusion tensor, and quantitative magnetization transfer imaging of the hippocampus in an Alzheimer's disease mouse model.

    PubMed

    Whittaker, Heather T; Zhu, Shenghua; Di Curzio, Domenico L; Buist, Richard; Li, Xin-Min; Noy, Suzanna; Wiseman, Frances K; Thiessen, Jonathan D; Martin, Melanie

    2018-07-01

    Alzheimer's disease (AD) pathology causes microstructural changes in the brain. These changes, if quantified with magnetic resonance imaging (MRI), could be studied for use as an early biomarker for AD. The aim of our study was to determine if T1 relaxation, diffusion tensor imaging (DTI), and quantitative magnetization transfer imaging (qMTI) metrics could reveal changes within the hippocampus and surrounding white matter structures in ex vivo transgenic mouse brains overexpressing human amyloid precursor protein with the Swedish mutation. Delineation of hippocampal cell layers using DTI color maps allows more detailed analysis of T1-weighted imaging, DTI, and qMTI metrics, compared with segmentation of gross anatomy based on relaxation images, and with analysis of DTI or qMTI metrics alone. These alterations are observed in the absence of robust intracellular Aβ accumulation or plaque deposition as revealed by histology. This work demonstrates that multiparametric quantitative MRI methods are useful for characterizing changes within the hippocampal substructures and surrounding white matter tracts of mouse models of AD. Copyright © 2018. Published by Elsevier Inc.

  6. Quantitative image analysis of cellular heterogeneity in breast tumors complements genomic profiling.

    PubMed

    Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian

    2012-10-24

    Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.
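
    The cellularity-based adjustment of copy number profiles described above can be sketched as a simple two-component mixture. The function below is a generic illustration of that idea under an assumed diploid-normal model, not the published algorithm:

```python
def cellularity_corrected_cn(log2_ratio, cellularity):
    """Back out the tumor copy number from an observed log2 ratio,
    modeling the bulk sample as a mixture of tumor cells (fraction
    `cellularity`) and normal diploid cells (copy number 2)."""
    # observed absolute copy number, with diploid normal as reference
    cn_observed = 2.0 * 2.0 ** log2_ratio
    # cn_observed = p * cn_tumor + (1 - p) * 2  =>  solve for cn_tumor
    return (cn_observed - 2.0 * (1.0 - cellularity)) / cellularity
```

    At 50% cellularity, an observed log2 ratio of 1 (bulk copy number 4) implies a tumor copy number of 6; ignoring cellularity would understate the aberration, which is why such a correction increases comparability between samples.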

  7. The genetic architecture of photosynthesis and plant growth-related traits in tomato.

    PubMed

    de Oliveira Silva, Franklin Magnum; Lichtenstein, Gabriel; Alseekh, Saleh; Rosado-Souza, Laise; Conte, Mariana; Suguiyama, Vanessa Fuentes; Lira, Bruno Silvestre; Fanourakis, Dimitrios; Usadel, Björn; Bhering, Leonardo Lopes; DaMatta, Fábio M; Sulpice, Ronan; Araújo, Wagner L; Rossi, Magdalena; de Setta, Nathalia; Fernie, Alisdair R; Carrari, Fernando; Nunes-Nesi, Adriano

    2018-02-01

    To identify genomic regions involved in the regulation of fundamental physiological processes such as photosynthesis and respiration, a population of Solanum pennellii introgression lines was analyzed. We determined phenotypes for physiological, metabolic, and growth-related traits, including gas exchange and chlorophyll fluorescence parameters. Data analysis allowed the identification of 208 physiological and metabolic quantitative trait loci, with 33 of these associated with smaller intervals of the genomic regions, termed BINs. Eight BINs were identified that were associated with higher assimilation rates than the recurrent parent M82. Two and 10 genomic regions were related to shoot and root dry matter accumulation, respectively. Nine genomic regions were associated with starch levels, whereas 12 BINs were associated with the levels of other metabolites. Additionally, a comprehensive and detailed annotation of the genomic regions spanning these quantitative trait loci allowed us to identify 87 candidate genes that putatively control the investigated traits. We confirmed 8 of these at the level of variance in gene expression. Taken together, our results allowed the identification of candidate genes that most likely regulate photosynthesis, primary metabolism, and plant growth and as such provide new avenues for crop improvement. © 2017 John Wiley & Sons Ltd.

  8. A hanging drop culture method to study terminal erythroid differentiation.

    PubMed

    Gutiérrez, Laura; Lindeboom, Fokke; Ferreira, Rita; Drissen, Roy; Grosveld, Frank; Whyatt, David; Philipsen, Sjaak

    2005-10-01

    To design a culture method allowing the quantitative and qualitative analysis of terminal erythroid differentiation. Primary erythroid progenitors derived either from mouse tissues or from human umbilical cord blood were differentiated using hanging drop cultures and compared to methylcellulose cultures. Cultured cells were analyzed by FACS to assess differentiation. We describe a practical culture method, adapting the previously described hanging drop culture system to conditions allowing terminal differentiation of primary erythroid progenitors. Using minimal volumes of media and small numbers of cells, we obtained quantitative terminal erythroid differentiation within two days of culture for murine cells and four days for human cells. The established methods for ex vivo culture of primary erythroid progenitors, such as methylcellulose-based burst-forming unit-erythroid (BFU-E) and colony-forming unit-erythroid (CFU-E) assays, allow the detection of committed erythroid progenitors but are of limited value for studying terminal erythroid differentiation. We show that the application of hanging drop cultures is a practical alternative that, in combination with clonogenic assays, enables a comprehensive assessment of the behavior of primary erythroid cells ex vivo in the context of genetic and drug-induced perturbations.

  9. Direct-injection chemiluminescence detector. Properties and potential applications in flow analysis.

    PubMed

    Koronkiewicz, Stanislawa; Kalinowski, Slawomir

    2015-02-01

    We present a novel chemiluminescence detector, with a cone-shaped detection chamber where the analytical reaction takes place. The sample and appropriate reagents are injected directly into the chamber in countercurrent using solenoid-operated pulse micro-pumps. The proposed detector allows for fast measurement of the chemiluminescence signal in stop-flow conditions from the moment of reagent mixing. To evaluate potential applications of the detector, the Fenton-like reaction of a luminol-H₂O₂ system with several transition metal ions (Co²⁺, Cu²⁺, Cr³⁺, Fe³⁺) as catalysts was investigated. The results demonstrate the suitability of the proposed detector for quantitative analysis and for investigations of reaction kinetics, particularly rapid reactions. A multi-pumping flow system was designed and optimized. The developed methodology demonstrated that the shape of the analytical signals strongly depends on the type and concentration of the metal ions. The application of the detector in quantitative analysis was assessed for determination of Fe(III). The direct-injection chemiluminescence detector allows for a sensitive and repeatable (R.S.D. 2%) determination. The intensity of chemiluminescence increased linearly in the range from about 0.5 to 10 mg L⁻¹ Fe(III), with a detection limit of 0.025 mg L⁻¹. The time of analysis depended mainly on reaction kinetics. It is possible to achieve a high sampling rate of 144 samples per hour. Copyright © 2014 Elsevier B.V. All rights reserved.
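
    The linear calibration reported above (chemiluminescence intensity versus Fe(III) concentration) corresponds to an ordinary least-squares line. The sketch below, with made-up numbers spanning the stated 0.5-10 mg/L working range, shows the pattern:

```python
import numpy as np

def fit_calibration(conc, intensity):
    """Least-squares calibration line: intensity = slope * conc + intercept."""
    slope, intercept = np.polyfit(conc, intensity, 1)
    return slope, intercept

def concentration(intensity, slope, intercept):
    """Invert the calibration line for an unknown sample."""
    return (intensity - intercept) / slope
```

    In practice the standards would be replicate injections of Fe(III) solutions, and the detection limit would be estimated from the blank signal spread relative to the fitted slope.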

  10. Dissection and Downstream Analysis of Zebra Finch Embryos at Early Stages of Development

    PubMed Central

    Murray, Jessica R.; Stanciauskas, Monika E.; Aralere, Tejas S.; Saha, Margaret S.

    2014-01-01

    The zebra finch (Taeniopygia guttata) has become an increasingly important model organism in many areas of research including toxicology1,2, behavior3, and memory and learning4,5,6. As the only songbird with a sequenced genome, the zebra finch has great potential for use in developmental studies; however, the early stages of zebra finch development have not been well studied. The lack of research on zebra finch development can be attributed to the difficulty of dissecting the small egg and embryo. The following dissection method minimizes embryonic tissue damage, which allows for investigation of morphology and gene expression at all stages of embryonic development. This permits quality bright-field and fluorescence imaging of embryos, use in molecular procedures such as in situ hybridization (ISH), cell proliferation assays, and RNA extraction for quantitative assays such as quantitative real-time PCR (qtRT-PCR). This technique allows investigators to study early stages of development that were previously difficult to access. PMID:24999108

  11. Is procrastination all that "bad"? A qualitative study of academic procrastination and self-worth in postgraduate university students.

    PubMed

    Abramowski, Anna

    2018-01-01

    Most of the existing literature has investigated the construct of procrastination using quantitative paradigms, primarily self-administered questionnaires. However, such approaches seem to limit insight, elaboration, and deeper understanding of central facets that might influence procrastination. The present qualitative study explored how a sample of postgraduate students from Cambridge University represented academic procrastination framed within their personal perspectives and context, using semistructured interviews. This study extends the existing quantitative literature by adding students' personal narratives and voices. Ten postgraduate students were interviewed and the data were analyzed using thematic analysis. The preponderance of the literature on academic procrastination has described it as a maladaptive and detrimental behavior. However, the present study found evidence supporting the existence of a positive form of procrastination as well: procrastination can sometimes be worthwhile, allowing further thinking time and enabling students to give more attention to detail. This suggests a reconsideration of the negative image commonly associated with procrastination.

  12. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  13. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  14. Time-lapse videos for physics education: specific examples

    NASA Astrophysics Data System (ADS)

    Vollmer, Michael; Möllmann, Klaus-Peter

    2018-05-01

    There are many physics experiments with time scales so long that they are usually shown neither in the physics classroom nor in student labs. However, they can be easily recorded with time-lapse cameras, and the resulting time-lapse videos allow qualitative and/or quantitative analysis of the underlying physics. Here, we present some examples from thermal physics (melting, evaporation, cooling) as well as diffusion processes.

  15. Extended Field Laser Confocal Microscopy (EFLCM): Combining automated Gigapixel image capture with in silico virtual microscopy

    PubMed Central

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-01-01

    Background Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds of signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Methods Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing, we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). Results We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. Conclusion The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes. PMID:18627634

  16. Extended Field Laser Confocal Microscopy (EFLCM): combining automated Gigapixel image capture with in silico virtual microscopy.

    PubMed

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-07-16

    Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds of signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing, we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle

    The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow the exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.

  18. Development of one novel multiple-target plasmid for duplex quantitative PCR analysis of roundup ready soybean.

    PubMed

    Zhang, Haibo; Yang, Litao; Guo, Jinchao; Li, Xiang; Jiang, Lingxi; Zhang, Dabing

    2008-07-23

    To enforce the labeling regulations for genetically modified organisms (GMOs), the application of reference molecules as calibrators is becoming essential for practical quantification of GMOs. However, the reported reference molecules with tandem multiple targets have proved unsuitable for duplex PCR analysis. In this study, we developed one unique plasmid molecule based on a pMD-18T vector with three exogenous target DNA fragments of Roundup Ready soybean GTS 40-3-2 (RRS), that is, the CaMV35S, NOS, and RRS event fragments, plus one fragment of the soybean endogenous Lectin gene. This Lectin gene fragment was separated from the three exogenous target DNA fragments of RRS by inserting one 2.6 kb DNA fragment with no relatedness to the RRS detection targets in the resultant plasmid. We then showed that this design allows the quantification of RRS using three duplex real-time PCR assays targeting the CaMV35S, NOS, and RRS events, employing this reference molecule as the calibrator. In these duplex PCR assays, the limits of detection (LOD) and quantification (LOQ) were 10 and 50 copies, respectively. For the quantitative analysis of practical RRS samples, the accuracy and precision were similar to those of simplex PCR assays; for instance, at the 1% level the mean biases of the simplex and duplex PCR assays were 4.0% and 4.6%, respectively, and statistical analysis (t-test) showed that the quantitative data from duplex and simplex PCR had no significant discrepancy for each soybean sample. Duplex PCR analysis thus has the advantages of saving the cost of PCR reactions and reducing the experimental errors of simplex PCR testing. The strategy reported in the present study will be helpful for the development of new reference molecules suitable for duplex PCR quantitative assays of GMOs.
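
    Quantification against a plasmid calibrator typically runs through a Ct-versus-log10(copies) standard curve for each target. The sketch below assumes that standard-curve approach with hypothetical dilution data; it is not a reproduction of the paper's exact quantification model:

```python
import numpy as np

def fit_standard_curve(log10_copies, ct_values):
    """Fit Ct = slope * log10(copies) + intercept from a plasmid
    dilution series (a slope near -3.32 indicates ~100% PCR efficiency)."""
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10.0 ** ((ct - intercept) / slope)

def gmo_content_percent(event_copies, endogene_copies):
    """GM content expressed as event copies relative to the
    endogenous reference gene (Lectin for soybean)."""
    return 100.0 * event_copies / endogene_copies
```

    In a duplex assay the event target and the Lectin target are read from the same tube, each against its own curve, before taking the ratio.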

  19. A comparison of cosegregation analysis methods for the clinical setting.

    PubMed

    Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H

    2018-04-01

    Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1, using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform counting meioses, which is unable to generate evidence for benign variants. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either could be combined in multifactorial calculations. Combining quantitative information will be important, as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and counting meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user-supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
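
    The simplest of the three methods, counting meioses, reduces to a power-of-two likelihood ratio. Here is a minimal sketch of that idea (the FLB and CSLR methods require full pedigree likelihoods and are not reproduced here):

```python
def counting_meioses_lr(n_informative_meioses):
    """Under the null hypothesis of no linkage, each informative
    meiosis co-transmits the variant with the phenotype with
    probability 1/2, so perfect cosegregation across m meioses has
    chance probability (1/2)**m. The likelihood ratio in favor of
    pathogenicity is therefore 2**m. Note the statistic only
    accumulates evidence *for* pathogenicity, consistent with the
    abstract's point that it cannot support benign classifications."""
    return 2.0 ** n_informative_meioses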

  20. Google glass based immunochromatographic diagnostic test analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.

  1. Capillary nano-immunoassays: advancing quantitative proteomics analysis, biomarker assessment, and molecular diagnostics.

    PubMed

    Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J

    2015-06-06

    There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immuno-assay platforms is performed.

  2. Quantitative RNA-seq analysis of the Campylobacter jejuni transcriptome

    PubMed Central

    Chaudhuri, Roy R.; Yu, Lu; Kanji, Alpa; Perkins, Timothy T.; Gardner, Paul P.; Choudhary, Jyoti; Maskell, Duncan J.

    2011-01-01

    Campylobacter jejuni is the most common bacterial cause of foodborne disease in the developed world. Its general physiology and biochemistry, as well as the mechanisms enabling it to colonize and cause disease in various hosts, are not well understood, and new approaches are required to understand its basic biology. High-throughput sequencing technologies provide unprecedented opportunities for functional genomic research. Recent studies have shown that direct Illumina sequencing of cDNA (RNA-seq) is a useful technique for the quantitative and qualitative examination of transcriptomes. In this study we report RNA-seq analyses of the transcriptomes of C. jejuni (NCTC11168) and its rpoN mutant. This has allowed the identification of hitherto unknown transcriptional units, and further defines the regulon that is dependent on rpoN for expression. The analysis of the NCTC11168 transcriptome was supplemented by additional proteomic analysis using liquid chromatography-MS. The transcriptomic and proteomic datasets represent an important resource for the Campylobacter research community. PMID:21816880
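
    Quantitative RNA-seq expression values are commonly normalized for gene length and sequencing depth. A generic RPKM-style sketch (one standard normalization, not necessarily the one used in this study) looks like:

```python
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    """Reads Per Kilobase of transcript per Million mapped reads:
    normalizes a raw read count for both gene length and library size."""
    per_kb = gene_length_bp / 1_000.0
    per_million = total_mapped_reads / 1_000_000.0
    return read_count / (per_kb * per_million)
```

    For example, 500 reads on a 2 kb gene in a library of 10 million mapped reads gives an RPKM of 25; without the length term, long genes would dominate any between-gene comparison.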

  3. Fourier analysis of human soft tissue facial shape: sex differences in normal adults.

    PubMed Central

    Ferrario, V F; Sforza, C; Schmitz, J H; Miani, A; Taroni, G

    1995-01-01

    Sexual dimorphism in human facial form involves both size and shape variations of the soft tissue structures. These variations are conventionally assessed using linear and angular measurements, as well as ratios, taken from photographs or radiographs. Unfortunately this metric approach provides adequate quantitative information about size only, leaving the problem of shape definition unaddressed. Mathematical methods such as the Fourier series allow a correct quantitative analysis of shape and of its changes. A method for the reconstruction of outlines starting from selected landmarks and for their Fourier analysis has been developed, and applied to analyse sex differences in the shape of the soft tissue facial contour in a group of healthy young adults. When standardised for size, no sex differences were found between either the cosine or the sine coefficients of the Fourier series expansion. This shape similarity was largely overwhelmed by the very evident size differences, and it could be measured only using the proper mathematical methods. PMID:8586558
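
    A size-standardised Fourier description of a closed outline can be sketched with complex Fourier descriptors. The normalization below (centering, then dividing by the first harmonic) is one common convention and an assumption here, not necessarily the authors' exact formulation:

```python
import numpy as np

def fourier_shape_coefficients(x, y, n_harmonics=8):
    """Complex Fourier descriptors of a closed outline sampled at
    equally spaced points, normalized for position and size."""
    z = np.asarray(x, dtype=float) + 1j * np.asarray(y, dtype=float)
    z = z - z.mean()                     # remove position
    coeffs = np.fft.fft(z) / len(z)      # per-harmonic amplitudes
    coeffs = coeffs / np.abs(coeffs[1])  # standardize for size
    return coeffs[1:n_harmonics + 1]
```

    A circle of any radius reduces to a single unit first harmonic, so remaining higher-harmonic energy measures departure from circularity: exactly the kind of size-free shape comparison the abstract calls for.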

  4. [Clinical research XXIII. From clinical judgment to meta-analyses].

    PubMed

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SRs) are studies designed to answer clinical questions on the basis of original articles. A meta-analysis (MTA) is the mathematical analysis of an SR. These analyses are divided into two groups: those which evaluate the measured results of quantitative variables (for example, the body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or healing or not). Quantitative variables generally use the mean difference analysis, and qualitative variables can be analyzed using several measures: the odds ratio (OR), relative risk (RR), absolute risk reduction (ARR), and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study, as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To take appropriate decisions based on an MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
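
    The qualitative-outcome effect measures named above come straight from a 2x2 table; a minimal sketch:

```python
def odds_ratio(a, b, c, d):
    """2x2 table: a/b = events/non-events in the treated group,
    c/d = events/non-events in the control group."""
    return (a * d) / (b * c)

def relative_risk(a, b, c, d):
    """Risk of the event in the treated group divided by the risk
    in the control group."""
    return (a / (a + b)) / (c / (c + d))
```

    With 10/100 events under treatment and 5/100 under control, RR is 2.0 while OR is about 2.11; the two measures diverge as events become common, which is one reason an MTA must state which measure its forest plot pools.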

  5. Evaluation of electrochemical, UV/VIS and Raman spectroelectrochemical detection of Naratriptan with screen-printed electrodes.

    PubMed

    Hernández, Carla Navarro; Martín-Yerga, Daniel; González-García, María Begoña; Hernández-Santos, David; Fanjul-Bolado, Pablo

    2018-02-01

    Naratriptan, an active pharmaceutical ingredient with antimigraine activity, was detected electrochemically on untreated screen-printed carbon electrodes (SPCEs). Cyclic voltammetry and differential pulse voltammetry were used to carry out quantitative analysis of this molecule (in a Britton-Robinson buffer solution at pH 3.0) through its irreversible, diffusion-controlled oxidation at a potential of +0.75 V (vs. the Ag pseudoreference electrode). The oxidation product of naratriptan is an indole-based dimer with a yellowish colour (absorption maximum at 320 nm), so UV-VIS spectroelectrochemistry was used for the first time as an in situ characterization and quantification technique for this molecule. A reflection configuration allowed its measurement on the untreated carbon-based electrode. Finally, time-resolved Raman spectroelectrochemistry was used as a powerful technique for qualitative and quantitative analysis of naratriptan. Electrochemically treated silver screen-printed electrodes are shown to be easy-to-use and cost-effective SERS substrates for the analysis of naratriptan. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Systematic exploration of essential yeast gene function with temperature-sensitive mutants

    PubMed Central

    Li, Zhijian; Vizeacoumar, Franco J; Bahr, Sondra; Li, Jingjing; Warringer, Jonas; Vizeacoumar, Frederick S; Min, Renqiang; VanderSluis, Benjamin; Bellay, Jeremy; DeVit, Michael; Fleming, James A; Stephens, Andrew; Haase, Julian; Lin, Zhen-Yuan; Baryshnikova, Anastasia; Lu, Hong; Yan, Zhun; Jin, Ke; Barker, Sarah; Datti, Alessandro; Giaever, Guri; Nislow, Corey; Bulawa, Chris; Myers, Chad L; Costanzo, Michael; Gingras, Anne-Claude; Zhang, Zhaolei; Blomberg, Anders; Bloom, Kerry; Andrews, Brenda; Boone, Charles

    2012-01-01

    Conditional temperature-sensitive (ts) mutations are valuable reagents for studying essential genes in the yeast Saccharomyces cerevisiae. We constructed 787 ts strains, covering 497 (~45%) of the 1,101 essential yeast genes, with ~30% of the genes represented by multiple alleles. All of the alleles are integrated into their native genomic locus in the S288C common reference strain and are linked to a kanMX selectable marker, allowing further genetic manipulation by synthetic genetic array (SGA)–based, high-throughput methods. We show two such manipulations: barcoding of 440 strains, which enables chemical-genetic suppression analysis, and the construction of arrays of strains carrying different fluorescent markers of subcellular structure, which enables quantitative analysis of phenotypes using high-content screening. Quantitative analysis of a GFP-tubulin marker identified roles for cohesin and condensin genes in spindle disassembly. This mutant collection should facilitate a wide range of systematic studies aimed at understanding the functions of essential genes. PMID:21441928

  7. Mapping the Extracellular and Membrane Proteome Associated with the Vasculature and the Stroma in the Embryo*

    PubMed Central

    Soulet, Fabienne; Kilarski, Witold W.; Roux-Dalvai, Florence; Herbert, John M. J.; Sacewicz, Izabela; Mouton-Barbosa, Emmanuelle; Bicknell, Roy; Lalor, Patricia; Monsarrat, Bernard; Bikfalvi, Andreas

    2013-01-01

    In order to map the extracellular or membrane proteome associated with the vasculature and the stroma in an embryonic organism in vivo, we developed a biotinylation technique for the chicken embryo and combined it with mass spectrometry and bioinformatic analysis. We also applied this procedure to implanted tumors growing on the chorioallantoic membrane or after the induction of granulation tissue. Membrane and extracellular matrix proteins were the most abundant components identified. Relative quantitative analysis revealed differential protein expression patterns in several tissues. Through a bioinformatic approach, we determined endothelial cell protein expression signatures, which allowed us to identify several proteins not previously reported to be associated with endothelial cells or the vasculature. This is the first study to apply in vivo biotinylation, in combination with robust label-free quantitative proteomics approaches and bioinformatic analysis, to an embryonic organism. It also provides the first description of the vascular and matrix proteome of the embryo, which might constitute the starting point for further developments. PMID:23674615

  8. [Improvement of 2-mercaptoimidazoline analysis in rubber products containing chlorine].

    PubMed

    Kaneko, Reiko; Haneishi, Nahoko; Kawamura, Yoko

    2012-01-01

    An improved analysis method for 2-mercaptoimidazoline in rubber products containing chlorine was developed. In the official method, 2-mercaptoimidazoline (20 µg/mL) is detected by TLC with two developing solvents, but this method is not quantitative. Instead, we employed HPLC using water-methanol (9 : 1) as the mobile phase. This procedure decreased interfering peaks, and the quantitation limit was 2 µg/mL of standard solution. 2-Mercaptoimidazoline was confirmed by GC/MS (5 µg/mL) and LC/MS (1 µg/mL) in scan mode. For preparation of the test solution, a soaking extraction method was used, in which 20 mL of methanol was added to the sample and allowed to stand overnight at about 40°C. This gave values similar to those of the Soxhlet extraction method (the official method) and was more convenient. The results indicate that our procedure is suitable for the analysis of 2-mercaptoimidazoline. When 2-mercaptoimidazoline is detected, it is confirmed by either GC/MS or LC/MS.

  9. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    PubMed

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation for the analysis of quantitative proteomics data. We present key statistical concepts that help researchers to design proteomics experiments, and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and detection of anomalies in the data and flaws in the data analysis, allowing a deeper assessment of the validity of the results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher-quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users who aim to automate MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Quantitative monitoring of tamoxifen in human plasma extended to 40 metabolites using liquid-chromatography high-resolution mass spectrometry: new investigation capabilities for clinical pharmacology.

    PubMed

    Dahmane, Elyes; Boccard, Julien; Csajka, Chantal; Rudaz, Serge; Décosterd, Laurent; Genin, Eric; Duretz, Bénédicte; Bromirski, Maciej; Zaman, Khalil; Testa, Bernard; Rochat, Bertrand

    2014-04-01

    Liquid-chromatography (LC) high-resolution (HR) mass spectrometry (MS) analysis can record HR full scans, a detection technique that shows selectivity and sensitivity comparable to the ion transitions (SRM) performed with triple-quadrupole (TQ) MS but that allows de facto determination of "all" ions, including drug metabolites. This is of potential utility in in vivo drug metabolism and pharmacovigilance studies, giving a more comprehensive insight into differences in drug biotransformation profiles between patients. This simultaneous quantitative and qualitative (Quan/Qual) approach has been tested with 20 patients chronically treated with tamoxifen (TAM). The absolute quantification of TAM and three metabolites in plasma was performed using HR-MS and TQ-MS and compared. The same LC-HR-MS analysis allowed the identification and relative quantification of 37 additional TAM metabolites. A number of new metabolites were detected in patients' plasma, including metabolites identified as didemethyl-trihydroxy-TAM-glucoside and didemethyl-tetrahydroxy-TAM-glucoside conjugates, corresponding to TAM with six and seven biotransformation steps, respectively. Multivariate analysis allowed relevant patterns of metabolites and ratios to be associated with TAM administration and CYP2D6 genotype. Two hydroxylated metabolites, α-OH-TAM and 4'-OH-TAM, were newly identified as putative CYP2D6 substrates. The relative quantification was precise (<20%), and the semiquantitative estimation suggests that metabolite levels are non-negligible. Metabolites could play an important role in drug toxicity, but their impact on drug-related side effects has been partially neglected because of the tremendous effort required with previous MS technologies. With present HR-MS, this situation should evolve through the straightforward determination of drug metabolites, enlarging the possibilities for studying inter- and intra-patient variability in drug metabolism and its related effects.

  11. The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1992-01-01

    The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.

  12. Recovery and Determination of Adsorbed Technetium on Savannah River Site Charcoal Stack Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lahoda, Kristy G.; Engelmann, Mark D.; Farmer, Orville T.

    2008-03-01

    Experimental results are provided for the sample analyses for technetium (Tc) in charcoal samples placed in-line with a Savannah River Site (SRS) processing stack effluent stream as a part of an environmental surveillance program. The method for Tc removal from charcoal was based on that originally developed with high-purity charcoal. Presented is the process that allowed for the quantitative analysis of 99Tc in SRS charcoal stack samples with and without 97Tc as a tracer. The results obtained with the method using the 97Tc tracer quantitatively confirm the results obtained with no tracer added. All samples contain 99Tc at the pg g-1 level.

  13. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  14. A new 3D tracking method for cell mechanics investigation exploiting the capabilities of digital holography in microscopy

    NASA Astrophysics Data System (ADS)

    Miccio, L.; Memmolo, P.; Merola, F.; Fusco, S.; Netti, P. A.; Ferraro, P.

    2014-03-01

    A method for 3D tracking has been developed exploiting the features of Digital Holography in Microscopy (DHM). In the framework of a self-consistent platform for the manipulation and measurement of biological specimens, we use DHM for quantitative and completely label-free analysis of samples with low amplitude contrast. The tracking capability extends the potential of DHM, allowing the motion of appropriate probes to be monitored and correlated with sample properties. Complete 3D tracking has been obtained for the probes, avoiding the amplitude refocusing of traditional tracking processes. Moreover, one of the main topics in biology and biomedical research is the understanding of the morphology and mechanics of cells and microorganisms. Biological samples present low amplitude contrast, which limits the information that can be retrieved through bright-field optical microscopy: the main effect of such objects on propagating light is on its phase, known as phase retardation or phase shift. DHM is an innovative and alternative approach in microscopy and a good candidate for non-invasive and complete specimen analysis, because its main characteristic is the ability to separate intensity and phase information, performing quantitative mapping of the optical path length. In this paper, the flexibility of DH is employed to analyse the cell mechanics of unstained cells subjected to appropriate stimuli. DHM is used to measure all the parameters needed to understand the deformations induced by external, controlled stresses on in vitro cells. Our configuration allows 3D tracking of micro-particles and simultaneously furnishes quantitative phase-contrast maps. Experimental results are presented and discussed for in vitro cells.

  15. Analyzing Two-Phase Single-Case Data with Non-overlap and Mean Difference Indices: Illustration, Software Tools, and Alternatives.

    PubMed

    Manolov, Rumen; Losada, José L; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2016-01-01

    Two-phase single-case designs, in which baseline evaluation is followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow a causal relation between the intervention and the behavior to be demonstrated. Although the statistical options reviewed here cannot overcome this methodological limitation, we aim to make practitioners and applied researchers aware of the appropriate options available for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make visual and quantitative analysis feasible, open-source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on, pointing out the situations (aims, data patterns) for which they are potentially useful.
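
The non-overlap of all pairs (NAP) index mentioned above compares every baseline observation with every intervention observation. A minimal sketch of the standard definition (improving pairs count 1, ties 0.5, divided by the total number of pairs); the function name and signature are illustrative, not the cited software:

```python
def nap(baseline, intervention, improvement="increase"):
    """Non-overlap of All Pairs for a two-phase (AB) single-case data set.

    Every (baseline, intervention) pair is compared; a pair in the improved
    direction scores 1, a tie scores 0.5. NAP = total score / number of pairs,
    so 1.0 means complete non-overlap between phases.
    """
    if improvement == "decrease":               # treat lower values as improvement
        baseline, intervention = intervention, baseline
    pairs = [(a, b) for a in baseline for b in intervention]
    score = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return score / len(pairs)

full = nap([2, 3, 4], [5, 6, 7])    # every intervention point exceeds baseline
partial = nap([2, 5], [4, 5])       # some overlap and one tie
```

In the first call all nine pairs improve, so NAP = 1.0; in the second, two of four pairs improve and one ties, giving NAP = 2.5/4 = 0.625.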

  16. Analyzing Two-Phase Single-Case Data with Non-overlap and Mean Difference Indices: Illustration, Software Tools, and Alternatives

    PubMed Central

    Manolov, Rumen; Losada, José L.; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2016-01-01

    Two-phase single-case designs, in which baseline evaluation is followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow a causal relation between the intervention and the behavior to be demonstrated. Although the statistical options reviewed here cannot overcome this methodological limitation, we aim to make practitioners and applied researchers aware of the appropriate options available for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make visual and quantitative analysis feasible, open-source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on, pointing out the situations (aims, data patterns) for which they are potentially useful. PMID:26834691

  17. Shallow Investigations of the Deep Seafloor: Quantitative Morphology in the Levant Basin, Eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Kanari, M.; Ketter, T.; Tibor, G.; Schattner, U.

    2017-12-01

    We aim to characterize the seafloor morphology and its shallow sub-surface structures and deformations in the deep part of the Levant basin (eastern Mediterranean) using recently acquired high-resolution shallow seismic reflection data and multibeam bathymetry, which allow quantitative analysis of morphology and structure. The Levant basin in the eastern Mediterranean is considered a passive continental margin, where most of the recent geological processes have been related in the literature to salt tectonics rooted in the Messinian deposits from 6 Ma. We analyzed two sets of recently acquired high-resolution data, multibeam bathymetry and 3.5 kHz Chirp sub-bottom seismic reflection profiles, from the deep basin offshore Israel (water depths up to 2100 m). Semi-automatic mapping of seafloor features and seismic data interpretation resulted in quantitative morphological analysis of the seafloor and its underlying sediment, with penetration depths up to 60 m. The quantitative analysis and its interpretation are still in progress. Preliminary results reveal distinct morphologies of four major elements: channels, faults, folds and sediment waves, validated by the seismic data. From the spatial distribution and orientation analyses of these phenomena, we identify two primary types of process that dominate the formation of the seafloor in the Levant basin: structural and sedimentary. Characterizing the geological and geomorphological processes forming the seafloor helps to better understand transport mechanisms and the relations between sediment transport and deposition in deep water and in the shallower parts of the shelf and slope.

  18. Graphene oxide membrane as an efficient extraction and ionization substrate for spray-mass spectrometric analysis of malachite green and its metabolite in fish samples.

    PubMed

    Wei, Shih-Chun; Fan, Shen; Lien, Chia-Wen; Unnikrishnan, Binesh; Wang, Yi-Sheng; Chu, Han-Wei; Huang, Chih-Ching; Hsu, Pang-Hung; Chang, Huan-Tsung

    2018-03-20

    A graphene oxide (GO) nanosheet-modified N+-nylon membrane (GOM) has been prepared and used as an extraction and spray-ionization substrate for robust mass spectrometric detection of malachite green (MG), a highly toxic disinfectant, in liquid samples and fish meat. The GOM is prepared by self-deposition of a GO thin film onto an N+-nylon membrane, which has been used for efficient extraction of MG from aquaculture water samples or homogenized fish meat samples. With a dissociation constant of 2.17 × 10⁻⁹ M⁻¹, the GOM allows extraction of approximately 98% of 100 nM MG. Coupling the GOM-spray with an ion-trap mass spectrometer allows quantitation of MG in aquaculture freshwater and seawater samples down to nanomolar levels. Furthermore, the system possesses high selectivity and sensitivity for the quantitation of MG and its metabolite (leucomalachite green) in fish meat samples. With the easy extraction and efficient spray-ionization properties of the GOM, this membrane spray-mass spectrometry technique is relatively simple and fast compared with traditional LC-MS/MS methods for the quantitation of MG and its metabolite in aquaculture products. Copyright © 2017 Elsevier B.V. All rights reserved.
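
As a back-of-envelope consistency check only: the quoted ~98% extraction of 100 nM MG is what a simple Langmuir-type binding relation predicts for a dissociation constant of 2.17 nM. This assumes the quoted constant is a dissociation constant in molar units (the abstract's "M⁻¹" suggests it may actually be reported as an association constant), so the sketch is purely illustrative:

```python
def langmuir_fraction(conc_m, kd_m):
    """Langmuir-type fractional capture: theta = C / (C + Kd).

    conc_m: analyte concentration (M); kd_m: dissociation constant (M).
    """
    return conc_m / (conc_m + kd_m)

# 100 nM malachite green against an assumed Kd of 2.17 nM.
theta = langmuir_fraction(100e-9, 2.17e-9)
```

This gives theta = 100/102.17 ≈ 0.98, matching the abstract's "approximately 98%" figure.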

  19. Finding the bottom and using it

    PubMed Central

    Sandoval, Ruben M.; Wang, Exing; Molitoris, Bruce A.

    2014-01-01

    Optimizing the 2-photon parameters used in acquiring images for quantitative intravital microscopy, especially when high sensitivity is required, remains an open area of investigation. Here we present data on correctly setting the black level of the photomultiplier tube amplifier by adjusting the offset, to allow accurate quantitation of low-intensity processes. When the black level is set too high, some low-intensity pixel values become zero and a nonlinear degradation in sensitivity occurs, rendering otherwise quantifiable low-intensity values virtually undetectable. Initial studies using a series of increasing offsets for a sequence of concentrations of fluorescent albumin in vitro revealed a loss of sensitivity at higher offsets for lower albumin concentrations. A similar decrease in sensitivity, and therefore in the ability to correctly determine the glomerular permeability coefficient of albumin, occurred in vivo at higher offsets. Finding the offset that yields accurate and linear data is essential for quantitative analysis when high sensitivity is required. PMID:25313346
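
The nonlinearity described above can be illustrated with a toy simulation (not the authors' instrument model): if the digitizer clips at zero, an offset that pushes low-intensity pixels below zero floors them, so the mean detected signal is no longer proportional to the true intensity. All parameter values here are assumed for illustration.

```python
import numpy as np

def detected_mean(true_intensity, offset, noise_sd=1.0, n=100_000, seed=0):
    """Mean digitized signal when the amplifier adds an offset and the
    digitizer clips at zero. A strongly negative offset floors low-intensity
    pixels at 0, making the intensity response nonlinear."""
    rng = np.random.default_rng(seed)
    raw = true_intensity + offset + rng.normal(0.0, noise_sd, n)
    return float(np.clip(raw, 0.0, None).mean())

# Low and high "albumin concentrations" (arbitrary intensity units),
# with a well-set offset (0) vs. a badly set one (-3).
lo_good = detected_mean(1.0, 0.0)
lo_bad = detected_mean(1.0, -3.0)
hi_good = detected_mean(10.0, 0.0)
hi_bad = detected_mean(10.0, -3.0)
```

With the good offset the low/high signal ratio stays near the true 1:10; with the bad offset the low-intensity signal is almost entirely clipped away while the high-intensity signal is merely shifted, so the ratio collapses, mirroring the loss of sensitivity at low albumin concentrations.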

  20. Global, quantitative and dynamic mapping of protein subcellular localization.

    PubMed

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh

    2016-06-09

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.

  1. Localization-based super-resolution imaging meets high-content screening.

    PubMed

    Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste

    2017-12-01

    Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.

  2. Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials

    DOEpatents

    Boucheron, Laura E

    2013-07-16

    Quantitative object- and spatial-arrangement-level analysis of tissue is detailed, using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue: classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprise nuclear material, cytoplasm material, and stromal material. The method further allows a user to mark up the image subsequent to the classification in order to re-classify said materials. The markup is performed via a graphical user interface to edit designated regions in the image.
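
The two-step structure (pixel-level classification, then agglomeration into objects) can be sketched as follows. This is a deliberately simplified stand-in: a fixed intensity-band rule replaces the trained, pathologist-guided classifier, and a 4-connected flood fill replaces the patented segmentation; all thresholds and names are illustrative.

```python
import numpy as np

def classify_pixels(img, bands):
    """Step 1: pixel-by-pixel classification.  `bands` maps a class id to an
    intensity interval [lo, hi); a stand-in for a trained classifier."""
    labels = np.zeros(img.shape, dtype=int)
    for cls, (lo, hi) in bands.items():
        labels[(img >= lo) & (img < hi)] = cls
    return labels

def agglomerate(labels, cls):
    """Step 2: agglomerate identified pixels of one class into segmented
    regions using a 4-connected flood fill."""
    mask = labels == cls
    regions = np.zeros(mask.shape, dtype=int)
    count = 0
    for i, j in zip(*np.nonzero(mask)):
        if regions[i, j]:
            continue                            # already assigned to a region
        count += 1
        stack = [(i, j)]
        while stack:
            a, b = stack.pop()
            if not (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]):
                continue
            if not mask[a, b] or regions[a, b]:
                continue
            regions[a, b] = count
            stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return regions, count

# Toy "stained tissue" image: bright pixels (class 1) vs. dark (class 2).
img = np.array([[0.9, 0.9, 0.1, 0.1],
                [0.9, 0.1, 0.1, 0.9],
                [0.1, 0.1, 0.9, 0.9]])
labels = classify_pixels(img, {1: (0.5, 1.01), 2: (0.0, 0.5)})
regions, n_objects = agglomerate(labels, 1)    # two separate bright objects
```

The pathologist-markup step of the patent corresponds to editing `labels` before re-running the agglomeration.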

  3. Variation compensation and analysis on diaphragm curvature analysis for emphysema quantification on whole lung CT scans

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Barr, R. Graham; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    CT scans allow for the quantitative evaluation of the anatomical bases of emphysema. Recently, a non-density-based geometric measurement of diaphragm curvature has been proposed as a method for the quantification of emphysema from CT. This work analyzes the variability of diaphragm curvature and evaluates the effectiveness of a compensation methodology for reducing this variability, as compared with emphysema index. Using a dataset of 43 scan pairs with less than a 100-day interval between scans, we find that diaphragm curvature showed a trend towards lower overall variability than emphysema index (95% CI: -9.7 to +14.7 vs. -15.8 to +12.0), and that the variation of both measures was reduced after compensation. We conclude that the variation of the new measure can be considered comparable to that of the established measure, and that the compensation can successfully reduce the apparent variation of quantitative measures.
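
The geometric curvature measure itself is not reproducible from the abstract, but the density-based comparator, emphysema index, is simply the percentage of lung voxels below a CT attenuation threshold (conventionally -950 HU at full inspiration). A minimal sketch with made-up voxel values:

```python
import numpy as np

def emphysema_index(lung_hu, threshold=-950):
    """Percentage of lung voxels whose attenuation falls below `threshold`
    Hounsfield units; the standard density-based emphysema measure."""
    lung_hu = np.asarray(lung_hu)
    return 100.0 * float((lung_hu < threshold).mean())

# Hypothetical lung voxel attenuations (HU); 4 of 8 fall below -950 HU.
hu = np.array([-980, -960, -940, -900, -850, -970, -800, -990])
ei = emphysema_index(hu)
```

Scan-to-scan differences in this percentage are what the abstract's repeatability interval (-15.8 to +12.0) quantifies for the established measure.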

  4. Advanced STEM microanalysis of bimetallic nanoparticle catalysts

    NASA Astrophysics Data System (ADS)

    Lyman, Charles E.; Dimick, Paul S.

    2012-05-01

    Individual particles within bimetallic nanoparticle populations are not always identical, limiting the usefulness of bulk analysis techniques such as EXAFS. The scanning transmission electron microscope (STEM) is the only instrument able to characterize supported nanoparticle populations on a particle-by-particle basis. Quantitative elemental analyses of sub-5-nm particles reveal phase separations among particles and surface segregation within particles. This knowledge can lead to improvements in bimetallic catalysts. Advanced STEMs with field-emission guns, aberration-corrected optics, and efficient signal detection systems allow analysis of sub-nanometer particles.

  5. Systems-Level Analysis of Innate Immunity

    PubMed Central

    Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan

    2014-01-01

    Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298

  6. A database system to support image algorithm evaluation

    NASA Technical Reports Server (NTRS)

    Lien, Y. E.

    1977-01-01

    The design of an interactive image database system, IMDB, is given; the system allows the user to create, retrieve, store, display, and manipulate images through the facility of a high-level, interactive image query (IQ) language. The IQ language permits the user to define false-color functions, pixel-value transformations, overlay functions, zoom functions, and windows. The user manipulates images through generic functions and can direct images to display devices for visual and qualitative analysis. Image histograms and pixel-value distributions can also be computed to obtain a quantitative analysis of images.

  7. Sensitive molecular diagnostics using surface-enhanced resonance Raman scattering (SERRS)

    NASA Astrophysics Data System (ADS)

    Faulds, Karen; Graham, Duncan; McKenzie, Fiona; MacRae, Douglas; Ricketts, Alastair; Dougan, Jennifer

    2009-02-01

    Surface-enhanced resonance Raman scattering (SERRS) is an analytical technique with several advantages over competing techniques in terms of improved sensitivity and multiplexing. We have made great progress in the development of SERRS as a quantitative analytical method, in particular for the detection of DNA. SERRS is an extremely sensitive and selective technique which, when applied to the detection of labelled DNA sequences, allows detection limits to be obtained which rival, and in most cases better, those of fluorescence. Here we explore the conditions that enable the successful detection of DNA using SERRS. The enhancing surface is crucial; in this case suspensions of nanoparticles were used, as they allow quantitative behaviour to be achieved and allow systems analogous to current fluorescence-based systems to be made. The aggregation conditions required to obtain SERRS of DNA are also crucial, and herein we describe the use of spermine as an aggregating agent. The nature of the label used, be it fluorescent, positively or negatively charged, also affects the SERRS response, and these conditions are again explored here. We have clearly demonstrated the ability to identify the components of a mixture of 5 analytes in solution by using two different excitation wavelengths, and of a 6-plex using data analysis techniques. These conditions will allow the use of SERRS for the detection of target DNA in a meaningful diagnostic assay.

  8. Redefining the Breast Cancer Exosome Proteome by Tandem Mass Tag Quantitative Proteomics and Multivariate Cluster Analysis.

    PubMed

    Clark, David J; Fondrie, William E; Liao, Zhongping; Hanson, Phyllis I; Fulton, Amy; Mao, Li; Yang, Austin J

    2015-10-20

    Exosomes are microvesicles of endocytic origin constitutively released by multiple cell types into the extracellular environment. With evidence that exosomes can be detected in the blood of patients with various malignancies, the development of a platform that uses exosomes as a diagnostic tool has been proposed. However, it has been difficult to truly define the exosome proteome due to the challenge of discerning contaminant proteins that may be identified via mass spectrometry using various exosome enrichment strategies. To better define the exosome proteome in breast cancer, we incorporated a combination of a tandem mass tag (TMT) quantitative proteomics approach and support vector machine (SVM) cluster analysis of three conditioned-media-derived fractions corresponding to a 10,000g cellular debris pellet, a 100,000g crude exosome pellet, and an OptiPrep-enriched exosome pellet. The quantitative analysis identified 2,179 proteins in all three fractions, with known exosomal cargo proteins displaying at least a 2-fold enrichment in the exosome fraction based on the TMT protein ratios. Employing SVM cluster analysis allowed for the classification of 251 proteins as "true" exosomal cargo proteins. This study provides a robust and rigorous framework for the future development of exosomes as a potential multiprotein marker phenotyping tool that could be useful in breast cancer diagnosis and monitoring of disease progression.
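
    The 2-fold enrichment criterion mentioned in the abstract reduces to simple arithmetic on TMT reporter ratios. The sketch below is an illustrative assumption about how such a filter could look (the fraction names, threshold handling, and data layout are hypothetical), and it deliberately omits the SVM clustering step that the study uses for final cargo classification.

```python
def exosome_enriched(tmt_signals, fold=2.0):
    """Flag proteins showing >= `fold` enrichment in the exosome fraction.

    `tmt_signals` maps protein -> (debris, crude_exosome, enriched_exosome)
    TMT reporter intensities. A protein counts as exosome-enriched when its
    enriched-fraction signal is at least `fold` times its debris signal.
    (Illustrative arithmetic only; the published pipeline additionally
    applies SVM cluster analysis to call "true" exosomal cargo.)
    """
    return {p for p, (debris, crude, enriched) in tmt_signals.items()
            if debris > 0 and enriched / debris >= fold}

signals = {"CD63": (1.0, 3.0, 4.5), "ALB": (5.0, 4.0, 4.0)}
print(exosome_enriched(signals))  # CD63 is 4.5-fold enriched; ALB is not
```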

  9. Quantitative detection of powdered activated carbon in wastewater treatment plant effluent by thermogravimetric analysis (TGA).

    PubMed

    Krahnstöver, Therese; Plattner, Julia; Wintgens, Thomas

    2016-09-15

    For the elimination of potentially harmful micropollutants, powdered activated carbon (PAC) adsorption is applied in many wastewater treatment plants (WWTP). This holds the risk of PAC leakage into the WWTP effluent and desorption of contaminants into natural water bodies. In order to assess a potential PAC leakage, PAC concentrations below several mg/L have to be detected in the WWTP effluent. None of the methods currently used for water analysis can differentiate between activated carbon and solid background matrix. Thus, a selective, quantitative and easily applicable method is still needed for the detection of PAC residues in wastewater. In the present study, a method was developed to quantitatively measure the PAC content in wastewater by using filtration and thermogravimetric analysis (TGA), which is a well-established technique for the distinction between different solid materials. For the sample filtration, quartz filters with temperature stability up to 950 °C were used. This allowed for sensitive and highly reproducible measurements, as the TGA was not affected by the presence of the filter. The sample's mass fractions were calculated by integrating the mass decrease rate obtained by TGA in specific, clearly identifiable peak areas. A two-step TGA heating method consisting of N2 and O2 atmospheres led to a good differentiation between PAC and biological background matrix, thanks to the reduction of peak overlapping. A linear correlation was found between a sample's PAC content and the corresponding peak areas under N2 and O2, the sample volume and the solid mass separated by filtration. Based on these findings, various wastewater samples from different WWTPs were then analyzed by TGA with regard to their PAC content. It was found that, compared to alternative techniques such as measurement of turbidity or total suspended solids, the newly developed TGA method allows for a quantitative and selective detection of PAC concentrations down to 0.1 mg/L. The method showed a linearity coefficient of 0.98 and relative standard deviations of 10%, using small water sample volumes between 0.3 and 0.6 L. Copyright © 2016 Elsevier Ltd. All rights reserved.
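
    The linear correlation reported above implies a simple calibration workflow: fit a response factor (peak area per unit PAC mass) from standards, then convert a sample's peak area and filtered volume into a concentration. The sketch below assumes a through-origin linear model with made-up coefficients and units; it is not the study's actual calibration.

```python
def fit_slope(peak_areas, pac_masses):
    """Least-squares slope through the origin: TGA peak area per mg PAC.

    Calibration standards: known PAC masses (mg) and their measured
    peak areas. (Hypothetical calibration; units and values are not
    taken from the study.)
    """
    num = sum(a * m for a, m in zip(peak_areas, pac_masses))
    den = sum(m * m for m in pac_masses)
    return num / den

def pac_concentration(peak_area, slope, sample_volume_l):
    """PAC concentration (mg/L) from a sample's peak area and filtered volume."""
    return peak_area / slope / sample_volume_l

slope = fit_slope([2.0, 4.1, 6.0], [1.0, 2.0, 3.0])   # ~2.0 area units per mg
print(pac_concentration(0.4, slope, sample_volume_l=0.5))  # ~0.4 mg/L
```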

  10. Focussed ion beam thin sample microanalysis using a field emission gun electron probe microanalyser

    NASA Astrophysics Data System (ADS)

    Kubo, Y.

    2018-01-01

    Field emission gun electron probe microanalysis (FEG-EPMA) in conjunction with wavelength-dispersive X-ray spectrometry at a low acceleration voltage (Vacc) allows elemental analysis with sub-micrometre lateral spatial resolution (SR). However, this degree of SR does not necessarily meet the requirements associated with increasingly miniaturised devices. Another challenge in performing FEG-EPMA at low Vacc is that the accuracy of quantitative analyses is adversely affected, primarily because low-energy X-ray lines such as the L- and M-lines must be employed and because of the potential for line interference. One promising means of obtaining high SR with FEG-EPMA is to use thin samples together with high Vacc values. This mini-review covers the basic principles of thin-sample FEG-EPMA and describes an application of this technique to the analysis of optical fibres. Outstanding issues related to this technique that must be addressed are also discussed, including the potential for electron beam damage during analysis of insulating materials and the development of methods to use thin samples for quantitative analysis.

  11. Improved assay to detect Plasmodium falciparum using an uninterrupted, semi-nested PCR and quantitative lateral flow analysis

    PubMed Central

    2013-01-01

    Background A rapid, non-invasive, and inexpensive point-of-care (POC) diagnostic for malaria followed by therapeutic intervention would improve the ability to control infection in endemic areas. Methods A semi-nested PCR amplification protocol is described for quantitative detection of Plasmodium falciparum and is compared to a traditional nested PCR. The approach uses primers that target the P. falciparum dihydrofolate reductase gene. Results This study demonstrates that it is possible to perform an uninterrupted, asymmetric, semi-nested PCR assay with reduced assay time to detect P. falciparum without compromising the sensitivity and specificity of the assay, using saliva as a testing matrix. Conclusions The development of this PCR allows nucleic acid amplification without the need to transfer amplicon from the first PCR step to a second reaction tube with nested primers, thus reducing both the chance of contamination and the time for analysis to less than two hours. Analysis of the PCR amplicon yield was adapted to lateral flow detection using the quantitative up-converting phosphor (UCP) reporter technology. This approach provides a basis for migration of the assay to a POC microfluidic format. In addition, the assay was successfully evaluated with oral samples. Oral fluid collection provides a simple, non-invasive method to collect clinical samples. PMID:23433252

  12. Quantitative analysis of detailed lignin monomer composition by pyrolysis-gas chromatography combined with preliminary acetylation of the samples.

    PubMed

    Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S

    2001-11-15

    Detailed quantitative analysis of lignin monomer composition, comprising p-coumaryl, coniferyl, and sinapyl alcohols and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde, in plants has not been fully achieved, mainly because of artifact formation during the lignin isolation procedure, partial loss of lignin components inherent in the chemical degradative methods, and difficulty in interpreting the complex spectra generally observed for the lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. The acetylation procedure helps prevent secondary formation of cinnamaldehydes from the corresponding alcohol forms during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks in the pyrograms of the acetylated samples, lignin monomer compositions in various dehydrogenative polymers (DHPs) used as lignin model compounds were determined, taking even minor components such as cinnamaldehydes into consideration. The compositions observed by Py-GC were in good agreement with the lignin monomer contents supplied during DHP synthesis. The new Py-GC method combined with sample preacetylation allowed accurate quantitative analysis of detailed lignin monomer composition using microgram quantities of extractive-free plant samples.

  13. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    PubMed

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.

  14. A simple quantitative diagnostic alternative for MGMT DNA-methylation testing on RCL2 fixed paraffin embedded tumors using restriction coupled qPCR.

    PubMed

    Pulverer, Walter; Hofner, Manuela; Preusser, Matthias; Dirnberger, Elisabeth; Hainfellner, Johannes A; Weinhaeusel, Andreas

    2014-01-01

    MGMT promoter methylation is associated with favorable prognosis and chemosensitivity in glioblastoma multiforme (GBM), especially in elderly patients. We aimed to develop a simple methylation-sensitive restriction enzyme (MSRE)-based quantitative PCR (qPCR) assay allowing the quantification of MGMT promoter methylation. DNA was extracted from non-neoplastic brain (n = 24) and GBM samples (n = 20) under 3 different sample conservation conditions (-80 °C; formalin-fixed and paraffin-embedded (FFPE); RCL2-fixed). We evaluated the suitability of each fixation method with respect to the MSRE-coupled qPCR methylation analyses. Methylation data were validated by MALDI-TOF. qPCR was used for evaluation of alternative tissue conservation procedures. DNA from FFPE tissue failed reliable testing; DNA from both RCL2-fixed and fresh-frozen tissues performed equally well and was further used for validation of the quantitative MGMT methylation assay (limit of detection (LOD): 19.58 pg), using each individual's undigested sample DNA for calibration. MGMT methylation analysis in non-neoplastic brain identified a background methylation of 0.10 ± 0.11%, which we used to define a cut-off of 0.32% for patient stratification. Of the GBM patients, 9 were MGMT methylation-positive (range: 0.56 - 91.95%) and 11 tested negative. MALDI-TOF measurements resulted in a concordant classification of 94% of GBM samples in comparison to qPCR. The presented methodology allows quantitative MGMT promoter methylation analyses. An amount of 200 ng DNA is sufficient for triplicate analyses including control reactions and individual calibration curves, thus excluding any DNA quality-derived bias. The combination of RCL2 fixation and quantitative methylation analysis improves routine pathological examination when histological and molecular analyses on limited amounts of tumor sample are necessary for patient stratification.
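
    As background to the assay principle, MSRE-coupled qPCR quantification conventionally compares the digested sample against its undigested calibrator: digestion destroys unmethylated templates, so the surviving fraction is 2 to the power of minus delta-Ct. The sketch below shows this generic arithmetic together with the 0.32% stratification cut-off from the abstract; it is not the study's exact calibration-curve procedure.

```python
def percent_methylation(ct_digested, ct_undigested):
    """Methylated fraction (%) from an MSRE-coupled qPCR measurement pair.

    Generic MSRE-qPCR arithmetic: the methylated (digestion-resistant)
    fraction relative to the undigested calibrator is
    2 ** -(Ct_digested - Ct_undigested).
    """
    return 100.0 * 2.0 ** -(ct_digested - ct_undigested)

def classify(pct, cutoff=0.32):
    """Stratify a sample against the background-derived 0.32% cut-off."""
    return "methylation-positive" if pct > cutoff else "methylation-negative"

pct = percent_methylation(ct_digested=33.0, ct_undigested=28.0)
print(round(pct, 2), classify(pct))  # 3.12 methylation-positive
```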

  15. Direct Detection of Pharmaceuticals and Personal Care Products from Aqueous Samples with Thermally-Assisted Desorption Electrospray Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Campbell, Ian S.; Ton, Alain T.; Mulligan, Christopher C.

    2011-07-01

    An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.

  16. Point-by-point compositional analysis for atom probe tomography.

    PubMed

    Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P

    2014-01-01

    This alternative approach to data processing, for analyses that traditionally employed grid-based counting methods, is necessary because it removes a user-imposed coordinate system that not only limits an analysis but may also introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest-neighbour identification, improving the measurements and the statistics obtained and allowing quantitative analysis of smaller datasets and of datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; 3D data visualisation of block composition and nearest-neighbour anisotropy; and using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
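
    The core idea, block compositions from each atom's k nearest neighbours plus a z-statistic against a random solid solution, can be sketched in a few lines. This is an illustrative brute-force toy (atom tuples, function names, and the specific z-score form are assumptions), not the published implementation.

```python
import math

def knn_block_compositions(atoms, k):
    """Solute fraction of each detected atom's k-nearest-neighbour block.

    `atoms` is a list of (x, y, z, is_solute) tuples; neighbours are found
    by brute force, i.e. without any user-imposed grid. (Illustrative
    sketch of the coordinate-independent approach described above.)
    """
    comps = []
    for i, (x, y, z, _) in enumerate(atoms):
        dists = sorted(
            (math.dist((x, y, z), (a, b, c)), s)
            for j, (a, b, c, s) in enumerate(atoms) if j != i)
        block = [s for _, s in dists[:k]]
        comps.append(sum(block) / k)
    return comps

def z_statistic(comps, p, k):
    """z-score of the mean block composition against a random (binomial)
    solid solution with overall solute fraction `p`."""
    mean = sum(comps) / len(comps)
    se = math.sqrt(p * (1 - p) / k) / math.sqrt(len(comps))
    return (mean - p) / se

atoms = [(0.0, 0.0, 0.0, 1), (1.0, 0.0, 0.0, 0),
         (2.0, 0.0, 0.0, 1), (3.0, 0.0, 0.0, 0)]
print(knn_block_compositions(atoms, k=2))  # [0.5, 1.0, 0.0, 0.5]
```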

  17. Direct detection of pharmaceuticals and personal care products from aqueous samples with thermally-assisted desorption electrospray ionization mass spectrometry.

    PubMed

    Campbell, Ian S; Ton, Alain T; Mulligan, Christopher C

    2011-07-01

    An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.

  18. Ecological Change, Sliding Baselines and the Importance of Historical Data: Lessons from Combining Observational and Quantitative Data on a Temperate Reef Over 70 Years

    PubMed Central

    Gatti, Giulia; Bianchi, Carlo Nike; Parravicini, Valeriano; Rovere, Alessio; Peirano, Andrea; Montefalcone, Monica; Massa, Francesco; Morri, Carla

    2015-01-01

    Understanding the effects of environmental change on ecosystems requires the identification of baselines that may act as reference conditions. However, the continuous change of these references challenges our ability to define the true natural status of ecosystems. The so-called sliding baseline syndrome can be overcome through the analysis of quantitative time series, which are, however, extremely rare. Here we show how combining historical quantitative data with descriptive ‘naturalistic’ information arranged in a chronological chain allows highlighting long-term trends and can be used to inform present conservation schemes. We analysed the long-term change of a coralligenous reef, a marine habitat endemic to the Mediterranean Sea. The coralligenous assemblages of Mesco Reef (Ligurian Sea, NW Mediterranean) have been studied, although discontinuously, since 1937, thus making available both detailed descriptive information and scanty quantitative data: while the former was useful to understand the natural history of the ecosystem, the analysis of the latter was of paramount importance to provide a formal measure of change over time. Epibenthic assemblages remained comparatively stable until the 1990s, when species replacement, invasion by alien algae, and biotic homogenisation occurred within a few years, leading to a new and completely different ecosystem state. The shift experienced by the coralligenous assemblages of Mesco Reef was probably induced by a combination of seawater warming and local human pressures, the latter mainly resulting in increased water turbidity; in turn, cumulative stress may have favoured the establishment of alien species. This study showed that the combined analysis of quantitative and descriptive historical data provides valuable knowledge for understanding ecosystem trends over time and helps identify baselines for ecological management. PMID:25714413

  19. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
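
    Two of the simpler IQ quantities named above, CNR and SNR of a simulated lesion, reduce to ROI statistics. The sketch below uses one common definition of each (lesion contrast, or lesion signal, over background noise); this is generic image-quality arithmetic, not the framework's actual code, and ICRU Report 87 discusses several related variants.

```python
import statistics

def cnr(roi_lesion, roi_background):
    """Contrast-to-noise ratio: |mean(lesion) - mean(background)| / sd(background)."""
    contrast = abs(statistics.fmean(roi_lesion) - statistics.fmean(roi_background))
    return contrast / statistics.stdev(roi_background)

def snr(roi_lesion, roi_background):
    """Signal-to-noise ratio: mean lesion signal over background noise."""
    return statistics.fmean(roi_lesion) / statistics.stdev(roi_background)

lesion = [108.0, 112.0, 110.0]            # CT numbers inside a simulated lesion ROI
background = [98.0, 102.0, 100.0, 100.0]  # CT numbers in a background ROI
print(round(cnr(lesion, background), 2))  # 6.12
```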

  20. Development of an electron paramagnetic resonance methodology for studying the photo-generation of reactive species in semiconductor nano-particle assembled films

    NASA Astrophysics Data System (ADS)

    Twardoch, Marek; Messai, Youcef; Vileno, Bertrand; Hoarau, Yannick; Mekki, Djamel E.; Felix, Olivier; Turek, Philippe; Weiss, Jean; Decher, Gero; Martel, David

    2018-06-01

    An experimental approach involving electron paramagnetic resonance is proposed for studying photo-generated reactive species in semiconductor nano-particle-based films deposited on the internal wall of glass capillaries. This methodology is applied here to nano-TiO2 and allows a semi-quantitative analysis of the kinetic evolutions of radical production using a spin scavenger probe.

  1. [Analysis of women nutritional status during pregnancy--a survey].

    PubMed

    Selwet, Monika; Machura, Mariola; Sipiński, Adam; Kuna, Anna; Kazimierczak, Małgorzata

    2004-01-01

    Proper diet is one of the most important factors during pregnancy. General knowledge about proper nourishment during pregnancy allows women to avoid quantitative and qualitative dietary mistakes; health education in this respect is therefore very important. The aim of the study was to analyze nourishment during pregnancy, particularly comparing professionally active women with women who did not work during pregnancy.

  2. Determination of neutron flux distribution in an Am-Be irradiator using the MCNP.

    PubMed

    Shtejer-Diaz, K; Zamboni, C B; Zahn, G S; Zevallos-Chávez, J Y

    2003-10-01

    A neutron irradiator has been assembled at IPEN facilities to perform qualitative-quantitative analysis of many materials using thermal and fast neutrons outside the nuclear reactor premises. To establish the prototype specifications, the neutron flux distribution and the absorbed dose rates were calculated using the MCNP computer code. These theoretical predictions then allow one to discuss the optimum irradiator design and its performance.

  3. A new method for qualitative simulation of water resources systems: 1. Theory

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.

    1987-11-01

    A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules avoiding fuzzy theoretic concepts. To make the transition from qualitative to quantitative modes, logical rules are used as well. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.
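
    Rule-based simulation over linguistic variables, as described for SLIN, can be illustrated with a toy state-update loop. Everything below (variable names, rule encoding, single-step semantics) is a hypothetical simplification for illustration; SLIN's actual formalism is richer.

```python
def step(state, rules):
    """One qualitative simulation step over linguistic variables.

    `state` maps linguistic variables to linguistic values; `rules` is a
    list of (condition, updates) pairs, where `condition` tests the state
    and `updates` assigns new linguistic values when it fires.
    (Toy sketch in the spirit of SLIN, not its implementation.)
    """
    new = dict(state)
    for cond, updates in rules:
        if cond(state):
            new.update(updates)
    return new

rules = [
    (lambda s: s["rainfall"] == "high", {"reservoir": "rising"}),
    (lambda s: s["rainfall"] == "low", {"reservoir": "falling"}),
]
print(step({"rainfall": "high", "reservoir": "stable"}, rules))  # reservoir becomes "rising"
```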

  4. Qualis-SIS: automated standard curve generation and quality assessment for multiplexed targeted quantitative proteomic experiments with labeled standards.

    PubMed

    Mohammed, Yassene; Percy, Andrew J; Chambers, Andrew G; Borchers, Christoph H

    2015-02-06

    Multiplexed targeted quantitative proteomics typically utilizes multiple reaction monitoring and allows the optimized quantification of a large number of proteins. One challenge, however, is the large amount of data that needs to be reviewed, analyzed, and interpreted. Different vendors provide software for their instruments, which determine the recorded responses of the heavy and endogenous peptides and perform the response-curve integration. Bringing multiplexed data together and generating standard curves is often an off-line step accomplished, for example, with spreadsheet software. This can be laborious, as it requires determining the concentration levels that meet the required accuracy and precision criteria in an iterative process. We present here a computer program, Qualis-SIS, that generates standard curves from multiplexed MRM experiments and determines analyte concentrations in biological samples. Multiple level-removal algorithms and acceptance criteria for concentration levels are implemented. When used to apply the standard curve to new samples, the software flags each measurement according to its quality. From the user's perspective, the data processing is instantaneous due to the reactivity paradigm used, and the user can download the results of the stepwise calculations for further processing, if necessary. This allows for more consistent data analysis and can dramatically accelerate the downstream data analysis.
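
    The level-acceptance step described above can be sketched as a filter over calibration levels. The criteria below (replicate CV and deviation of each level's response factor from the median response factor) are hypothetical simplifications chosen for illustration, not Qualis-SIS's actual algorithms.

```python
from statistics import fmean, median, stdev

def accept_levels(levels, cv_max=0.2, acc_tol=0.2):
    """Select standard-curve levels meeting precision and accuracy criteria.

    `levels` maps nominal concentration -> replicate responses. A level is
    kept when its replicate CV is <= cv_max (precision) and its response
    factor (mean response / nominal) lies within acc_tol of the median
    response factor across levels (accuracy). (Hypothetical simplified
    criteria, sketched after the abstract.)
    """
    rf = {c: fmean(r) / c for c, r in levels.items()}
    mid = median(rf.values())
    return sorted(
        c for c, r in levels.items()
        if (len(r) < 2 or stdev(r) / fmean(r) <= cv_max)
        and abs(rf[c] / mid - 1) <= acc_tol)

cal = {1.0: [2.0, 2.0], 2.0: [4.0, 4.1], 4.0: [8.0, 8.1], 8.0: [20.0, 20.2]}
print(accept_levels(cal))  # top level fails accuracy -> [1.0, 2.0, 4.0]
```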

  5. 3D OCT imaging in clinical settings: toward quantitative measurements of retinal structures

    NASA Astrophysics Data System (ADS)

    Zawadzki, Robert J.; Fuller, Alfred R.; Zhao, Mingtao; Wiley, David F.; Choi, Stacey S.; Bower, Bradley A.; Hamann, Bernd; Izatt, Joseph A.; Werner, John S.

    2006-02-01

    The acquisition speed of current FD-OCT (Fourier Domain - Optical Coherence Tomography) instruments allows rapid screening of three-dimensional (3D) volumes of human retinas in clinical settings. To take advantage of this ability requires software used by physicians to be capable of displaying and accessing volumetric data as well as supporting post processing in order to access important quantitative information such as thickness maps and segmented volumes. We describe our clinical FD-OCT system used to acquire 3D data from the human retina over the macula and optic nerve head. B-scans are registered to remove motion artifacts and post-processed with customized 3D visualization and analysis software. Our analysis software includes standard 3D visualization techniques along with a machine learning support vector machine (SVM) algorithm that allows a user to semi-automatically segment different retinal structures and layers. Our program makes possible measurements of the retinal layer thickness as well as volumes of structures of interest, despite the presence of noise and structural deformations associated with retinal pathology. Our software has been tested successfully in clinical settings for its efficacy in assessing 3D retinal structures in healthy as well as diseased cases. Our tool facilitates diagnosis and treatment monitoring of retinal diseases.
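
    The thickness-map post-processing mentioned above amounts to differencing two segmented boundary surfaces and scaling by the axial pixel size. The sketch below assumes boundaries are given as pixel-row indices per A-scan (the names and data layout are illustrative, and the segmentation itself, e.g. the SVM step, is out of scope here).

```python
def thickness_map(top, bottom, axial_res_um):
    """Per-A-scan layer thickness (µm) from two segmented boundary surfaces.

    `top` and `bottom` hold pixel-row indices of a layer's upper and lower
    boundaries for each (B-scan, A-scan) position, e.g. as produced by a
    prior segmentation step. (Illustrative post-processing sketch.)
    """
    return [[abs(b - t) * axial_res_um for t, b in zip(row_t, row_b)]
            for row_t, row_b in zip(top, bottom)]

top = [[10, 12], [11, 11]]     # boundary row indices per A-scan
bottom = [[60, 64], [61, 63]]
print(thickness_map(top, bottom, axial_res_um=3.0))
```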

  6. [Prediction of the molecular response to perturbations from single cell measurements].

    PubMed

    Remacle, Françoise; Levine, Raphael D

    2014-12-01

    The response of protein signaling networks to perturbations is analysed from single cell measurements. This experimental approach allows characterization of the fluctuations in protein expression levels from cell to cell. The analysis is based on an information-theoretic approach grounded in thermodynamics, leading to a quantitative version of Le Chatelier's principle that allows prediction of the molecular response. Two systems are investigated: human macrophages subjected to lipopolysaccharide challenge, analogous to the immune response against Gram-negative bacteria, and the response of the proteins involved in the mTOR signaling network of GBM cancer cells to changes in partial oxygen pressure. © 2014 médecine/sciences – Inserm.

  7. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented using the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  8. FRET excited ratiometric oxygen sensing in living tissue

    PubMed Central

    Ingram, Justin M.; Zhang, Chunfeng; Xu, Jian; Schiff, Steven J.

    2013-01-01

    Dynamic analysis of oxygen (O2) has been limited by the lack of a real-time, quantitative, and biocompatible sensor. To address these demands, we designed a ratiometric optode matrix consisting of the phosphorescence-quenching dye platinum (II) octaethylporphine ketone (PtOEPK) and nanocrystal quantum dots (NQDs), which, when embedded within an inert polymer matrix, allows long-term pre-designed excitation through fluorescence resonance energy transfer (FRET). Depositing this matrix on various glass substrates allowed the development of a series of optical sensors able to measure interstitial oxygen concentration [O2] with several-hundred-millisecond temporal resolution in varying biological microdomains of active brain tissue. PMID:23333398

  9. Conservative and dissipative force imaging of switchable rotaxanes with frequency-modulation atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Farrell, Alan A.; Fukuma, Takeshi; Uchihashi, Takayuki; Kay, Euan R.; Bottari, Giovanni; Leigh, David A.; Yamada, Hirofumi; Jarvis, Suzanne P.

    2005-09-01

    We compare constant amplitude frequency modulation atomic force microscopy (FM-AFM) in ambient conditions to ultrahigh vacuum (UHV) experiments by analysis of thin films of rotaxane molecules. Working in ambient conditions is important for the development of real-world molecular devices. We show that the FM-AFM technique allows quantitative measurement of conservative and dissipative forces without instabilities caused by any native water layer. Molecular resolution is achieved despite the low Q-factor in air. Furthermore, contrast in the energy dissipation is observed even at the molecular level. This should allow investigations into stimuli-induced sub-molecular motion of organic films.

  10. Ex-vivo imaging of excised tissue using vital dyes and confocal microscopy

    PubMed Central

    Johnson, Simon; Rabinovitch, Peter

    2012-01-01

    Vital dyes routinely used for staining cultured cells can also be used to stain and image live tissue slices ex-vivo. Staining tissue with vital dyes allows researchers to collect structural and functional data simultaneously and can be used for qualitative or quantitative fluorescent image collection. The protocols presented here are useful for structural and functional analysis of viable properties of cells in intact tissue slices, allowing for the collection of data in a structurally relevant environment. With these protocols, vital dyes can be applied as a research tool to disease processes and properties of tissue not amenable to cell culture based studies. PMID:22752953

  11. In situ detection of porosity initiation during aluminum thin film anodizing

    NASA Astrophysics Data System (ADS)

    Van Overmeere, Quentin; Nysten, Bernard; Proost, Joris

    2009-02-01

    High-resolution curvature measurements have been performed in situ during aluminum thin film anodizing in sulfuric acid. A well-defined transition in the rate of internal stress-induced curvature change is shown to allow for the accurate, real-time detection of porosity initiation. The validity of this in situ diagnostic tool was confirmed by a quantitative analysis of the spectral density distributions of the anodized surfaces. These were obtained by analyzing ex situ atomic force microscopy images of surfaces anodized for different times, and allowed us to correlate the in situ detected transition in the rate of curvature change with the appearance of porosity.

  12. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells can complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). These phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in the G1 and G2 phases and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
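
The record above classifies cell-cycle phases from biophysical phenotype vectors with MANOVA discriminant analysis. As a minimal stand-in (not the authors' pipeline), the sketch below uses a two-class Fisher linear discriminant on synthetic "G1"/"G2" phenotype vectors; all numbers and feature names are invented for illustration.

```python
import numpy as np

def fisher_lda(X1, X2):
    """Two-class Fisher linear discriminant: returns a unit projection vector."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter matrix (pooled, unnormalized)
    Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
    w = np.linalg.solve(Sw, m2 - m1)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
# Synthetic phenotype vectors (e.g. cell size, dry mass, opacity) -- illustrative
g1 = rng.normal([10.0, 50.0, 1.0], 1.0, size=(200, 3))
g2 = rng.normal([14.0, 90.0, 1.4], 1.0, size=(200, 3))

w = fisher_lda(g1, g2)
# Classify by thresholding the 1-D projection at the midpoint of the class means
thresh = 0.5 * (g1 @ w).mean() + 0.5 * (g2 @ w).mean()
acc = 0.5 * ((g1 @ w < thresh).mean() + (g2 @ w > thresh).mean())
```

With well-separated synthetic classes the projection separates the phases almost perfectly; real phenotype distributions would of course overlap more.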

  13. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data of signaling events with high spatio-temporal resolution. A novel technique which allows us to obtain such data is needed for systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears as a highly quantitative and versatile technique, which can be a convenient replacement for the most conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  14. Quantitative microbial faecal source tracking with sampling guided by hydrological catchment dynamics.

    PubMed

    Reischer, G H; Haider, J M; Sommer, R; Stadler, H; Keiblinger, K M; Hornek, R; Zerobin, W; Mach, R L; Farnleitner, A H

    2008-10-01

    The impairment of water quality by faecal pollution is a global public health concern. Microbial source tracking methods help to identify faecal sources but the few recent quantitative microbial source tracking applications disregarded catchment hydrology and pollution dynamics. This quantitative microbial source tracking study, conducted in a large karstic spring catchment potentially influenced by humans and ruminant animals, was based on a tiered sampling approach: a 31-month water quality monitoring (Monitoring) covering seasonal hydrological dynamics and an investigation of flood events (Events) as periods of the strongest pollution. The detection of a ruminant-specific and a human-specific faecal Bacteroidetes marker by quantitative real-time PCR was complemented by standard microbiological and on-line hydrological parameters. Both quantitative microbial source tracking markers were detected in spring water during Monitoring and Events, with preponderance of the ruminant-specific marker. Applying multiparametric analysis of all data allowed linking the ruminant-specific marker to general faecal pollution indicators, especially during Events. Up to 80% of the variation of faecal indicator levels during Events could be explained by ruminant-specific marker levels proving the dominance of ruminant faecal sources in the catchment. Furthermore, soil was ruled out as a source of quantitative microbial source tracking markers. This study demonstrates the applicability of quantitative microbial source tracking methods and highlights the prerequisite of considering hydrological catchment dynamics in source tracking study design.

  15. An eQTL Analysis of Partial Resistance to Puccinia hordei in Barley

    PubMed Central

    Chen, Xinwei; Hackett, Christine A.; Niks, Rients E.; Hedley, Peter E.; Booth, Clare; Druka, Arnis; Marcel, Thierry C.; Vels, Anton; Bayer, Micha; Milne, Iain; Morris, Jenny; Ramsay, Luke; Marshall, David; Cardle, Linda; Waugh, Robbie

    2010-01-01

    Background Genetic resistance to barley leaf rust caused by Puccinia hordei involves both R genes and quantitative trait loci. The R genes provide higher but less durable resistance than the quantitative trait loci. Consequently, exploring quantitative or partial resistance has become a favorable alternative for controlling disease. Four quantitative trait loci for partial resistance to leaf rust have been identified in the doubled haploid Steptoe (St)/Morex (Mx) mapping population. Further investigations are required to study the molecular mechanisms underpinning partial resistance and ultimately identify the causal genes. Methodology/Principal Findings We explored partial resistance to barley leaf rust using a genetical genomics approach. We recorded RNA transcript abundance corresponding to each probe on a 15K Agilent custom barley microarray in seedlings from St and Mx and 144 doubled haploid lines of the St/Mx population. A total of 1154 and 1037 genes were identified as P. hordei-responsive in St and Mx and as differentially expressed between P. hordei-infected St and Mx, respectively. Normalized ratios from 72 distant-pair hybridisations were used to map the genetic determinants of variation in transcript abundance by expression quantitative trait locus (eQTL) mapping, generating 15685 eQTL from 9557 genes. Correlation analysis identified 128 genes that were correlated with resistance, of which 89 had eQTL co-locating with the phenotypic quantitative trait loci (pQTL). Transcript abundance in the parents and conservation of synteny with rice allowed us to prioritise six genes as candidates for Rphq11, the pQTL of largest effect, and to highlight one, a phospholipid hydroperoxide glutathione peroxidase (HvPHGPx), for detailed analysis.
Conclusions/Significance The eQTL approach yielded information that led to the identification of strong candidate genes underlying pQTL for resistance to leaf rust in barley, as well as information on the general pathogen response pathway. The dataset will facilitate a systems appraisal of this host-pathogen interaction and, potentially, of other traits measured in this population. PMID:20066049

  16. A joint analysis of the Drake equation and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Prantzos, Nikos

    2013-07-01

    I propose a unified framework for a joint analysis of the Drake equation and the Fermi paradox, which enables a simultaneous, quantitative study of both of them. The analysis is based on a simplified form of the Drake equation and on a fairly simple scheme for the colonization of the Milky Way. It appears that for sufficiently long-lived civilizations, colonization of the Galaxy is the only reasonable option to gain knowledge about other life forms. This argument allows one to define a region in the parameter space of the Drake equation, where the Fermi paradox definitely holds (`Strong Fermi paradox').
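
The analysis above builds on a simplified form of the Drake equation, which in its classic form is a product of factors. A one-line sketch follows; the parameter values are purely illustrative and are not taken from the paper.

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Classic Drake product: expected number of communicating civilizations.
    R_star: star formation rate; f_p: fraction with planets; n_e: habitable
    planets per system; f_l, f_i, f_c: fractions developing life, intelligence,
    and communication; L: civilization lifetime in years."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative (not endorsed) parameter choices
N = drake(R_star=1.0, f_p=0.5, n_e=2.0, f_l=0.5, f_i=0.1, f_c=0.1, L=1000.0)
```

Because N scales linearly with L, the civilization lifetime dominates the result, which is precisely why long-lived civilizations are the interesting regime for the Fermi paradox.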

  17. Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection

    NASA Astrophysics Data System (ADS)

    Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok

    2011-11-01

    Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection is formed when blood infiltrates the layers of the vascular wall and a new artificial channel, the false lumen, is created. The expansion of the blood vessel due to the weakened wall enhances the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometry and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture in making decisions regarding surgical intervention.

  18. Metabolomic Fingerprinting of Romaneschi Globe Artichokes by NMR Spectroscopy and Multivariate Data Analysis.

    PubMed

    de Falco, Bruna; Incerti, Guido; Pepe, Rosa; Amato, Mariana; Lanzotti, Virginia

    2016-09-01

    Globe artichoke (Cynara cardunculus L. var. scolymus L. Fiori) and cardoon (Cynara cardunculus L. var. altilis DC) are sources of nutraceuticals and bioactive compounds. We applied an NMR metabolomic fingerprinting approach to Cynara cardunculus heads to obtain simultaneous identification and quantitation of the major classes of organic compounds. The edible parts of 14 globe artichoke populations, belonging to the Romaneschi varietal group, were extracted to obtain apolar and polar organic extracts. The analysis was also extended to one species of cultivated cardoon for comparison. The (1) H-NMR spectra of the extracts allowed simultaneous identification of the bioactive metabolites, whose quantitation was obtained by spectral integration followed by principal component analysis (PCA). Apolar organic extracts were mainly based on highly unsaturated long-chain lipids. Polar organic extracts contained organic acids, amino acids, sugars (mainly inulin), caffeoyl derivatives (mainly cynarin), flavonoids, and terpenes. The level of nutraceuticals was found to be highest in the Italian landraces Bianco di Pertosa zia E and Natalina, while cardoon showed the lowest content of all metabolites, thus confirming the genetic distance between artichokes and cardoon. The metabolomic approach coupling NMR spectroscopy with multivariate data analysis allowed a detailed metabolite profile of the artichoke and cardoon varieties to be obtained. Relevant differences in the relative content of the metabolites were observed for the species analysed. This work is the first application of (1) H-NMR with multivariate statistics to provide a metabolomic fingerprinting of Cynara scolymus. Copyright © 2016 John Wiley & Sons, Ltd.
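
Spectral integration followed by PCA, as used in the record above, amounts to an SVD of the mean-centered table of integrals (samples x signals). A minimal sketch on synthetic stand-in data follows; the matrix below is not real NMR data, and the "dominant gradient" in the first column is an invented analogue of a varying metabolite level.

```python
import numpy as np

def pca(X, k=2):
    """PCA via SVD of the mean-centered data matrix X (samples x variables)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]          # sample coordinates on the first k PCs
    explained = s**2 / (s**2).sum()    # variance fraction per component
    return scores, explained

rng = np.random.default_rng(1)
# Synthetic "spectral integral" table: 15 samples x 6 metabolite signals,
# with one dominant axis of variation injected into the first signal
X = rng.normal(size=(15, 6))
X[:, 0] += np.linspace(0, 10, 15)

scores, explained = pca(X)
```

Plotting `scores` would give the familiar PCA scatter in which varieties with similar metabolite profiles cluster together.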

  19. Client - server programs analysis in the EPOCA environment

    NASA Astrophysics Data System (ADS)

    Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano

    1996-09-01

    Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer first has to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.

  20. Quantitative analysis of cardiovascular MR images.

    PubMed

    van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H

    1997-06-01

    The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.

  1. Quantitative Large-Scale Three-Dimensional Imaging of Human Kidney Biopsies: A Bridge to Precision Medicine in Kidney Disease.

    PubMed

    Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M

    2018-06-05

    Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease. © 2018 S. Karger AG, Basel.

  2. Modified HS-SPME for determination of quantitative relations between low-molecular oxygen compounds in various matrices.

    PubMed

    Dawidowicz, Andrzej L; Szewczyk, Joanna; Dybowski, Michal P

    2016-09-07

    Quantitative relations between the individual constituents of a liquid sample, similar to those established by its direct injection, can be obtained by applying a polydimethylsiloxane (PDMS) fiber in a headspace solid-phase microextraction (HS-SPME) system containing the examined sample suspended in methyl silica oil. This paper proves that the analogous system, composed of a sample suspension/emulsion in polyethylene glycol (PEG) and a Carbowax fiber, allows quantitative relations between components of the mixture to be obtained that are similar to those established by its direct analysis, but only for polar constituents. This is demonstrated for the essential oil (EO) components of savory, sage, mint and thyme, and for an artificial liquid mixture of polar constituents. The observed differences in quantitative relations between polar constituents estimated by the two procedures are insignificant (Fexp < Fcrit). The presented results indicate that the wider applicability of a system composed of a sample suspended in an oil of the same physicochemical character as the SPME fiber coating strongly depends on the character of the interactions between the analytes and the suspending liquid and between the analytes and the fiber coating. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.

    1989-07-01

    To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) to be used in quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine. SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients, respectively, with SDAT and all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake.
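
The two ratio cutoffs reported above translate directly into a simple screening rule. A minimal sketch follows; the uptake values passed in are hypothetical, not patient data.

```python
def sdat_flags(parietal, cerebellar, mean_cortical):
    """Apply the two uptake-ratio cutoffs reported for probable SDAT:
    parietal/cerebellar < 0.60 and parietal/mean-cortical < 0.90."""
    return {
        "parietal/cerebellar < 0.60": parietal / cerebellar < 0.60,
        "parietal/cortical < 0.90": parietal / mean_cortical < 0.90,
    }

# Hypothetical uptake values in arbitrary units
suspect = sdat_flags(parietal=0.55, cerebellar=1.0, mean_cortical=0.65)
healthy = sdat_flags(parietal=0.90, cerebellar=1.0, mean_cortical=0.95)
```

In the study each cutoff was evaluated separately (correctly categorizing nine and eight of ten SDAT patients, respectively), so the two flags are reported independently here rather than combined.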

  4. Development of a multi-variate calibration approach for quantitative analysis of oxidation resistant Mo-Si-B coatings using laser ablation inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Cakara, Anja; Bonta, Maximilian; Riedl, Helmut; Mayrhofer, Paul H.; Limbeck, Andreas

    2016-06-01

    Nowadays, alloys of Mo-Si-B are employed for the production of oxidation-protection coatings in ultrahigh-temperature environments. The properties of the material, mainly its oxidation resistance, are strongly influenced by the Si to B ratio; thus, reliable analytical methods are needed to ensure exact determination of the material composition for the respective applications. For the analysis of such coatings, laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been reported as a versatile method with no specific requirements on the nature of the sample. However, matrix effects represent the main limitation of laser-based solid sampling techniques, and the use of matrix-matched standards is usually required for quantitative analysis. In this work, LA-ICP-MS analysis of samples with known composition and varying Mo, Si and B content was carried out. No linear correlation could be found between the known analyte concentrations and the derived LA-ICP-MS signal intensities. In order to allow quantitative analysis independent of matrix effects, a multiple linear regression model was developed. Besides the three target analytes, the signals of possible argides (40Ar36Ar and 98Mo40Ar) as well as detected impurities of the Mo-Si-B coatings (108Pd) were also considered. The applicability of the model to unknown samples was confirmed using external validation. Relative deviations between 5 and 10% from the values determined using conventional liquid analysis after sample digestion were observed for the main components Mo and Si.
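
A multivariate calibration of the kind described, regressing concentration jointly on the analyte, argide, and impurity channels, can be sketched with ordinary least squares. Everything below is synthetic: the five "channels", the coefficients, and the standards are invented to illustrate the regression step, not to reproduce the authors' model.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic calibration set: 12 standards x 5 signal channels
# (stand-ins for e.g. Si, B, Mo intensities plus argide/impurity signals)
signals = rng.uniform(0.1, 1.0, size=(12, 5))
true_coef = np.array([3.0, 1.5, -0.5, 0.8, 0.2])   # invented "true" model
conc = signals @ true_coef + rng.normal(0, 0.01, size=12)

# Multiple linear regression by ordinary least squares
coef, *_ = np.linalg.lstsq(signals, conc, rcond=None)

# Predict an "unknown" sample from its measured channels
unknown = rng.uniform(0.1, 1.0, size=5)
pred = float(unknown @ coef)
```

Including interference channels as regressors is what lets the model absorb matrix effects that break a single-channel calibration curve.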

  5. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969
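
The first study above compares hierarchical clustering, K-means, and latent class analysis on binary coded-interview data. As a minimal sketch of just one of these, the following runs plain Lloyd's K-means on synthetic 0/1 code profiles; the profiles, group sizes, and deterministic initialization are all assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def kmeans_binary(X, k=2, iters=25):
    """Plain Lloyd's k-means; works directly on 0/1 code-profile matrices.
    For determinism this sketch seeds centers from the first and last rows."""
    centers = X[[0, len(X) - 1]].astype(float) if k == 2 else X[:k].astype(float)
    for _ in range(iters):
        # Squared Euclidean distance of every profile to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

rng = np.random.default_rng(3)
# Two synthetic profiles over 8 binary codes, 25 "participants" each:
# group A tends to endorse the first four codes, group B the last four.
a = (rng.random((25, 8)) < np.array([0.9] * 4 + [0.1] * 4)).astype(int)
b = (rng.random((25, 8)) < np.array([0.1] * 4 + [0.9] * 4)).astype(int)
X = np.vstack([a, b])

labels = kmeans_binary(X, k=2)
```

With samples as small as 50, as in the simulations above, the clusters are still recovered cleanly when the code profiles are well separated.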

  6. Airborne particulate matter (PM) filter analysis and modeling by total reflection X-ray fluorescence (TXRF) and X-ray standing wave (XSW).

    PubMed

    Borgese, L; Salmistraro, M; Gianoncelli, A; Zacco, A; Lucchini, R; Zimmerman, N; Pisani, L; Siviero, G; Depero, L E; Bontempi, E

    2012-01-30

    This work is presented as an improvement of a recently introduced method for airborne particulate matter (PM) filter analysis [1]. X-ray standing wave (XSW) and total reflection X-ray fluorescence (TXRF) measurements were performed with new dedicated laboratory instrumentation. The main advantage of performing both XSW and TXRF is the possibility to distinguish the nature of the sample: whether it is a dry residue of a small droplet, a thin-film-like sample, or a bulk sample. Another advantage is the possibility to select the angle of total reflection for the TXRF measurements. Finally, the possibility to switch the X-ray source (with a change in X-ray anode, for example from Mo to Cu) allows lighter and heavier elements to be measured with more accuracy. The aim of the present study is to lay the theoretical foundation of the newly proposed method for quantitative analysis of airborne PM filters, improving the accuracy and efficiency of quantification by means of an external standard. The theoretical model presented and discussed demonstrates that airborne PM filters can be considered as thin layers. A set of reference samples was prepared in the laboratory and used to obtain a calibration curve. Our results demonstrate that the proposed method for quantitative analysis of air PM filters is affordable and reliable, without the need to digest filters to obtain quantitative chemical analysis, and that the use of XSW improves the accuracy of TXRF analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
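
External-standard quantification of the kind described reduces to fitting a calibration line from reference samples and inverting it for unknowns. The sketch below uses hypothetical count/mass pairs; the linear thin-layer response is exactly the assumption the paper's theoretical model argues for.

```python
import numpy as np

# Hypothetical calibration set: deposited mass [ng] vs. fluorescence counts
mass_ng = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
counts = np.array([12.0, 260.0, 515.0, 1010.0, 2020.0])

# Least-squares line: counts = slope * mass + intercept
slope, intercept = np.polyfit(mass_ng, counts, 1)

def quantify(sample_counts):
    """Invert the calibration line to estimate the mass on the filter."""
    return (sample_counts - intercept) / slope

est_ng = quantify(760.0)
```

The nonzero intercept plays the role of a blank/background level; in practice it would be estimated from an unloaded filter.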

  7. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.

  8. High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Human Plasma.

    PubMed

    Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan

    2016-07-01

    We present a high-throughput, nontargeted lipidomics approach using liquid chromatography coupled to high-resolution mass spectrometry for quantitative analysis of nonesterified fatty acids. We applied this method to screen a wide range of fatty acids from medium-chain to very long-chain (8 to 24 carbon atoms) in human plasma samples. The method enables us to chromatographically separate branched-chain species from their straight-chain isomers as well as separate biologically important ω-3 and ω-6 polyunsaturated fatty acids. We used 51 fatty acid species to demonstrate the quantitative capability of this method with quantification limits in the nanomolar range; however, this method is not limited only to these fatty acid species. High-throughput sample preparation was developed and carried out on a robotic platform that allows extraction of 96 samples simultaneously within 3 h. This high-throughput platform was used to assess the influence of different types of human plasma collection and preparation on the nonesterified fatty acid profile of healthy donors. Use of the anticoagulants EDTA and heparin has been compared with simple clotting, and only limited changes have been detected in most nonesterified fatty acid concentrations.

  9. Qualitative and quantitative processing of side-scan sonar data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dwan, F.S.; Anderson, A.L.; Hilde, T.W.C.

    1990-06-01

    Modern side-scan sonar systems allow vast areas of seafloor to be rapidly imaged and quantitatively mapped in detail. The application of remote sensing image processing techniques can be used to correct for various distortions inherent in raw sonography. Corrections are possible for water column, slant-range, aspect ratio, speckle and striping noise, multiple returns, power drop-off, and for georeferencing. The final products reveal seafloor features and patterns that are geometrically correct, georeferenced, and have improved signal/noise ratio. These products can be merged with other georeferenced data bases for further database management and information extraction. In order to compare data collected by different systems from a common area and to ground truth measurements and geoacoustic models, quantitative correction must be made for calibrated sonar system and bathymetry effects. Such data inversion must account for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area, and grazing angle effects. Seafloor classification can then be performed on the calculated back-scattering strength using Lambert's Law and regression analysis. Examples are given using both approaches: image analysis and inversion of data based on the sonar equation.

  10. Major advances in testing of dairy products: milk component and dairy product attribute testing.

    PubMed

    Barbano, D M; Lynch, J M

    2006-04-01

    Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.

  11. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    PubMed

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.
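The linear mixing model underlying this kind of unmixing can be sketched in a few lines. Everything below (lifetimes, abundances, time axis) is invented for illustration, and the fit assumes the end-member decays are known, whereas the cited work estimates both the end-members and their number blindly:

```python
import numpy as np

# Toy linear mixing model for time-domain m-FLIM: each pixel's decay is a
# nonnegative, sum-to-one combination of end-member decays. Lifetimes (ns)
# and abundances are invented for this sketch.
t = np.linspace(0, 10, 200)                                       # time axis (ns)
lifetimes = np.array([0.5, 2.0, 5.0])                             # hypothetical end-members
endmembers = np.exp(-np.outer(1.0 / lifetimes, t))                # 3 x 200 decay matrix

true_abund = np.array([0.2, 0.5, 0.3])                            # sums to one
pixel = true_abund @ endmembers                                   # noiseless mixed decay

# With known end-members, the abundances follow from ordinary least squares;
# the blind case additionally estimates `endmembers` from the data.
est, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
```

In the noiseless case the least-squares estimate recovers the abundances exactly; the hard part addressed by the paper is doing this without knowing the decays or even how many there are.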

  12. Simultaneous two-wavelength holographic interferometry in a superorbital expansion tube facility.

    PubMed

    McIntyre, T J; Wegener, M J; Bishop, A I; Rubinsztein-Dunlop, H

    1997-11-01

    A new variation of holographic interferometry has been utilized to perform simultaneous two-wavelength measurements, allowing quantitative analysis of the heavy particle and electron densities in a superorbital facility. An air test gas accelerated to 12 km/s was passed over a cylindrical model, simulating reentry conditions encountered by a space vehicle on a superorbital mission. Laser beams with two different wavelengths have been overlapped, passed through the test section, and simultaneously recorded on a single holographic plate. Reconstruction of the hologram generated two separate interferograms at different angles from which the quantitative measurements were made. With this technique, a peak electron concentration of (5.5 ± 0.5) × 10²³ m⁻³ was found behind a bow shock on a cylinder.

  13. Espresso coffee foam delays cooling of the liquid phase.

    PubMed

    Arii, Yasuhiro; Nishizawa, Kaho

    2017-04-01

    Espresso coffee foam, called crema, is known to be a marker of the quality of espresso coffee extraction. However, the role of foam in coffee temperature has not been quantitatively clarified. In this study, we used an automatic machine for espresso coffee extraction. We evaluated whether the foam prepared using the machine was suitable for foam analysis. After extraction, the percentage and consistency of the foam were measured using various techniques, and changes in the foam volume were tracked over time. Our extraction method, therefore, allowed consistent preparation of high-quality foam. We also quantitatively determined that the foam phase slowed cooling of the liquid phase after extraction. High-quality foam plays an important role in delaying the cooling of espresso coffee.

  14. Pentobarbital quantitation using EMIT serum barbiturate assay reagents: application to monitoring of high-dose pentobarbital therapy.

    PubMed

    Pape, B E; Cary, P L; Clay, L C; Godolphin, W

    1983-01-01

    Pentobarbital serum concentrations associated with a high-dose therapeutic regimen were determined using EMIT immunoassay reagents. Replicate analyses of serum controls resulted in a within-assay coefficient of variation of 5.0% and a between-assay coefficient of variation of 10%. Regression analysis of 44 serum samples analyzed by this technique (y) and a reference procedure (x) were y = 0.98x + 3.6 (r = 0.98; x = ultraviolet spectroscopy) and y = 1.04x + 2.4 (r = 0.96; x = high-performance liquid chromatography). Clinical evaluation of the results indicates the immunoassay is sufficiently sensitive and selective for pentobarbital to allow accurate quantitation within the therapeutic range associated with high-dose therapy.
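The method-comparison statistics reported in this abstract (regression slope and intercept against a reference method, correlation, and assay coefficient of variation) can be reproduced on hypothetical paired data. All concentrations below are invented for illustration:

```python
import numpy as np

# Hypothetical paired serum concentrations (mg/L):
# x = reference method (e.g., HPLC), y = immunoassay result.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y = np.array([13.2, 24.5, 34.1, 45.8, 54.9])

# Least-squares fit y = slope*x + intercept, as in the abstract's
# method-comparison regressions (e.g., y = 1.04x + 2.4).
slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]

# Between-assay coefficient of variation from replicate serum controls.
controls = np.array([38.5, 41.0, 36.9, 43.2, 40.1])
cv = 100.0 * controls.std(ddof=1) / controls.mean()
```

A slope near 1 with a small intercept and high r is what supports the abstract's conclusion that the immunoassay tracks the reference methods across the therapeutic range.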

  15. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    PubMed

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Protein detection by Simple Western™ analysis.

    PubMed

    Harris, Valerie M

    2015-01-01

    Protein Simple© has taken a well-known protein detection method, the western blot, and revolutionized it. The Simple Western™ system uses capillary electrophoresis to identify and quantitate a protein of interest. Protein Simple© provides multiple detection apparatuses (Wes, Sally Sue, or Peggy Sue) that are suggested to save scientists valuable time by allowing the researcher to prepare the protein sample, load it along with necessary antibodies and substrates, and walk away. Within 3-5 h, the protein will be separated by size or charge, immunodetection of the target protein will be accurately quantitated, and results will be immediately made available. Using the Peggy Sue instrument, one study recently examined changes in MAPK signaling proteins in the sex-determining stage of gonadal development. Here the methodology is described.

  17. A simultaneous screening and quantitative method for the multiresidue analysis of pesticides in spices using ultra-high performance liquid chromatography-high resolution (Orbitrap) mass spectrometry.

    PubMed

    Goon, Arnab; Khan, Zareen; Oulkar, Dasharath; Shinde, Raviraj; Gaikwad, Suresh; Banerjee, Kaushik

    2018-01-12

    A novel screening and quantitation method is reported for non-target multiresidue analysis of pesticides using ultra-HPLC-quadrupole-Orbitrap mass spectrometry in spice matrices, including black pepper, cardamom, chili, coriander, cumin, and turmeric. The method involved sequential full-scan (resolution = 70,000), and variable data independent acquisition (vDIA) with nine consecutive fragmentation events (resolution = 17,500). Samples were extracted by the QuEChERS method. The introduction of an SPE-based clean-up step through hydrophilic-lipophilic-balance (HLB) cartridges proved advantageous in minimizing the false negatives. For coriander, cumin, chili, and cardamom, the screening detection limit was largely at 2 ng/g, while it was 5 ng/g for black pepper, and turmeric. When the method was quantitatively validated for 199 pesticides, the limit of quantification (LOQ) was mostly at 10 ng/g (excluding black pepper, and turmeric with LOQ = 20 ng/g) with recoveries within 70-120%, and precision-RSDs <20%. Furthermore, the method allowed the identification of suspected non-target analytes through retrospective search of the accurate mass of the compound-specific precursor and product ions. Compared to LC-MS/MS, the quantitative performance of this Orbitrap-MS method had agreements in residue values between 78-100%. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Quantitative perceptual differences among over-the-counter vaginal products using a standardized methodology: implications for microbicide development☆

    PubMed Central

    Mahan, Ellen D.; Morrow, Kathleen M.; Hayes, John E.

    2015-01-01

    Background Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Study Design Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey’s honestly significant difference test. Results Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Conclusions Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. PMID:21757061

  19. Quantitative perceptual differences among over-the-counter vaginal products using a standardized methodology: implications for microbicide development.

    PubMed

    Mahan, Ellen D; Morrow, Kathleen M; Hayes, John E

    2011-08-01

    Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey's honestly significant difference test. Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. Copyright © 2011 Elsevier Inc. All rights reserved.
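The attribute-by-attribute statistical testing described above can be sketched with hypothetical panel ratings. Note that the study used two-way ANOVAs (the sketch below simplifies to a one-way ANOVA per attribute), and the product groups and ratings are invented:

```python
import numpy as np
from scipy.stats import f_oneway, tukey_hsd

# Hypothetical 0-100 "stickiness" ratings from 10 trained panelists for
# three products with different intended uses (all values invented).
rng = np.random.default_rng(0)
contraceptive = rng.normal(70, 5, 10)   # rated stickier in this sketch
moisturizer = rng.normal(50, 5, 10)
lubricant = rng.normal(48, 5, 10)

# ANOVA across products for this single attribute (the study ran a
# two-way ANOVA per attribute; one-way shown here for brevity).
f_stat, p_val = f_oneway(contraceptive, moisturizer, lubricant)

# Tukey's honestly significant difference test identifies which
# product pairs differ, as in the abstract.
hsd = tukey_hsd(contraceptive, moisturizer, lubricant)
```

The pairwise p-value matrix from `tukey_hsd` is what supports statements like "product A was significantly stickier than the others" while controlling for multiple comparisons.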

  20. Amino acid analysis in physiological samples by GC-MS with propyl chloroformate derivatization and iTRAQ-LC-MS/MS.

    PubMed

    Dettmer, Katja; Stevens, Axel P; Fagerer, Stephan R; Kaspar, Hannelore; Oefner, Peter J

    2012-01-01

    Two mass spectrometry-based methods for the quantitative analysis of free amino acids are described. The first method uses propyl chloroformate/propanol derivatization and gas chromatography-quadrupole mass spectrometry (GC-qMS) analysis in single-ion monitoring mode. Derivatization is carried out directly in aqueous samples, thereby allowing automation of the entire procedure, including addition of reagents, extraction, and injection into the GC-MS. The method delivers the quantification of 26 amino acids. The isobaric tagging for relative and absolute quantification (iTRAQ) method employs the labeling of amino acids with isobaric iTRAQ tags. The tags contain two different cleavable reporter ions, one for the sample and one for the standard, which are detected by fragmentation in a tandem mass spectrometer. Reversed-phase liquid chromatography of the labeled amino acids is performed prior to mass spectrometric analysis to separate isobaric amino acids. The commercial iTRAQ kit allows for the analysis of 42 physiological amino acids with a respective isotope-labeled standard for each of these 42 amino acids.

  1. Application of Standards-Based Quantitative SEM-EDS Analysis to Oxide Minerals

    NASA Astrophysics Data System (ADS)

    Mengason, M. J.; Ritchie, N. W.; Newbury, D. E.

    2016-12-01

    SEM and EPMA analysis are powerful tools for documenting and evaluating the relationships between minerals in thin sections and for determining chemical compositions in-situ. The time and costs associated with determining major, minor, and some trace element concentrations in geologic materials can be reduced due to advances in EDS spectrometer performance and the availability of software tools such as NIST DTSA II to perform multiple linear least squares (MLLS) fitting of energy spectra from standards to the spectra from samples recorded under the same analytical conditions. MLLS fitting is able to overcome spectral peak overlaps among the transition-metal elements that commonly occur in oxide minerals, which had previously been seen as too difficult for EDS analysis, allowing for rapid and accurate determination of concentrations. The quantitative use of EDS is demonstrated in the chemical analysis of magnetite (NMNH 114887) and ilmenite (NMNH 96189) from the Smithsonian Natural History Museum Microbeam Standards Collection. Average concentrations from nine total spots over three grains are given in mass % listed as (recommended; measured concentration ± one standard deviation). Spectra were collected for sixty seconds live time at 15 kV and 10 nA over a 12 micrometer wide scan area. Analysis of magnetite yielded Magnesium (0.03; 0.04 ± 0.01), Aluminum (none given; 0.040 ± 0.006), Titanium (0.10; 0.11 ± 0.02), Vanadium (none given; 0.16 ± 0.01), Chromium (0.17; 0.14 ± 0.02), and Iron (70.71; 71.4 ± 0.2). Analysis of ilmenite yielded Magnesium (0.19; 0.183 ± 0.008), Aluminum (none given; 0.04 ± 0.02), Titanium (27.4; 28.1 ± 0.1), Chromium (none given; 0.04 ± 0.01), Manganese (3.69; 3.73 ± 0.03), Iron (36.18; 35.8 ± 0.1), and Niobium (0.64; 0.68 ± 0.03).
The analysis of geologic materials by standards-based quantitative EDS is further illustrated with chemical analyses of oxides from ocean island basalts from several locations globally, demonstrating the suitability of the method for evaluating trends in major and minor element concentrations and variability among locations. The shorter collection times of EDS, compared to WDS, allow greater sampling of the populations of oxides present as fine-grained quench products in addition to sampling larger inclusions hosted by silicate minerals.
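The MLLS fitting step can be illustrated with synthetic spectra: a sample spectrum is modeled as a linear combination of standard spectra, and the overlap between, e.g., the Mn K-beta and Fe K-alpha lines is resolved by fitting all standards simultaneously. Peak positions below are approximate K-line energies; the peak shapes, intensities, and weights are invented:

```python
import numpy as np

# Synthetic MLLS demonstration: model a sample EDS spectrum as a linear
# combination of standard spectra taken under identical conditions.
channels = np.linspace(0, 10, 500)            # energy axis (keV)

def peak(center, height, width=0.08):
    """Gaussian stand-in for a detector-broadened X-ray line."""
    return height * np.exp(-0.5 * ((channels - center) / width) ** 2)

# Simplified standards (approximate K-alpha/K-beta energies, invented ratios).
fe_std = peak(6.40, 1.00) + peak(7.06, 0.15)
ti_std = peak(4.51, 1.00) + peak(4.93, 0.12)
mn_std = peak(5.90, 1.00) + peak(6.49, 0.13)  # Mn K-beta overlaps Fe K-alpha

# Ilmenite-like mixture with a little noise (weights invented).
true_w = np.array([0.36, 0.28, 0.04])         # Fe, Ti, Mn
rng = np.random.default_rng(1)
basis = np.vstack([fe_std, ti_std, mn_std])
sample = true_w @ basis + rng.normal(0, 0.002, channels.size)

# MLLS fit: least-squares weights of the standards in the sample spectrum.
weights, *_ = np.linalg.lstsq(basis.T, sample, rcond=None)
```

Because all standards are fit at once, the overlapping Mn and Fe contributions are apportioned correctly, which is the property that makes transition-metal oxides tractable by EDS.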

  2. Soft-tissue sarcoma: imaged with technetium-99m pyrophosphate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blatt, C.J.; Hayt, D.B.; Desai, M.

    1977-11-01

    A liposarcoma showed intense concentration of technetium-99m pyrophosphate. An angiogram demonstrated a highly vascular lesion, and it is suggested that blood flow played a major role in allowing the tumor to be demonstrated on scintiphotography. There was some histologic evidence of calcification which probably also contributed to bone-tracer disposition. Quantitative analysis of the specimen demonstrated that this calcification was located primarily in the areas of hemorrhage and necrosis.

  3. Representation and Reconstruction of Three-dimensional Microstructures in Ni-based Superalloys

    DTIC Science & Technology

    2010-12-20

    Materialia, 56, pp. 427-437 (2009); • Application of joint histogram and mutual information to registration and data fusion problems in serial...sectioning data sets and synthetically generated microstructures. The method is easy to use, and allows for a quantitative description of shapes. Further...following objectives were achieved: • we have successfully applied 3-D moment invariant analysis to several experimental data sets; • we have extended 2-D

  4. Sensitive and quantitative measurement of gene expression directly from a small amount of whole blood.

    PubMed

    Zheng, Zhi; Luo, Yuling; McMaster, Gary K

    2006-07-01

    Accurate and precise quantification of mRNA in whole blood is made difficult by gene expression changes during blood processing, and by variations and biases introduced by sample preparations. We sought to develop a quantitative whole-blood mRNA assay that eliminates blood purification, RNA isolation, reverse transcription, and target amplification while providing high-quality data in an easy assay format. We performed single- and multiplex gene expression analysis with multiple hybridization probes to capture mRNA directly from blood lysate and used branched DNA to amplify the signal. The 96-well plate singleplex assay uses chemiluminescence detection, and the multiplex assay combines Luminex-encoded beads with fluorescent detection. The single- and multiplex assays could quantitatively measure as few as 6000 and 24,000 mRNA target molecules (0.01 and 0.04 amoles), respectively, in up to 25 microL of whole blood. Both formats had CVs < 10% and dynamic ranges of 3-4 logs. Assay sensitivities allowed quantitative measurement of gene expression in the minority of cells in whole blood. The signals from whole-blood lysate correlated well with signals from purified RNA of the same sample, and absolute mRNA quantification results from the assay were similar to those obtained by quantitative reverse transcription-PCR. Both single- and multiplex assay formats were compatible with common anticoagulants and PAXgene-treated samples; however, PAXgene preparations induced expression of known antiapoptotic genes in whole blood. Both the singleplex and the multiplex branched DNA assays can quantitatively measure mRNA expression directly from small volumes of whole blood. The assay offers an alternative to current technologies that depend on RNA isolation and is amenable to high-throughput gene expression analysis of whole blood.
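The stated detection limits are internally consistent: 6000 and 24,000 target molecules do correspond to roughly 0.01 and 0.04 attomoles, as a quick Avogadro-number conversion shows. The helper function below is purely illustrative:

```python
# Sanity check of the reported detection limits: convert molecule counts
# to attomoles (1 amol = 1e-18 mol) using Avogadro's number.
N_A = 6.022e23  # molecules per mole

def molecules_to_amol(n_molecules):
    """Illustrative helper: molecule count -> attomoles."""
    return n_molecules / N_A / 1e-18

singleplex_amol = molecules_to_amol(6000)    # singleplex limit
multiplex_amol = molecules_to_amol(24000)    # multiplex limit
```

Both round to the 0.01 and 0.04 amol figures quoted in the abstract.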

  5. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    PubMed

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool and recent methodological and technological developments have also made possible the extraction of quantitative data of protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and, therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI-namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata, required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. 
This article is part of a Special Issue entitled: Standardization and Quality Control. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. A Pilot Study of the Noninvasive Assessment of the Lung Microbiota as a Potential Tool for the Early Diagnosis of Ventilator-Associated Pneumonia

    PubMed Central

    Brady, Jacob S.; Romano-Keeler, Joann; Drake, Wonder P.; Norris, Patrick R.; Jenkins, Judith M.; Isaacs, Richard J.; Boczko, Erik M.

    2015-01-01

    BACKGROUND: Ventilator-associated pneumonia (VAP) remains a common complication in critically ill surgical patients, and its diagnosis remains problematic. Exhaled breath contains aerosolized droplets that reflect the lung microbiota. We hypothesized that exhaled breath condensate fluid (EBCF) in hygroscopic condenser humidifier/heat and moisture exchanger (HCH/HME) filters would contain bacterial DNA that qualitatively and quantitatively correlate with pathogens isolated from quantitative BAL samples obtained for clinical suspicion of pneumonia. METHODS: Forty-eight adult patients who were mechanically ventilated and undergoing quantitative BAL (n = 51) for suspected pneumonia in the surgical ICU were enrolled. Per protocol, patients fulfilling VAP clinical criteria undergo quantitative BAL bacterial culture. Immediately prior to BAL, time-matched HCH/HME filters were collected for study of EBCF by real-time polymerase chain reaction. Additionally, convenience samples of serially collected filters in patients with BAL-diagnosed VAP were analyzed. RESULTS: Forty-nine of 51 time-matched EBCF/BAL fluid samples were fully concordant (concordance > 95% by κ statistic) relative to identified pathogens and strongly correlated with clinical cultures. Regression analysis of quantitative bacterial DNA in paired samples revealed a statistically significant positive correlation (r = 0.85). In a convenience sample, qualitative and quantitative polymerase chain reaction analysis of serial HCH/HME samples for bacterial DNA demonstrated an increase in load that preceded the suspicion of pneumonia. CONCLUSIONS: Bacterial DNA within EBCF demonstrates a high correlation with BAL fluid and clinical cultures. Bacterial DNA within EBCF increases prior to the suspicion of pneumonia. Further study of this novel approach may allow development of a noninvasive tool for the early diagnosis of VAP. PMID:25474571
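The concordance the authors report via the κ statistic can be computed with a small Cohen's kappa routine; kappa corrects raw agreement for agreement expected by chance. The paired pathogen identifications below are hypothetical, not data from the study:

```python
from collections import Counter

def cohens_kappa(pairs):
    """Cohen's kappa for paired categorical calls, e.g. (EBCF, BAL) pathogen IDs."""
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n
    # Chance agreement from the marginal frequencies of each rater.
    first = Counter(a for a, _ in pairs)
    second = Counter(b for _, b in pairs)
    expected = sum(first[c] * second[c] for c in first) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical paired identifications for six samples (invented).
calls = [("pseudomonas", "pseudomonas"), ("staph", "staph"),
         ("pseudomonas", "pseudomonas"), ("none", "none"),
         ("staph", "klebsiella"), ("none", "none")]
kappa = cohens_kappa(calls)
```

With 5/6 raw agreement and these marginals, kappa comes out near 0.77; the study's reported concordance above 95% corresponds to a much higher kappa across its 49 matched pairs.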

  7. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE PAGES

    Vaccaro, S.; Gauld, I. C.; Hu, J.; ...

    2018-01-31

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. 
The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.

  8. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaccaro, S.; Gauld, I. C.; Hu, J.

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. 
The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.

  9. Advancing the Fork detector for quantitative spent nuclear fuel verification

    NASA Astrophysics Data System (ADS)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. 
The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. The results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.

  10. Saturation recovery EPR spin-labeling method for quantification of lipids in biological membrane domains.

    PubMed

    Mainali, Laxman; Camenisch, Theodore G; Hyde, James S; Subczynski, Witold K

    2017-12-01

    The presence of integral membrane proteins induces the formation of distinct domains in the lipid bilayer portion of biological membranes. Qualitative application of both continuous wave (CW) and saturation recovery (SR) electron paramagnetic resonance (EPR) spin-labeling methods allowed discrimination of the bulk, boundary, and trapped lipid domains. A recently developed method, which is based on the CW EPR spectra of phospholipid (PL) and cholesterol (Chol) analog spin labels, allows evaluation of the relative amount of PLs (% of total PLs) in the boundary plus trapped lipid domain and the relative amount of Chol (% of total Chol) in the trapped lipid domain [M. Raguz, L. Mainali, W. J. O'Brien, and W. K. Subczynski (2015), Exp. Eye Res., 140:179-186]. Here, a new method is presented that, based on SR EPR spin-labeling, allows quantitative evaluation of the relative amounts of PLs and Chol in the trapped lipid domain of intact membranes. This new method complements the existing one, allowing acquisition of more detailed information about the distribution of lipids between domains in intact membranes. The methodological transition of the SR EPR spin-labeling approach from qualitative to quantitative is demonstrated. The abilities of this method are illustrated for intact cortical and nuclear fiber cell plasma membranes from porcine eye lenses. Statistical analysis (Student's t-test) of the data allowed determination of the separations of mean values above which differences can be treated as statistically significant (P ≤ 0.05) and can be attributed to sources other than preparation/technique.

  11. Exploring the utility of high resolution "nano-" computed tomography imaging to place quantitative constraints on shell biometric changes in marine pteropods in response to ocean acidification

    NASA Astrophysics Data System (ADS)

    Eagle, R.; Howes, E.; Lischka, S.; Rudolph, R.; Büdenbender, J.; Bijma, J.; Gattuso, J. P.; Riebesell, U.

    2014-12-01

    Understanding and quantifying the response of marine organisms to present and future ocean acidification remains a major challenge, encompassing observations on single species in culture and scaling up to the ecosystem and global scale. Understanding calcification changes in culture experiments designed to simulate present and future ocean conditions under potential CO2 emissions scenarios, and especially detecting the likely more subtle changes that may occur prior to the onset of more extreme ocean acidification, depends on the tools available. Here we explore the utility of high-resolution computed tomography (nano-CT) to provide quantitative biometric data on field-collected and cultured marine pteropods, using the General Electric Company Phoenix Nanotom S instrument. The technique can image the whole shell of the organism, allowing shell dimensions to be determined as well as parameters such as average shell thickness, the variation in thickness across the whole shell and in localized areas, and total shell volume and surface area; when combined with weight measurements, shell density can also be calculated. The potential power of the technique is the ability to derive these parameters even for very small organisms less than 1 millimeter in size. Tuning the X-ray strength of the instrument allows organic material to be excluded from the analysis. Through replicate analysis of standards we assess the reproducibility of the data, and by comparison with dimension measurements derived from light microscopy we assess the accuracy of the dimension determinations. We present results from historical and modern pteropod populations from the Mediterranean and from cultured polar pteropods, resolving statistically significant differences in shell biometrics in both cases that may represent responses to ocean acidification.
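
The density calculation described above is simple arithmetic once the CT-derived volume is in hand; a minimal sketch follows (the function name and numbers are illustrative, not measured data from the study):

```python
# Bulk shell density from a weighed mass and a CT-derived shell volume.
# Units: mg / mm^3, which is numerically equal to g / cm^3.
def shell_density(mass_mg, volume_mm3):
    return mass_mg / volume_mm3

# Illustrative values for a shell of ~1 mm size (not measured data):
print(round(shell_density(mass_mg=0.52, volume_mm3=0.20), 2))
```

Because nano-CT also yields surface area and local thickness maps, the same weight measurement can be combined with any of these quantities to normalize biometric comparisons across populations.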

  12. SILAC-Based Comparative Proteomic Analysis of Lysosomes from Mammalian Cells Using LC-MS/MS.

    PubMed

    Thelen, Melanie; Winter, Dominic; Braulke, Thomas; Gieselmann, Volkmar

    2017-01-01

    Mass spectrometry-based proteomics of lysosomal proteins has led to significant advances in understanding lysosomal function and pathology. The ever-increasing sensitivity and resolution of mass spectrometry, in combination with labeling procedures that allow comparative quantitative proteomics, can be applied to shed more light on the steadily increasing range of lysosomal functions. In addition, investigation of alterations in lysosomal protein composition in the many lysosomal storage diseases may yield further insights into the molecular pathology of these disorders. Here, we describe a protocol for determining quantitative differences in the lysosomal proteome of cells that are genetically and/or biochemically different or have been exposed to certain stimuli. The method is based on stable isotope labeling of amino acids in cell culture (SILAC). Cells are exposed to superparamagnetic iron oxide particles, which are endocytosed and delivered to lysosomes. After homogenization of the cells, intact lysosomes are rapidly enriched by passing the cell homogenates over a magnetic column. Lysosomes are eluted after withdrawal of the magnetic field and subjected to mass spectrometry.

  13. Application of Particle Image Velocimetry and Reference Image Topography to jet shock cells using the hydraulic analogy

    NASA Astrophysics Data System (ADS)

    Kumar, Vaibhav; Ng, Ivan; Sheard, Gregory J.; Brocher, Eric; Hourigan, Kerry; Fouras, Andreas

    2011-08-01

    This paper examines the shock cell structure, vorticity and velocity field at the exit of an underexpanded jet nozzle using a hydraulic analogy and the Reference Image Topography technique. Understanding the flow in this region is important for the mitigation of screech, an aeroacoustic problem harmful to aircraft structures. Experiments are conducted on a water table, allowing detailed quantitative investigation of this important flow regime at greatly reduced expense. Conventional Particle Image Velocimetry is employed to determine the velocity and vorticity fields of the nozzle exit region. By applying Reference Image Topography, the wavy water surface is reconstructed; combined with the hydraulic analogy, this provides a pressure map of the region. With this approach, subtraction of surfaces is used to highlight the unsteady regions of the flow, which is neither as convenient nor as quantitative with conventional Schlieren techniques. This allows a detailed analysis of the shock cell structures and their interaction with the flow instabilities in the shear layer that are the underlying cause of jet screech.
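
In the hydraulic analogy used here, the shallow-water Froude number plays the role of the Mach number, so shock-cell-like surface jumps appear wherever the water-table flow is "supersonic". A sketch of that correspondence (all values are illustrative, not the experiment's operating conditions):

```python
import math

# Froude number of a shallow water-table flow; in the hydraulic analogy
# it corresponds to the Mach number of the compressible gas flow.
def froude(velocity_m_s, depth_m, g=9.81):
    return velocity_m_s / math.sqrt(g * depth_m)

# A 0.7 m/s stream over 5 mm of water mimics a Mach ~3 underexpanded jet:
print(round(froude(0.7, 0.005), 2))
```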

  14. On the Application of Quantitative EEG for Characterizing Autistic Brain: A Systematic Review

    PubMed Central

    Billeci, Lucia; Sicca, Federico; Maharatna, Koushik; Apicella, Fabio; Narzisi, Antonio; Campatelli, Giulia; Calderoni, Sara; Pioggia, Giovanni; Muratori, Filippo

    2013-01-01

    Autism-Spectrum Disorders (ASD) are thought to be associated with abnormalities in neural connectivity at both the global and local levels. Quantitative electroencephalography (QEEG) is a non-invasive technique that allows highly precise measurement of brain function and connectivity. This review encompasses the key findings of QEEG application in subjects with ASD, in order to assess the relevance of this approach in characterizing brain function and clustering phenotypes. QEEG studies evaluating both spontaneous brain activity and brain signals under controlled experimental stimuli were examined. Despite conflicting results, analysis of the literature suggests that QEEG features are sensitive to the modifications in neuronal regulation that characterize the autistic brain. QEEG may therefore help in detecting regions of altered brain function and connectivity abnormalities, in linking behavior with brain activity, and in subgrouping affected individuals within the wide heterogeneity of ASD. The use of advanced techniques to increase specificity and spatial localization could allow distinctive patterns of QEEG abnormalities to be found in ASD subjects, paving the way for the development of tailored intervention strategies. PMID:23935579

  15. Visualizing Ebolavirus Particles Using Single-Particle Interferometric Reflectance Imaging Sensor (SP-IRIS).

    PubMed

    Carter, Erik P; Seymour, Elif Ç; Scherr, Steven M; Daaboul, George G; Freedman, David S; Selim Ünlü, M; Connor, John H

    2017-01-01

    This chapter describes an approach for the label-free imaging and quantification of intact Ebola virus (EBOV) and EBOV viruslike particles (VLPs) using a light microscopy technique. In this technique, individual virus particles are captured onto a silicon chip that has been printed with spots of virus-specific capture antibodies. These captured virions are then detected using an optical approach called interference reflectance imaging. This approach allows for the detection of each virus particle that is captured on an antibody spot and can resolve the filamentous structure of EBOV VLPs without the need for electron microscopy. Capture of VLPs and virions can be done from a variety of sample types ranging from tissue culture medium to blood. The technique also allows automated quantitative analysis of the number of virions captured. This can be used to identify the virus concentration in an unknown sample. In addition, this technique offers the opportunity to easily image virions captured from native solutions without the need for additional labeling approaches while offering a means of assessing the range of particle sizes and morphologies in a quantitative manner.

  16. Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.

    PubMed

    Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y

    2015-06-01

    Quantitative susceptibility mapping allows overcoming several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping to discriminate between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping values and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in other brain regions did not differ significantly between patients and controls. For the discrimination of patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden Index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pair-wise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping showed higher diagnostic performance than R2* mapping for the discrimination between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
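
The reported sensitivity, specificity, and accuracy figures follow directly from the patient/control counts given in the abstract; the short sketch below reproduces them, along with the Youden index used to select the cutoff values:

```python
# Diagnostic performance from 2x2 counts (21 patients, 21 controls).
def diagnostic_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    youden_j = sensitivity + specificity - 1
    return sensitivity, specificity, accuracy, youden_j

qsm = diagnostic_metrics(tp=19, fn=2, tn=18, fp=3)   # QSM counts
r2 = diagnostic_metrics(tp=17, fn=4, tn=11, fp=10)   # R2* counts

print(f"QSM: sens={qsm[0]:.0%} spec={qsm[1]:.0%} acc={qsm[2]:.0%}")
print(f"R2*: sens={r2[0]:.0%} spec={r2[1]:.0%} acc={r2[2]:.0%}")
```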

  17. Chemically stable Au nanorods as probes for sensitive surface enhanced Raman scattering (SERS) analysis of blue BIC ballpoint pens

    NASA Astrophysics Data System (ADS)

    Alyami, Abeer; Saviello, Daniela; McAuliffe, Micheal A. P.; Cucciniello, Raffaele; Mirabile, Antonio; Proto, Antonio; Lewis, Liam; Iacopino, Daniela

    2017-08-01

    Au nanorods were used as an alternative to the commonly used Ag nanoparticles as Surface Enhanced Raman Scattering (SERS) probes for identification of the dye composition of blue BIC ballpoint pens. When used in combination with Thin Layer Chromatography (TLC), Au nanorod colloids allowed identification of the major dye components of the BIC pen ink, otherwise not identifiable by normal Raman spectroscopy. Thanks to their enhanced chemical stability compared to Ag colloids, Au nanorods provided stable and reproducible SERS signals and allowed easy identification of phthalocyanine and triarylene dyes in the pen ink mixture. These findings were supported by FTIR and MALDI analyses, also performed on the pen ink. Furthermore, the self-assembly of Au nanorods into large-area ordered superstructures allowed identification of BIC pen traces. SERS spectra of good intensity and high reproducibility were obtained using Au nanorod vertical arrays, owing to the high density of hot spots and the morphological reproducibility of these superstructures. These results open the way to the use of SERS for fast screening and quantitative analysis of pens and faded pens, which is relevant to the fields of forensic and art conservation science.

  18. Techniques in helical scanning, dynamic imaging and image segmentation for improved quantitative analysis with X-ray micro-CT

    NASA Astrophysics Data System (ADS)

    Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim

    2014-04-01

    This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.

  19. Combined elemental and microstructural analysis of genuine and fake copper-alloy coins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoli, L; Agresti, J; Mascalchi, M

    2011-07-31

    Innovative noninvasive material analysis techniques are applied to determine archaeometallurgical characteristics of copper-alloy coins from Florence's National Museum of Archaeology. Three supposedly authentic Roman coins and three hypothetically fraudulent imitations are thoroughly investigated using laser-induced plasma spectroscopy and time-of-flight neutron diffraction, along with 3D videomicroscopy and electron microscopy. The material analyses are aimed at collecting data that allow objective discrimination between genuine Roman productions and later fakes. The results show that these techniques provide quantitative compositional and textural data, which are closely related to the manufacturing processes and aging of copper alloys.

  20. Quantitative phase analysis and microstructure characterization of magnetite nanocrystals obtained by microwave assisted non-hydrolytic sol–gel synthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sciancalepore, Corrado, E-mail: corrado.sciancalepore@unimore.it; Bondioli, Federica; INSTM Consortium, Via G. Giusti 9, 51121 Firenze

    2015-02-15

    An innovative preparation procedure, based on microwave assisted non-hydrolytic sol–gel synthesis, to obtain spherical magnetite nanoparticles is reported, together with a detailed quantitative phase analysis and microstructure characterization of the synthetic products. The nanoparticle growth was analyzed as a function of the synthesis time and described in terms of crystallization degree, employing the Rietveld method on the magnetic nanostructured system to determine the amorphous content using hematite as internal standard. Product crystallinity increases as the microwave thermal treatment is extended and reaches very high percentages for synthesis times longer than 1 h. The microstructural evolution of the nanocrystals was followed by integral breadth methods to obtain information on the crystallite size–strain distribution. The results of diffraction line profile analysis were compared with the nanoparticle grain distribution estimated by dimensional analysis of transmission electron microscopy (TEM) images. A variation both in the average grain size and in the distribution of the coherently diffracting domains is evidenced, suggesting a relationship between the two quantities. The traditional integral breadth methods have proven valid for a rapid assessment of diffraction line broadening effects in these nanostructured systems, and the basic assumptions for the correct use of these methods are discussed as well. Highlights: • Fe3O4 nanocrystals were obtained by MW-assisted non-hydrolytic sol–gel synthesis. • Quantitative phase analysis revealed that crystallinity up to 95% was reached. • The strategy of Rietveld refinement is discussed in detail. • Dimensional analysis showed nanoparticles ranging from 4 to 8 nm. • Results of integral breadth methods were compared with microscopic analysis.
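
The integral breadth family of methods mentioned above includes the classic Scherrer relation, which converts a diffraction line width into a coherent-domain size. A hedged sketch of that conversion (the peak position and width below are illustrative, not the paper's data):

```python
import math

# Scherrer estimate: D = K * lambda / (beta * cos(theta)), with the
# integral breadth beta in radians and theta the Bragg angle.
def scherrer_size_nm(wavelength_nm, breadth_deg, two_theta_deg, k=0.9):
    beta = math.radians(breadth_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation and a ~1.2 deg-wide magnetite (311) line near
# 2-theta = 35.5 deg give a domain size of roughly 7 nm, consistent
# with the 4-8 nm TEM range quoted in the highlights.
print(round(scherrer_size_nm(0.15406, 1.2, 35.5), 1))
```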

  1. High-resolution high-speed dynamic mechanical spectroscopy of cells and other soft materials with the help of atomic force microscopy.

    PubMed

    Dokukin, M; Sokolov, I

    2015-07-28

    Dynamic mechanical spectroscopy (DMS), which allows frequency-dependent viscoelastic properties to be measured, is important for studying soft materials such as tissues, biomaterials, and polymers. However, existing DMS techniques (nanoindentation) have limited resolution when used on soft materials, preventing them from being used to study mechanics at the nanoscale; nanoindenters are not capable of measuring cells or the nanointerfaces of composite materials. Here we present a highly accurate DMS modality that combines three different methods: quantitative nanoindentation (nanoDMA), the gentle force and fast response of atomic force microscopy (AFM), and Fourier transform (FT) spectroscopy. This new spectroscopy (which we suggest calling FT-nanoDMA) is fast and sensitive enough to allow DMS imaging of nanointerfaces and single cells, while attaining roughly 100x improvements on polymers in both spatial resolution (to 10-70 nm) and temporal resolution (to 0.7 s/pixel) compared to the current art. Multiple frequencies are measured simultaneously; the use of 10 frequencies is demonstrated here (up to 300 Hz, a relevant range for biological materials and polymers, in both ambient conditions and liquid). The method is quantitatively verified on known polymers and demonstrated on cells and polymer blends. Analysis shows that FT-nanoDMA is highly quantitative. FT-nanoDMA spectroscopy can easily be implemented in existing AFMs.
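
The multifrequency idea behind FT-nanoDMA can be sketched in a few lines: drive the tip with several tones at once and read each response amplitude off the Fourier transform of the sampled deflection. The toy signal below stands in for real AFM data (all numbers are illustrative):

```python
import math

# Single-frequency Fourier amplitude of a real sampled signal.
def amplitude_at(signal, sample_rate, freq_hz):
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / n

rate = 3000                          # Hz, illustrative sampling rate
t = [i / rate for i in range(rate)]  # one second of samples
# two simultaneous "drive" tones at 30 Hz and 300 Hz:
sig = [0.5 * math.sin(2 * math.pi * 30 * x)
       + 0.2 * math.sin(2 * math.pi * 300 * x) for x in t]

print(round(amplitude_at(sig, rate, 30), 2),
      round(amplitude_at(sig, rate, 300), 2))
```

Both tone amplitudes are recovered from a single record, which is the reason the technique can probe many frequencies in one 0.7 s pixel rather than sweeping them sequentially.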

  3. Development of a GC/MS method for the qualitative and quantitative analysis of mixtures of free fatty acids and metal soaps in paint samples.

    PubMed

    La Nasa, Jacopo; Modugno, Francesca; Aloisi, Matteo; Lluveras-Tenorio, Anna; Bonaduce, Ilaria

    2018-02-25

    In this paper we present a new analytical GC/MS method for the analysis of mixtures of free fatty acids and metal soaps in paint samples. The approach is based on the use of two different silylating agents: N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) and 1,1,1,3,3,3-hexamethyldisilazane (HMDS). Our experiments demonstrated that HMDS does not silylate fatty acid carboxylates, so it can be used for the selective derivatization and GC/MS quantitative analysis of free fatty acids. BSTFA, on the other hand, is able to silylate both free fatty acids and fatty acid carboxylates. The reaction conditions for the derivatization of carboxylates with BSTFA were thus optimized with a 3² full factorial experimental design using lead stearate and lead palmitate as model systems. The analytical method was validated following the ICH guidelines. The method allows the qualitative and quantitative analysis of fatty acid carboxylates of sodium, calcium, magnesium, aluminium, manganese, cobalt, copper, zinc, cadmium, and lead, and of lead azelate. To demonstrate the performance of the new analytical method, samples collected from two reference paint layers, from a gilded 16th-century marble sculpture, and from a paint tube belonging to the atelier of Edvard Munch, used in the last period of his life (1916-1944), were characterized. Copyright © 2017 Elsevier B.V. All rights reserved.
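
A 3² full factorial design simply enumerates every combination of two factors at three levels each; a sketch of the bookkeeping (the factor names and levels are hypothetical, not the paper's actual derivatization parameters):

```python
from itertools import product

# Two factors at three levels each -> 3^2 = 9 experimental runs.
temperatures_c = [40, 60, 80]   # hypothetical derivatization temperatures
times_min = [15, 30, 60]        # hypothetical reaction times

runs = list(product(temperatures_c, times_min))
print(len(runs))                # 9
```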

  4. Detection of mandarin in orange juice by single-nucleotide polymorphism qPCR assay.

    PubMed

    Aldeguer, Miriam; López-Andreo, María; Gabaldón, José A; Puyet, Antonio

    2014-02-15

    A dual-probe real-time PCR (qPCR) DNA-based analysis was devised for the identification of mandarin in orange juice. A single-nucleotide polymorphism at the trnL-trnF intergenic region of the chloroplast chromosome was confirmed in nine orange (Citrus sinensis) and thirteen commercial varieties of mandarin, including Citrus reticulata and Citrus unshiu species and a mandarin × tangelo hybrid. Two short minor-groove-binding fluorescent probes targeting the polymorphic sequence were used in the dual-probe qPCR, which allowed the detection of both species in single-tube reactions. The similarity of the PCR efficiencies allowed a simple estimation of the mandarin/orange ratio in the juice samples, which correlated with the measured difference in threshold cycle values for the two probes. The limit of detection of the assay was 5% of mandarin in orange juice, both when the juice was freshly prepared (not from concentrate) and when reconstituted from concentrate, which would allow the detection of fraudulently added mandarin juice. The possible use of the dual-probe system for quantitative measurements was also tested on fruit juice mixtures. qPCR data obtained from samples containing equal amounts of mandarin and orange juice revealed that the mandarin target copy number was approximately 2.6-fold higher than in orange juice. The use of a matrix-adapted control as calibrator to compensate for the resulting C(T) bias allowed accurate quantitative measurements to be obtained. Copyright © 2013 Elsevier Ltd. All rights reserved.
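
The ratio estimate rests on the usual qPCR relationship between a threshold-cycle difference and a copy-number ratio, which holds when both probes amplify with the same efficiency. A hedged sketch (the ΔCT value below is illustrative, chosen so the ratio lands near the 2.6-fold figure reported):

```python
# Copy-number ratio implied by a threshold-cycle (CT) difference,
# assuming equal amplification efficiency E for both probes
# (E = 2 means perfect doubling per cycle).
def fold_change(delta_ct, efficiency=2.0):
    return efficiency ** delta_ct

# A CT difference of ~1.38 cycles corresponds to a ~2.6-fold ratio:
print(round(fold_change(1.38), 2))
```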

  5. Patient-specific analysis of post-operative aortic hemodynamics: a focus on thoracic endovascular repair (TEVAR)

    NASA Astrophysics Data System (ADS)

    Auricchio, F.; Conti, M.; Lefieux, A.; Morganti, S.; Reali, A.; Sardanelli, F.; Secchi, F.; Trimarchi, S.; Veneziani, A.

    2014-10-01

    The purpose of this study is to quantitatively evaluate the impact of endovascular repair on aortic hemodynamics. The study addresses the assessment of post-operative hemodynamic conditions of a real clinical case through patient-specific analysis, combining accurate medical image analysis and advanced computational fluid dynamics (CFD). Although the main clinical concern was initially directed to the endoluminal protrusion of the prosthesis, the CFD simulations demonstrated that there are two other important areas where the local hemodynamics is impaired and disturbed blood flow is present: the first is the ostium of the subclavian artery, which is partially closed by the graft; the second is the stenosis of the distal thoracic aorta. Besides the clinical relevance of these specific findings, this study highlights how CFD analyses make it possible to observe important flow effects resulting from the specific features of patient vessel geometries. Consequently, our results demonstrate the potential impact of computational biomechanics not only on basic knowledge of physiopathology but also on clinical practice, thanks to the quantitative extraction of knowledge made possible by merging medical data and mathematical models.

  6. Quantitative holographic interferometry applied to combustion and compressible flow research

    NASA Astrophysics Data System (ADS)

    Bryanston-Cross, Peter J.; Towers, D. P.

    1993-03-01

    The application of holographic interferometry to phase object analysis is described. Emphasis has been given to a method of extracting quantitative information automatically from the interferometric fringe data. To achieve this, a carrier frequency has been added to the holographic data, making it possible first to form a phase map using a fast Fourier transform (FFT) algorithm, and then to 'solve', or unwrap, this image to give a contiguous density map using a minimum-weight spanning tree (MST) noise-immune algorithm known as fringe analysis (FRAN). Applications of this work to a burner flame and a compressible flow are presented. In both cases the spatial frequency of the fringes exceeds the resolvable limit of conventional digital framestores; therefore, a flatbed scanner with a resolution of 3200 × 2400 pixels has been used to produce very high resolution digital images from photographs. This approach has allowed the processing of data despite the presence of caustics generated by strong thermal gradients at the edge of the combustion field. A similar example is presented from the analysis of a compressible transonic flow in the shock wave and trailing edge regions.
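
The 'solve' step is phase unwrapping: FFT demodulation yields phase wrapped into (-pi, pi], and jumps larger than pi are removed by adding multiples of 2*pi. The paper's FRAN algorithm does this over a 2-D minimum-weight spanning tree for noise immunity; the 1-D sketch below shows only the basic principle:

```python
import math

# Naive 1-D phase unwrapping: undo 2*pi jumps between neighbours.
def unwrap(phases):
    out = [phases[0]]
    offset = 0.0
    for prev, cur in zip(phases, phases[1:]):
        step = cur - prev
        if step > math.pi:
            offset -= 2 * math.pi
        elif step < -math.pi:
            offset += 2 * math.pi
        out.append(cur + offset)
    return out

# A linear phase ramp wrapped into (-pi, pi] is recovered as a ramp:
wrapped = [0.0, 2.0, -2.5, -0.5]
print([round(p, 2) for p in unwrap(wrapped)])  # [0.0, 2.0, 3.78, 5.78]
```

A scan-order pass like this propagates any noise-induced error to every later pixel, which is precisely why the 2-D MST ordering is needed for noisy interferograms.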

  7. Quantitative Cell Cycle Analysis Based on an Endogenous All-in-One Reporter for Cell Tracking and Classification.

    PubMed

    Zerjatke, Thomas; Gak, Igor A; Kirova, Dilyana; Fuhrmann, Markus; Daniel, Katrin; Gonciarz, Magdalena; Müller, Doris; Glauche, Ingmar; Mansfeld, Jörg

    2017-05-30

    Cell cycle kinetics are crucial to cell fate decisions. Although live imaging has provided extensive insights into this relationship at the single-cell level, the limited number of fluorescent markers that can be used in a single experiment has hindered efforts to link the dynamics of individual proteins responsible for decision making directly to cell cycle progression. Here, we present fluorescently tagged endogenous proliferating cell nuclear antigen (PCNA) as an all-in-one cell cycle reporter that allows simultaneous analysis of cell cycle progression, including the transition into quiescence, and the dynamics of individual fate determinants. We also provide an image analysis pipeline for automated segmentation, tracking, and classification of all cell cycle phases. Combining the all-in-one reporter with labeled endogenous cyclin D1 and p21 as prime examples of cell-cycle-regulated fate determinants, we show how cell cycle and quantitative protein dynamics can be simultaneously extracted to gain insights into G1 phase regulation and responses to perturbations. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  8. Steroid receptors analysis in human mammary tumors by isoelectric focusing in agarose.

    PubMed

    Bailleul, S; Gauduchon, P; Malas, J P; Lechevrel, C; Roussel, G; Goussard, J

    1988-08-01

    A high resolution and quantitative method for isoelectric focusing has been developed to separate the isoforms of estrogen and progesterone receptors in human mammary tumor cytosols stabilized by sodium molybdate. Agarose gels (0.5%) were used. Six samples can be analyzed on one gel in about 2 h, and 35-microliter samples are sufficient to determine the estrogen receptor isoform pattern. The constant yields and the reproducibility of the data allow a quantitative analysis of these receptors. Four estrogen receptor isoforms have been observed (pI 4.7, 5.5, 6, and 6.5), the isoforms with pI 4.7 and 6.5 being present in all tumors. After incubation at 28 degrees C in high ionic strength, comparison of the isoelectric focusing and high-performance size exclusion chromatography patterns of the estrogen receptor confirms the oligomeric structure of the pI 4.7 isoform and suggests a monomeric structure for the pI 6.5 isoform. Under the same conditions of analysis, only one progesterone receptor isoform has been detected, with pI 4.7.

  9. High-throughput 3D whole-brain quantitative histopathology in rodents

    PubMed Central

    Vandenberghe, Michel E.; Hérard, Anne-Sophie; Souedet, Nicolas; Sadouni, Elmahdi; Santin, Mathieu D.; Briet, Dominique; Carré, Denis; Schulz, Jocelyne; Hantraye, Philippe; Chabrier, Pierre-Etienne; Rooney, Thomas; Debeir, Thomas; Blanchard, Véronique; Pradier, Laurent; Dhenain, Marc; Delzescaux, Thierry

    2016-01-01

    Histology is the gold standard to unveil microscopic brain structures and pathological alterations in humans and animal models of disease. However, due to tedious manual interventions, quantification of histopathological markers is classically performed on a few tissue sections, thus restricting measurements to limited portions of the brain. Recently developed 3D microscopic imaging techniques have allowed in-depth study of neuroanatomy. However, quantitative methods are still lacking for whole-brain analysis of cellular and pathological markers. Here, we propose a ready-to-use, automated, and scalable method to thoroughly quantify histopathological markers in 3D in rodent whole brains. It relies on block-face photography, serial histology and 3D-HAPi (Three Dimensional Histology Analysis Pipeline), an open source image analysis software. We illustrate our method in studies involving mouse models of Alzheimer’s disease and show that it can be broadly applied to characterize animal models of brain diseases, to evaluate therapeutic interventions, to anatomically correlate cellular and pathological markers throughout the entire brain and to validate in vivo imaging techniques. PMID:26876372

  10. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE PAGES

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe; ...

    2017-04-07

    The increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.
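
The core move in uFBA is to relax the steady-state constraint S·v = 0 to S·v = dx/dt for metabolites whose concentrations were measured over time. A deliberately tiny sketch of that bookkeeping on a toy A → B → C network (the stoichiometry, measurements, and units are invented for illustration; the real method solves a linear program over a genome-scale network):

```python
# Toy network A -> B -> C: columns are reactions v1 (A->B), v2 (B->C);
# rows are metabolites A, B, C.
S = [[-1,  0],
     [ 1, -1],
     [ 0,  1]]

# Finite-difference rate of change from two time-course measurements.
def rate_of_change(times, concentrations):
    (t0, t1), (c0, c1) = times, concentrations
    return (c1 - c0) / (t1 - t0)

# Invented measurements: A falls 10 -> 8 mM over 2 h, C rises 0 -> 1.5 mM.
dA_dt = rate_of_change((0.0, 2.0), (10.0, 8.0))   # -1.0 mM/h
dC_dt = rate_of_change((0.0, 2.0), (0.0, 1.5))    #  0.75 mM/h

# With only two reactions, the measured balances fix the fluxes outright:
v1 = -dA_dt                           # S row A: -v1 = dA/dt
v2 = dC_dt                            # S row C:  v2 = dC/dt
dB_dt = S[1][0] * v1 + S[1][1] * v2   # implied accumulation of unmeasured B

print(v1, v2, dB_dt)                  # 1.0 0.75 0.25
```

In the full method these dx/dt terms enter as constraints in an FBA-style linear program, so unmeasured metabolites keep the steady-state assumption while measured ones are allowed to accumulate or deplete.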

  12. Misinformation on vaccination: A quantitative analysis of YouTube videos.

    PubMed

    Donzelli, Gabriele; Palomba, Giacomo; Federigi, Ileana; Aquino, Francesco; Cioni, Lorenzo; Verani, Marco; Carducci, Annalaura; Lopalco, Pierluigi

    2018-03-19

    In Italy, vaccine hesitancy has increased over time and represents a complex problem that requires continuous monitoring. Misinformation on media and social media appears to be one of the determinants of vaccine hesitancy: for instance, 42.8 percent of Italian citizens used the internet to obtain vaccine information in 2016. This article reports a quantitative analysis of 560 YouTube videos related to the link between vaccines and autism or other serious side effects in children. The analysis revealed that most of the videos were negative in tone and that the annual number of uploaded videos increased over the period considered, from 27 December 2007 to 31 July 2017, with a peak of 224 videos in the first seven months of 2017. These findings suggest that public institutions should be more engaged in establishing a web presence that provides reliable information, answers, stories, and videos in response to the public's questions about vaccination. Such actions could help citizens make informed decisions about vaccines and comply with vaccination regulations.

  13. Transition metal redox and Mn disproportional reaction in LiMn0.5Fe0.5PO4 electrodes cycled with aqueous electrolyte

    NASA Astrophysics Data System (ADS)

    Zhuo, Zengqing; Hu, Jiangtao; Duan, Yandong; Yang, Wanli; Pan, Feng

    2016-07-01

    We performed soft x-ray absorption spectroscopy (sXAS) and a quantitative analysis of the transition metal redox in LiMn0.5Fe0.5PO4 electrodes upon electrochemical cycling. In order to circumvent the complication of surface reactions with organic electrolyte at high potential, the LiMn0.5Fe0.5PO4 electrodes are cycled with aqueous electrolyte. Analysis of the transition metal L-edge spectra allows a quantitative determination of the redox evolution of Mn and Fe during electrochemical cycling. The sXAS analysis reveals the evolving Mn oxidation states in LiMn0.5Fe0.5PO4. We found that electrochemically inactive Mn2+ is formed on the electrode surface during cycling. Additionally, the signal indicates about 20% concentration of Mn4+ at the charged state, providing strong experimental evidence of the disproportionation of Mn3+ to Mn2+ and Mn4+ on the surface of the charged LiMn0.5Fe0.5PO4 electrodes.
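
Quantitative oxidation-state analysis of L-edge sXAS is commonly done by linear-combination fitting of reference spectra. This sketch uses synthetic Gaussian stand-ins for the Mn2+/Mn3+/Mn4+ references; all numbers are illustrative, not the paper's data:

```python
import numpy as np

# Synthetic Gaussian "L-edge" reference spectra (illustrative stand-ins
# for measured Mn2+/Mn3+/Mn4+ references, not real sXAS data)
E = np.linspace(638, 648, 200)  # photon energy grid, eV

def peak(center):
    return np.exp(-0.5 * ((E - center) / 0.6) ** 2)

R = np.column_stack([peak(640.0), peak(641.5), peak(643.0)])  # Mn2+, Mn3+, Mn4+

true_f = np.array([0.3, 0.5, 0.2])  # assumed oxidation-state mix
measured = R @ true_f               # noise-free mixture spectrum

# Linear-combination fit: fractions f minimizing ||R f - measured||
f, *_ = np.linalg.lstsq(R, measured, rcond=None)
f = np.clip(f, 0, None)
f /= f.sum()  # enforce non-negative fractions summing to 1
print(f.round(3))  # -> [0.3 0.5 0.2]
```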

  14. Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.

    PubMed

    Haddaway, Neal R; Rytwinski, Trina

    2018-05-01

    Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products are not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. GeLC-MRM quantitation of mutant KRAS oncoprotein in complex biological samples.

    PubMed

    Halvey, Patrick J; Ferrone, Cristina R; Liebler, Daniel C

    2012-07-06

    Tumor-derived mutant KRAS (v-Ki-ras-2 Kirsten rat sarcoma viral oncogene) oncoprotein is a critical driver of cancer phenotypes and a potential biomarker for many epithelial cancers. Targeted mass spectrometry analysis by multiple reaction monitoring (MRM) enables selective detection and quantitation of wild-type and mutant KRAS proteins in complex biological samples. A recently described immunoprecipitation approach (Proc. Natl. Acad. Sci. 2011, 108, 2444-2449) can be used to enrich KRAS for MRM analysis, but requires large protein inputs (2-4 mg). Here, we describe sodium dodecyl sulfate-polyacrylamide gel electrophoresis-based enrichment of KRAS in a low molecular weight (20-25 kDa) protein fraction prior to MRM analysis (GeLC-MRM). This approach reduces background proteome complexity, thus allowing mutant KRAS to be reliably quantified in low protein inputs (5-50 μg). GeLC-MRM detected KRAS mutant variants (G12D, G13D, G12V, G12S) in a panel of cancer cell lines. GeLC-MRM analysis of wild-type and mutant KRAS was linear with respect to protein input and showed low variability across process replicates (CV = 14%). Concomitant analysis of a peptide from the highly similar HRAS and NRAS proteins enabled correction of KRAS-targeted measurements for contributions from these other proteins. KRAS peptides were also quantified in fluid from benign pancreatic cysts and pancreatic cancers at concentrations from 0.08 to 1.1 fmol/μg protein. GeLC-MRM provides a robust, sensitive approach to quantitation of mutant proteins in complex biological samples.

  16. Devolatilization Analysis in a Twin Screw Extruder by using the Flow Analysis Network (FAN) Method

    NASA Astrophysics Data System (ADS)

    Tomiyama, Hideki; Takamoto, Seiji; Shintani, Hiroaki; Inoue, Shigeki

    We derived theoretical formulas for three mechanisms of devolatilization in a twin screw extruder: flash, surface refreshment and forced expansion. The method for flash devolatilization is based on the equation of equilibrium concentration, which shows that volatiles break off from the polymer when it is relieved from a high pressure condition. For surface refreshment devolatilization, we applied Latinen's model to allow estimation of polymer behavior under unfilled screw conveying conditions. Forced expansion devolatilization is based on the expansion theory, in which foams are generated under reduced pressure and volatiles diffuse at the exposed surface layer after mixing with the injected devolatilization agent. Based on these models, we developed twin-screw extrusion simulation software using the FAN method, which allows us to quantitatively estimate volatile concentration and polymer temperature with high accuracy in an actual multi-vent extrusion process for LDPE + n-hexane.

  17. Enzymatic production of single-molecule FISH and RNA capture probes.

    PubMed

    Gaspar, Imre; Wippich, Frank; Ephrussi, Anne

    2017-10-01

    Arrays of singly labeled short oligonucleotides that hybridize to a specific target revolutionized RNA biology, enabling quantitative, single-molecule microscopy analysis and high-efficiency RNA/RNP capture. Here, we describe a simple and efficient method that allows flexible functionalization of inexpensive DNA oligonucleotides by different fluorescent dyes or biotin using terminal deoxynucleotidyl transferase and custom-made functional group conjugated dideoxy-UTP. We show that (i) all steps of the oligonucleotide labeling-including conjugation, enzymatic synthesis, and product purification-can be performed in a standard biology laboratory, (ii) the process yields >90%, often >95% labeled product with minimal carryover of impurities, and (iii) the oligonucleotides can be labeled with different dyes or biotin, allowing single-molecule FISH, RNA affinity purification, and Northern blot analysis to be performed. © 2017 Gaspar et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  18. Quantitative Imaging In Pathology (QUIP) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    This site hosts web accessible applications, tools and data designed to support analysis, management, and exploration of whole slide tissue images for cancer research. The following tools are included: caMicroscope: A digital pathology data management and visualization platform that enables interactive viewing of whole slide tissue images and segmentation results. caMicroscope can also be used independently of QUIP. FeatureExplorer: An interactive tool to allow patient-level feature exploration across multiple dimensions.

  19. Multivariate approach in popcorn genotypes using the Ward-MLM strategy: morpho-agronomic analysis and incidence of Fusarium spp.

    PubMed

    Kurosawa, R N F; do Amaral Junior, A T; Silva, F H L; Dos Santos, A; Vivas, M; Kamphorst, S H; Pena, G F

    2017-02-08

    Multivariate analyses are useful tools to estimate the genetic variability between accessions. In breeding programs, the Ward-Modified Location Model (MLM) multivariate method has been a powerful strategy to quantify variability using quantitative and qualitative variables simultaneously. The present study was proposed in view of the dearth of information on popcorn breeding programs under a multivariate approach using the Ward-MLM methodology. The objective of this study was thus to estimate the genetic diversity among 37 genotypes of popcorn, aiming to identify divergent groups associated with morpho-agronomic traits and traits related to resistance to Fusarium spp. To this end, 7 qualitative and 17 quantitative variables were analyzed. The experiment was conducted in 2014 at Universidade Estadual do Norte Fluminense, located in Campos dos Goytacazes, RJ, Brazil. The Ward-MLM strategy allowed the identification of four groups: Group I with 10 genotypes, Group II with 11 genotypes, Group III with 9 genotypes, and Group IV with 7 genotypes. Group IV was distant from the other groups, while groups I, II, and III were close to one another. Crosses between genotypes of group IV and those of the other groups allow exploitation of heterosis. The Ward-MLM strategy provided an appropriate grouping of genotypes; ear weight, ear diameter, and grain yield were the traits that contributed most to the analysis of genetic diversity.

  20. Development of carbon plasma-coated multiwell plates for high-throughput mass spectrometric analysis of highly lipophilic fermentation products.

    PubMed

    Heinig, Uwe; Scholz, Susanne; Dahm, Pia; Grabowy, Udo; Jennewein, Stefan

    2010-08-01

    Classical approaches to strain improvement and metabolic engineering rely on rapid qualitative and quantitative analyses of the metabolites of interest. As an analytical tool, mass spectrometry (MS) has proven to be efficient and nearly universally applicable for timely screening of metabolites. Furthermore, gas chromatography (GC)/MS- and liquid chromatography (LC)/MS-based metabolite screens can often be adapted to high-throughput formats. We recently engineered a Saccharomyces cerevisiae strain to produce taxa-4(5),11(12)-diene, the first pathway-committing biosynthetic intermediate for the anticancer drug Taxol, through the heterologous and homologous expression of several genes related to isoprenoid biosynthesis. To date, GC/MS- and LC/MS-based high-throughput methods have been inherently difficult to adapt to the screening of isoprenoid-producing microbial strains due to the need for extensive sample preparation of these often highly lipophilic compounds. In the current work, we examined different approaches to the high-throughput analysis of taxa-4(5),11(12)-diene biosynthesizing yeast strains in a 96-deep-well format. Carbon plasma coating of standard 96-deep-well polypropylene plates allowed us to circumvent the inherent solvent instability of commonly used deep-well plates. In addition, efficient adsorption of the target isoprenoid product by the coated plates allowed rapid and simple qualitative and quantitative analyses of the individual cultures. Copyright 2010 Elsevier Inc. All rights reserved.

  1. Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations

    NASA Astrophysics Data System (ADS)

    Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert

    2017-01-01

    The process of galaxy assembly is a central question in astronomy; a variety of potentially important effects contribute, including baryonic accretion from the intergalactic medium as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy’s light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. In this presentation, we will present our main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.
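
Of the quantitative morphology measures listed, the Gini coefficient is the simplest to state: applied to a galaxy's pixel fluxes, it is 0 for perfectly uniform light and approaches 1 when the flux is concentrated in few pixels. A minimal sketch:

```python
import numpy as np

def gini(pixel_fluxes):
    """Gini coefficient of a galaxy's pixel flux distribution:
    0 = uniform light, -> 1 = flux concentrated in few pixels."""
    x = np.sort(np.abs(np.asarray(pixel_fluxes, dtype=float)))
    n = x.size
    i = np.arange(1, n + 1)
    return ((2 * i - n - 1) * x).sum() / (x.mean() * n * (n - 1))

print(gini([1, 1, 1, 1]))             # uniform light -> 0.0
print(round(gini([0, 0, 0, 10]), 3))  # all flux in one pixel -> 1.0
```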

  2. Global, quantitative and dynamic mapping of protein subcellular localization

    PubMed Central

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH

    2016-01-01

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775

  3. 3D quantitative analysis of early decomposition changes of the human face.

    PubMed

    Caplova, Zuzana; Gibelli, Daniele Maria; Poppa, Pasquale; Cummaudo, Marco; Obertova, Zuzana; Sforza, Chiarella; Cattaneo, Cristina

    2018-03-01

    Decomposition of the human body and human face is influenced, among other things, by environmental conditions. The early decomposition changes that modify the appearance of the face may hamper the recognition and identification of the deceased. Quantitative assessment of those changes may provide important information for forensic identification. This report presents a pilot 3D quantitative approach of tracking early decomposition changes of a single cadaver in controlled environmental conditions by summarizing the change with weekly morphological descriptions. The root mean square (RMS) value was used to evaluate the changes of the face after death. The results showed a high correlation (r = 0.863) between the measured RMS and the time since death. RMS values of each scan are presented, as well as the average weekly RMS values. The quantification of decomposition changes could improve the accuracy of antemortem facial approximation and potentially could allow the direct comparisons of antemortem and postmortem 3D scans.
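
A sketch of the two quantities underlying this analysis, with hypothetical per-vertex distances standing in for registered 3D facial scans (the numbers are invented; the report found r = 0.863 on real data):

```python
import numpy as np

def rms_change(dist_mm):
    """Root mean square of per-vertex distances between a baseline
    3D facial scan and a later, registered scan."""
    d = np.asarray(dist_mm, dtype=float)
    return np.sqrt(np.mean(d ** 2))

# Hypothetical weekly scans: per-vertex distances grow linearly with
# the postmortem interval (invented numbers, not the study's data)
weeks = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
rms = np.array([rms_change(np.full(3, 0.8 * w + 0.2)) for w in weeks])

# Pearson correlation between weekly RMS and time since death
r = np.corrcoef(weeks, rms)[0, 1]
print(round(r, 3))  # -> 1.0 for these perfectly linear toy values
```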

  4. Ancestral effect on HOMA-IR levels quantitated in an American population of Mexican origin.

    PubMed

    Qu, Hui-Qi; Li, Quan; Lu, Yang; Hanis, Craig L; Fisher-Hoch, Susan P; McCormick, Joseph B

    2012-12-01

    An elevated insulin resistance index (homeostasis model assessment of insulin resistance [HOMA-IR]) is more commonly seen in the Mexican American population than in European populations. We report quantitative ancestral effects within a Mexican American population, and we correlate ancestral components with HOMA-IR. We performed ancestral analysis in 1,551 participants of the Cameron County Hispanic Cohort by genotyping 103 ancestry-informative markers (AIMs). These AIMs allow determination of the percentage (0-100%) ancestry from three major continental populations, i.e., European, African, and Amerindian. We observed that predominantly Amerindian ancestral components were associated with increased HOMA-IR (β = 0.124, P = 1.64 × 10(-7)). The correlation was more significant in males (Amerindian β = 0.165, P = 5.08 × 10(-7)) than in females (Amerindian β = 0.079, P = 0.019). This unique study design demonstrates how genomic markers for quantitative ancestral information can be used in admixed populations to predict phenotypic traits such as insulin resistance.
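
The reported β is the slope of an ordinary least-squares regression of HOMA-IR on the AIM-derived ancestry proportion. A minimal sketch with invented values (the cohort's actual data are not reproduced here):

```python
import numpy as np

# Invented data: Amerindian ancestry proportion (from AIMs, scaled 0-1)
# and HOMA-IR for five subjects -- illustrative only
ancestry = np.array([0.2, 0.4, 0.5, 0.6, 0.8])
homa_ir = np.array([0.50, 0.52, 0.55, 0.54, 0.58])

# Ordinary least squares: HOMA-IR = alpha + beta * ancestry;
# beta is the ancestral effect (the study reports beta = 0.124)
beta, alpha = np.polyfit(ancestry, homa_ir, 1)
print(round(beta, 3))  # -> 0.13 for these toy values
```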

  5. The human embryonic stem cell proteome revealed by multidimensional fractionation followed by tandem mass spectrometry

    PubMed Central

    Zhao, Peng; Schulz, Thomas C.; Sherrer, Eric S.; Weatherly, D. Brent; Robins, Allan J.; Wells, Lance

    2015-01-01

    Human embryonic stem cells (hESCs) have received considerable attention due to their therapeutic potential and usefulness in understanding early development and cell fate commitment. In order to appreciate the unique properties of these pluripotent, self-renewing cells, we have performed an in-depth multidimensional fractionation followed by LC-MS/MS analysis of the hESCs harvested from defined media to elucidate expressed, phosphorylated, O-linked β-N-acetylglucosamine (O-GlcNAc) modified, and secreted proteins. From the triplicate analysis, we were able to assign more than 3000 proteins with less than 1% false-discovery rate. This analysis also allowed us to identify nearly 500 phosphorylation sites and 68 sites of O-GlcNAc modification with the same high confidence. Investigation of the phosphorylation sites allowed us to deduce the set of kinases that are likely active in these cells. We also identified more than 100 secreted proteins of hESCs that likely play a role in extracellular matrix formation and remodeling, as well as autocrine signaling for self-renewal and maintenance of the undifferentiated state. Finally, by performing in-depth analysis in triplicate, spectral counts were obtained for these proteins and posttranslationally modified peptides, which will allow us to perform relative quantitative analysis between these cells and any derived cell type in the future. PMID:25367160

  6. Fuzzy method of recognition of high molecular substances in evidence-based biology

    NASA Astrophysics Data System (ADS)

    Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.

    2017-10-01

    Modern requirements for reliable results and high-quality research put mathematical methods for analyzing experimental data at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method of quantitative generalization of a large number of randomized trials addressing the same problem, which are often contradictory and performed by different authors. It allows identifying the most important trends and quantitative indicators of the data, verifying advanced hypotheses and discovering new effects in the population genotype. The existing methods of recognizing high molecular substances by gel electrophoresis of proteins under denaturing conditions are based on approximate methods for comparing the contrast of electrophoregrams with a standard solution of known substances. We propose a fuzzy method for modeling experimental data to increase the accuracy and validity of findings on the detection of new proteins.

  7. MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.

    PubMed

    Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J

    2015-10-15

    Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. shallam@mail.ubc.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
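
The abstract does not spell out the normalized read-mapping measure; RPKM is a common example of such a normalization and is shown here purely for illustration (MetaPathways' exact formula may differ):

```python
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    """Reads per kilobase of gene per million mapped reads: normalizes
    a raw read count for gene length and sequencing depth, allowing
    quantitative comparison of annotations across assemblies."""
    per_kb = read_count / (gene_length_bp / 1_000)
    return per_kb / (total_mapped_reads / 1_000_000)

# 500 reads on a 2 kb gene in a 10-million-read sample
print(rpkm(500, 2_000, 10_000_000))  # -> 25.0
```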

  8. Quantitative analysis of autophagic flux by confocal pH-imaging of autophagic intermediates

    PubMed Central

    Maulucci, Giuseppe; Chiarpotto, Michela; Papi, Massimiliano; Samengo, Daniela; Pani, Giovambattista; De Spirito, Marco

    2015-01-01

    Although numerous techniques have been developed to monitor autophagy and to probe its cellular functions, these methods cannot evaluate the autophagy process in sufficient detail, and suffer from complex experimental setups and/or systematic errors. Here we developed a method to image, contextually, the number and pH of autophagic intermediates by using the probe mRFP-GFP-LC3B as a ratiometric pH sensor. This information is expressed functionally by AIPD, the pH distribution of the number of autophagic intermediates per cell. AIPD analysis reveals how intermediates are characterized by a continuous pH distribution, in the range 4.5–6.5, and therefore can be described by a more complex set of states than the usual biphasic one (autophagosomes and autolysosomes). AIPD shape and amplitude are sensitive to alterations in the autophagy pathway induced by drugs or environmental states, and allow a quantitative estimation of autophagic flux by retrieving the concentrations of autophagic intermediates. PMID:26506895
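
The ratiometric principle can be sketched as follows: GFP fluorescence is quenched at acidic pH while mRFP is comparatively stable, so a calibrated GFP/mRFP ratio maps each punctum to a pH, and histogramming those values per cell gives an AIPD-like distribution. The calibration curve and intensities below are invented for illustration:

```python
import numpy as np

# Hypothetical calibration: GFP/mRFP intensity ratio at known pH values
# (GFP is quenched as pH drops; mRFP is comparatively pH-stable)
cal_pH = np.array([4.5, 5.0, 5.5, 6.0, 6.5])
cal_ratio = np.array([0.05, 0.15, 0.35, 0.65, 0.90])

def puncta_pH(gfp, mrfp):
    """Map each punctum's GFP/mRFP ratio to a pH by interpolating
    the calibration curve."""
    return np.interp(gfp / mrfp, cal_ratio, cal_pH)

# Per-punctum intensities from one cell (illustrative numbers)
gfp = np.array([5.0, 60.0, 90.0])
mrfp = np.array([100.0, 100.0, 100.0])
pH = puncta_pH(gfp, mrfp)

# AIPD-like histogram: intermediates per cell across the pH range
counts, edges = np.histogram(pH, bins=[4.0, 5.0, 6.0, 7.0])
print(pH.round(2), counts)
```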

  9. Analysis of Fringe Field Formed Inside LDA Measurement Volume Using Compact Two Hololens Imaging Systems

    NASA Astrophysics Data System (ADS)

    Ghosh, Abhijit; Nirala, A. K.; Yadav, H. L.

    2018-03-01

    We have designed and fabricated four LDA optical setups consisting of four different aberration-compensated compact two hololens imaging systems. We have experimentally investigated and realized a hololens recording geometry which is an interferogram of a converging spherical wavefront with a mutually coherent planar wavefront. The proposed real-time monitoring and actual fringe field analysis techniques allow complete characterization of the fringes formed at the measurement volume and permit evaluation of beam quality, alignment and fringe uniformity with greater precision. After experimentally analyzing the fringes formed at the measurement volume by all four imaging systems, we find that fringes obtained using the compact two hololens imaging systems are improved both qualitatively and quantitatively compared with those obtained using a conventional imaging system. Results indicate qualitative improvement in the non-uniformity of fringe thickness and in micro intensity variations perpendicular to the fringes, and a quantitative improvement of 39.25% in the overall average normalized standard deviation of the fringe width formed by the compact two hololens imaging systems compared with that of the conventional imaging system.

  10. An artificial tongue fluorescent sensor array for identification and quantitation of various heavy metal ions.

    PubMed

    Xu, Wang; Ren, Changliang; Teoh, Chai Lean; Peng, Juanjuan; Gadre, Shubhankar Haribhau; Rhee, Hyun-Woo; Lee, Chi-Lik Ken; Chang, Young-Tae

    2014-09-02

    Herein, a small-molecule fluorescent sensor array for rapid identification of seven heavy metal ions was designed and synthesized, with its sensing mechanism mimicking that of a tongue. The photoinduced electron transfer and intramolecular charge transfer mechanisms result in combinatorial interactions between the sensor array and heavy metal ions, which lead to diversified fluorescence wavelength shifts and emission intensity changes. Upon principal component analysis (PCA), this result renders clear identification of each heavy metal ion on a 3D spatial dispersion graph. Further exploration provides a concentration-dependent pattern, allowing both qualitative and quantitative measurements of heavy metal ions. On the basis of this information, a "safe-zone" concept was proposed, which provides rapid exclusion of versatile hazardous species from clean water samples based on toxicity characteristic leaching procedure standards. This type of small-molecule fluorescent sensor array could open a new avenue for multiple heavy metal ion detection and simplified water quality analysis.
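
The PCA step can be sketched with a toy response matrix; the sensors, ions, and numbers below are invented, but the mechanics (mean-centering, SVD, projection onto principal components) match standard PCA:

```python
import numpy as np

# Hypothetical response matrix: rows = water samples, columns = the
# fluorescence readout of each sensor in the array (illustrative only)
X = np.array([
    [1.0, 0.2, 0.1],  # e.g. two replicates of an Hg2+-spiked sample
    [1.1, 0.1, 0.2],
    [0.1, 1.0, 0.9],  # e.g. two replicates of a Pb2+-spiked sample
    [0.2, 1.1, 1.0],
])

# PCA via SVD of the mean-centered responses; the scores are each
# sample's coordinates on the principal components (the axes of the
# "3D spatial dispersion graph")
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s

# Replicates of the same ion should fall on the same side of PC1
print(np.sign(scores[:, 0]))
```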

  11. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
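
For the categorical case mentioned, Cohen's kappa corrects the raw agreement between two ratings for the agreement expected by chance. A minimal sketch with invented ratings:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two categorical ratings,
    corrected for the agreement expected by chance alone."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (observed - expected) / (1 - expected)

# Two hypothetical raters classifying six cases
a = ["yes", "yes", "no", "no", "yes", "no"]
b = ["yes", "yes", "no", "yes", "yes", "no"]
print(round(cohens_kappa(a, b), 3))  # -> 0.667
```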

  12. In vitro quantitative analysis of Salmonella typhimurium preference for amino acids secreted by human breast tumor

    NASA Astrophysics Data System (ADS)

    Choi, Eunpyo; Maeng, Bohee; Lee, Jae-hun; Chang, Hyung-kwan; Park, Jungyul

    2016-12-01

    Bacterial therapies have attracted significant attention owing to bacteria's ability to penetrate deep into solid tumor tissue and their propensity to accumulate naturally in tumors of living animals. The mechanism by which bacteria target tumors is therapeutically crucial but poorly understood. We hypothesized that amino acids released from specific tumors attract bacteria to those tumors, and experiments on the chemotactic response of bacteria toward cancer-secreted amino acids were performed using a diffusion-based multiple chemical gradient generator constructed by in situ self-assembly of microspheres. The quantitative analysis was carried out by comparing the intensity of green fluorescent protein (GFP)-tagged Salmonella typhimurium (S. typhimurium) in the gradient generator, which showed a clear preference for the released amino acids, especially those from breast cancer patients. Understanding chemotaxis toward cancer-secreted amino acids is essential for controlling S. typhimurium targeting of tumors and will allow the development of bacterial therapies.

  13. Exploring the Possibility of Peak Individualism, Humanity's Existential Crisis, and an Emerging Age of Purpose.

    PubMed

    Grant, Gabriel B

    2017-01-01

    There is an emerging cultural narrative in the United States that we are entering an age of purpose-that millennials, more than any other generation, are searching for purpose and purposeful work (Sheahan, 2005) and that we are entering an era or economy of purpose (Hurst, 2014). For-profit, non-profit, and educational institutions are perceiving and adapting to serve millennials' demand for purpose in life, specifically within the workplace (Klein et al., 2015). Yet longitudinal studies of purpose do not exist, and millennials are also referred to as GenMe: existing quantitative research suggests they (we) are increasingly individualistic, materialistic, and narcissistic (Greenfield, 2013). Google's digitization of millions of books and the Ngram Viewer allow for quantified analysis of culture over the past two centuries. This tool was used to quantitatively test the popular notion that there is a rise in demand for purpose. Analysis reveals a growing interest in purpose-in-life and a shift toward collectivistic values emerging over the lifespan of the millennial generation.

  14. Thermal image analysis using the serpentine method

    NASA Astrophysics Data System (ADS)

    Koprowski, Robert; Wilczyński, Sławomir

    2018-03-01

    Thermal imaging is an increasingly widespread alternative to other imaging methods. As a supplementary method in diagnostics, it can be used both statically and with dynamic temperature changes. The paper proposes a new image analysis method that allows for the acquisition of new diagnostic information as well as object segmentation. The proposed serpentine analysis uses known and new methods of image analysis and processing proposed by the authors. Affine transformations of an image and subsequent Fourier analysis provide a new diagnostic quality. The method is fully repeatable, automatic, and independent of inter-individual variability in patients. The segmentation results are 10% better than those obtained from the watershed method and the hybrid segmentation method based on the Canny detector. The first and second harmonics of the serpentine analysis make it possible to determine the type of temperature changes in the region of interest (gradient, number of heat sources, etc.). The presented serpentine method provides new quantitative information on thermal imaging and more: since it allows for image segmentation and designation of the contact points of two or more heat sources (local minima), it can be used to support medical diagnostics in many areas of medicine.
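The abstract does not spell out the serpentine traversal, but the core idea of unrolling a 2D thermogram along a continuous boustrophedon path and inspecting the first and second Fourier harmonics can be sketched as follows; the traversal order, toy temperatures, and harmonic normalization are illustrative assumptions, not the authors' implementation:

```python
import math

def serpentine_unroll(image):
    """Unroll a 2D image row by row in boustrophedon (serpentine) order,
    reversing every other row so the scan path is continuous."""
    signal = []
    for i, row in enumerate(image):
        signal.extend(row if i % 2 == 0 else row[::-1])
    return signal

def harmonic_magnitude(signal, k):
    """Magnitude of the k-th DFT harmonic of a real-valued signal."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(signal))
    return math.hypot(re, im) / n

# Toy 4x4 "thermogram" (degrees C) with a single warm region in the middle.
thermo = [[30.0, 30.0, 30.0, 30.0],
          [30.0, 34.0, 34.0, 30.0],
          [30.0, 34.0, 34.0, 30.0],
          [30.0, 30.0, 30.0, 30.0]]
profile = serpentine_unroll(thermo)
h1 = harmonic_magnitude(profile, 1)  # first harmonic: gradient-like changes
h2 = harmonic_magnitude(profile, 2)  # second harmonic: paired heat sources
```

Comparing `h1` and `h2` on the unrolled temperature profile mirrors the abstract's use of the first and second harmonics to characterize the type of temperature change in a region of interest.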

  15. Analysis of high accuracy, quantitative proteomics data in the MaxQB database.

    PubMed

    Schaab, Christoph; Geiger, Tamar; Stoehr, Gabriele; Cox, Juergen; Mann, Matthias

    2012-03-01

    MS-based proteomics generates rapidly increasing amounts of precise and quantitative information. Analysis of individual proteomic experiments has made great strides, but the crucial ability to compare and store information across different proteome measurements still presents many challenges. For example, it has been difficult to avoid contamination of databases with low-quality peptide identifications, to control for the inflation in false-positive identifications when combining data sets, and to integrate quantitative data. Although the contamination with low-quality identifications has been addressed by joint analysis of deposited raw data in some public repositories, we reasoned that there should be a role for a database specifically designed for high-resolution and quantitative data. Here we describe a novel database, termed MaxQB, that stores and displays collections of large proteomics projects and allows joint analysis and comparison. We demonstrate the analysis tools of MaxQB using proteome data of 11 different human cell lines and 28 mouse tissues. The database-wide false discovery rate is controlled by adjusting the project-specific cutoff scores for the combined data sets. The 11 cell line proteomes together identify proteins expressed from more than half of all human genes. For each protein of interest, expression levels estimated by label-free quantification can be visualized across the cell lines. Similarly, the expression rank order and estimated amount of each protein within each proteome are plotted. We used MaxQB to calculate the signal reproducibility of the detected peptides for the same proteins across different proteomes. The Spearman rank correlation between peptide intensity and detection probability of identified proteins was greater than 0.8 for 64% of the proteome, whereas a minority of proteins had negative correlations. 
This information can be used to pinpoint false protein identifications, independently of peptide database scores. The information contained in MaxQB, including high resolution fragment spectra, is accessible to the community via a user-friendly web interface at http://www.biochem.mpg.de/maxqb.
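The Spearman rank correlation between peptide intensity and detection probability mentioned above is simply the Pearson correlation of the rank-transformed values. A minimal pure-Python sketch (the peptide intensities and detection probabilities below are made-up numbers, not MaxQB data):

```python
def rank(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-peptide data: log intensity vs. detection probability.
intensity = [18.2, 21.5, 19.9, 23.1, 20.4]
det_prob = [0.55, 0.90, 0.70, 0.99, 0.80]
rho = spearman(intensity, det_prob)  # perfectly monotone toy data -> 1.0
```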

  16. Analysis of organic acids and acylglycines for the diagnosis of related inborn errors of metabolism by GC- and HPLC-MS.

    PubMed

    la Marca, Giancarlo; Rizzo, Cristiano

    2011-01-01

    The analysis of organic acids in urine is commonly included in routine procedures for detecting many inborn errors of metabolism. Many analytical methods allow for both qualitative and quantitative determination of organic acids, mainly in urine but also in plasma, serum, whole blood, amniotic fluid, and cerebrospinal fluid. Liquid-liquid extraction and solid-phase extraction using anion exchange or silica columns are commonly employed approaches for sample treatment. Before analysis can be carried out using gas chromatography-mass spectrometry, organic acids must be converted into more thermally stable, volatile, and chemically inert forms, mainly trimethylsilyl ethers, esters, or methyl esters.

  17. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
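The arithmetic behind this kind of quantitation can be illustrated with the standard Poisson relation used in number-average length analysis: if lesions are randomly distributed, the mean lesion frequency equals the difference of the reciprocal number-average lengths of treated and control DNA. A sketch (the lengths are hypothetical, and this is the generic relation, not necessarily the exact variant used in this chapter):

```python
def lesions_per_mb(ln_treated_kb, ln_control_kb):
    """Mean lesion frequency (lesions per megabase) from number-average
    molecular lengths (in kb), assuming randomly distributed (Poisson)
    lesions: phi = 1/Ln(treated) - 1/Ln(control)."""
    phi_per_kb = 1.0 / ln_treated_kb - 1.0 / ln_control_kb
    return phi_per_kb * 1000.0

# Hypothetical gel result: the number-average length drops from 500 kb
# (untreated control) to 400 kb after exposure.
freq = lesions_per_mb(400.0, 500.0)  # 0.5 lesions per Mb
```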

  18. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    PubMed

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

    While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. 
Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
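One concrete consequence of treating genotypes as quantitative values is that pooled-sample allele frequencies follow directly from summing dosages. A minimal sketch (the dosage values and the 0-2 "copies of the B allele" encoding are illustrative assumptions, not the study's likelihood formulation):

```python
def allele_freq_from_dosages(dosages):
    """B-allele frequency from quantitative genotype dosages, each in
    [0, 2]: the expected number of B-allele copies per individual."""
    return sum(dosages) / (2.0 * len(dosages))

# Hypothetical quantitative genotypes at one SNP for five individuals.
# Values near 0, 1, and 2 correspond to hard AA, AB, and BB calls, but
# fractional values retain the platform's uncertainty instead of
# forcing a discrete genotype.
dosages = [0.05, 0.98, 1.92, 1.05, 0.10]
freq = allele_freq_from_dosages(dosages)  # 0.41 for these toy values
```

The same estimator applies unchanged to a pooled-DNA sample, where the "dosage" is the pool-level signal, which is what allows the one analysis framework described in the abstract.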

  19. Using CT Data to Improve the Quantitative Analysis of 18F-FBB PET Neuroimages

    PubMed Central

    Segovia, Fermín; Sánchez-Vañó, Raquel; Górriz, Juan M.; Ramírez, Javier; Sopena-Novales, Pablo; Testart Dardel, Nathalie; Rodríguez-Fernández, Antonio; Gómez-Río, Manuel

    2018-01-01

    18F-FBB PET is a neuroimaging modality that is being increasingly used to assess brain amyloid deposits in potential patients with Alzheimer's disease (AD). In this work, we analyze the usefulness of these data to distinguish between AD and non-AD patients. A dataset with 18F-FBB PET brain images from 94 subjects diagnosed with AD and other disorders was evaluated by means of multiple analyses based on t-test, ANOVA, Fisher Discriminant Analysis and Support Vector Machine (SVM) classification. In addition, we propose to calculate amyloid standardized uptake values (SUVs) using only gray-matter voxels, which can be estimated using Computed Tomography (CT) images. This approach allows assessing potential brain amyloid deposits along with the gray matter loss and takes advantage of the structural information provided by most of the scanners used for PET examination, which allow simultaneous PET and CT data acquisition. The results obtained in this work suggest that SUVs calculated according to the proposed method allow AD and non-AD subjects to be more accurately differentiated than using SUVs calculated with standard approaches. PMID:29930505
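The proposed restriction of SUV computation to CT-derived gray-matter voxels amounts to masking the PET volume before averaging. A toy sketch with flattened volumes (the uptake values, masks, and the choice of a reference region are hypothetical; the authors' exact SUV definition may differ):

```python
def mean_uptake(pet, mask):
    """Mean PET value over voxels where the mask is True."""
    vals = [p for p, m in zip(pet, mask) if m]
    return sum(vals) / len(vals)

# Flattened toy volumes: PET uptake and a CT-derived gray-matter mask.
pet = [1.2, 2.8, 3.0, 0.4, 2.6, 1.0]
gm_mask = [False, True, True, False, True, False]

# Hypothetical reference-region mask (e.g. cerebellar gray matter).
ref_mask = [False, False, False, False, True, False]

# Target-to-reference ratio computed only over gray-matter voxels.
suvr = mean_uptake(pet, gm_mask) / mean_uptake(pet, ref_mask)
```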

  20. [3D visualization and analysis of vocal fold dynamics].

    PubMed

    Bohr, C; Döllinger, M; Kniesburges, S; Traxdorf, M

    2016-04-01

    Visual investigation methods of the larynx mainly allow for a two-dimensional presentation of the three-dimensional structures of the vocal fold dynamics. The vertical component of the vocal fold dynamics is often neglected, yielding a loss of information. Recent studies show that the vertical dynamic components are in the range of the medio-lateral dynamics and play a significant role within the phonation process. This work presents a method for future 3D reconstruction and visualization of endoscopically recorded vocal fold dynamics. The setup contains a high-speed camera (HSC) and a laser projection system (LPS). The LPS projects a regular grid on the vocal fold surfaces and, in combination with the HSC, allows a three-dimensional reconstruction of the vocal fold surface. Hence, quantitative information on displacements and velocities can be provided. The applicability of the method is presented for one ex-vivo human larynx, one ex-vivo porcine larynx and one synthetic silicone larynx. The setup introduced allows the reconstruction of the entire visible vocal fold surfaces for each oscillation status. This enables a detailed analysis of the three-dimensional dynamics (i.e., displacements, velocities, accelerations) of the vocal folds. The next goal is the miniaturization of the LPS to allow clinical in-vivo analysis in humans. We anticipate new insight into dependencies between 3D dynamic behavior and the quality of the acoustic outcome for healthy and disordered phonation.

  1. Congruent climate-related genecological responses from molecular markers and quantitative traits for western white pine (Pinus monticola)

    Treesearch

    Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim

    2009-01-01

    Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...

  2. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods

    EPA Science Inventory

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. qPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testi...

  3. Quantitative Aging Pattern in Mouse Urine Vapor as Measured by Gas-Liquid Chromatography

    NASA Technical Reports Server (NTRS)

    Robinson, Arthur B.; Dirren, Henri; Sheets, Alan; Miquel, Jaime; Lundgren, Paul R.

    1975-01-01

    We have discovered a quantitative aging pattern in mouse urine vapor. The diagnostic power of the pattern has been found to be high. We hope that this pattern will eventually allow quantitative estimates of physiological age and some insight into the biochemistry of aging.

  4. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times alone and in combination to provide estimates of stroke onset time in a rat model of permanent focal cerebral ischemia and map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate with an uncertainty of ±25 min. At all time-points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined by quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
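The onset-time estimation rests on relaxation-time differentials that grow roughly monotonically after occlusion, so a calibration curve can be inverted to map an observed ΔT2 to minutes since onset. A least-squares sketch (the calibration points and the assumption of linearity are illustrative, not the study's fitted model):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration: Delta-T2 (%) at known times (min) post-occlusion.
times = [60.0, 120.0, 180.0, 240.0]
delta_t2 = [3.0, 6.0, 9.0, 12.0]
slope, intercept = fit_line(times, delta_t2)

def estimate_onset(observed_delta_t2):
    """Invert the calibration line to estimate minutes since onset."""
    return (observed_delta_t2 - intercept) / slope

t_est = estimate_onset(7.5)  # 150 min for this toy calibration
```

Combining two such estimates (e.g. from ΔT1 and ΔT2, or restricting to the overlap volume) is one way to picture how the abstract's V(Overlap) measure can tighten the onset-time uncertainty.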

  5. Potassium-based algorithm allows correction for the hematocrit bias in quantitative analysis of caffeine and its major metabolite in dried blood spots.

    PubMed

    De Kesel, Pieter M M; Capiau, Sara; Stove, Veronique V; Lambert, Willy E; Stove, Christophe P

    2014-10-01

    Although dried blood spot (DBS) sampling is increasingly receiving interest as a potential alternative to traditional blood sampling, the impact of hematocrit (Hct) on DBS results is limiting its final breakthrough in routine bioanalysis. To predict the Hct of a given DBS, potassium (K(+)) proved to be a reliable marker. The aim of this study was to evaluate whether application of an algorithm, based upon predicted Hct or K(+) concentrations as such, allowed correction for the Hct bias. Using validated LC-MS/MS methods, caffeine, chosen as a model compound, was determined in whole blood and corresponding DBS samples with a broad Hct range (0.18-0.47). A reference subset (n = 50) was used to generate an algorithm based on K(+) concentrations in DBS. Application of the developed algorithm on an independent test set (n = 50) alleviated the assay bias, especially at lower Hct values. Before correction, differences between DBS and whole blood concentrations ranged from -29.1 to 21.1%. The mean difference, as obtained by Bland-Altman comparison, was -6.6% (95% confidence interval (CI), -9.7 to -3.4%). After application of the algorithm, differences between corrected and whole blood concentrations lay between -19.9 and 13.9% with a mean difference of -2.1% (95% CI, -4.5 to 0.3%). The same algorithm was applied to a separate compound, paraxanthine, which was determined in 103 samples (Hct range, 0.17-0.47), yielding similar results. In conclusion, a K(+)-based algorithm allows correction for the Hct bias in the quantitative analysis of caffeine and its metabolite paraxanthine.
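The two-step logic of the approach — predict the Hct of a DBS from its K(+) concentration, then remove the Hct-dependent assay bias — can be sketched as follows; the calibration slope/intercept and the linear bias model are made-up placeholder values for illustration, not the published algorithm:

```python
def predict_hct(k_mmol_l, slope=0.0225, intercept=-0.005):
    """Predict hematocrit from the DBS potassium concentration (mmol/L).
    Slope and intercept are hypothetical calibration values."""
    return slope * k_mmol_l + intercept

def correct_dbs_conc(measured, hct, ref_hct=0.36, bias_per_hct=1.2):
    """Remove a linear Hct-dependent bias around a reference hematocrit.
    bias_per_hct is a hypothetical fractional bias per unit Hct deviation."""
    return measured / (1.0 + bias_per_hct * (hct - ref_hct))

hct = predict_hct(16.0)                 # about 0.355 with these toy values
corrected = correct_dbs_conc(10.0, hct)  # bias-corrected caffeine level
```

With a below-reference Hct, the raw DBS result underestimates the whole-blood concentration and the correction adjusts it upward, mirroring the abstract's observation that the bias was largest at low Hct values.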

  6. Quantitative image feature variability amongst CT scanners with a controlled scan protocol

    NASA Astrophysics Data System (ADS)

    Ger, Rachel B.; Zhou, Shouhao; Chi, Pai-Chun Melinda; Goff, David L.; Zhang, Lifei; Lee, Hannah J.; Fuller, Clifton D.; Howell, Rebecca M.; Li, Heng; Stafford, R. Jason; Court, Laurence E.; Mackin, Dennis S.

    2018-02-01

    Radiomics studies often analyze patient computed tomography (CT) images acquired from different CT scanners. This may result in differences in imaging parameters, e.g. different manufacturers, different acquisition protocols, etc. However, quantifiable differences in radiomics features can occur based on acquisition parameters. A controlled protocol may allow for minimization of these effects, thus allowing for larger patient cohorts from many different CT scanners. In order to test radiomics feature variability across different CT scanners, a radiomics phantom was developed with six different cartridges encased in high-density polystyrene. A harmonized protocol was developed to control for tube voltage, tube current, scan type, pitch, CTDIvol, convolution kernel, display field of view, and slice thickness across different manufacturers. The radiomics phantom was imaged on 18 scanners using the control protocol. A linear mixed effects model was created to assess the impact of inter-scanner variability with decomposition of feature variation between scanners and cartridge materials. The inter-scanner variability was compared to the residual variability (the unexplained variability) and to the inter-patient variability using two different patient cohorts. The patient cohorts consisted of 20 non-small cell lung cancer (NSCLC) and 30 head and neck squamous cell carcinoma (HNSCC) patients. The inter-scanner standard deviation was at least half of the residual standard deviation for 36 of 49 quantitative image features. The ratio of inter-scanner to patient coefficient of variation was above 0.2 for 22 and 28 of the 49 features for NSCLC and HNSCC patients, respectively. Inter-scanner variability was a significant factor compared to patient variation in this small study for many of the features. Further analysis with a larger cohort will allow a more thorough analysis with additional variables in the model to truly isolate the inter-scanner difference.
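The ratio of inter-scanner to inter-patient coefficient of variation used to flag scanner-sensitive features can be computed directly. A sketch with invented feature values (the 0.2 threshold comes from the abstract; everything else is illustrative):

```python
def coeff_var(values):
    """Coefficient of variation: sample standard deviation over the mean."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return (var ** 0.5) / mean

# Hypothetical values of one radiomics feature measured on the same
# phantom cartridge across scanners, and across a patient cohort.
scanner_vals = [100.0, 104.0, 98.0, 102.0, 96.0]
patient_vals = [80.0, 140.0, 95.0, 120.0, 65.0]

ratio = coeff_var(scanner_vals) / coeff_var(patient_vals)
flagged = ratio > 0.2  # True would mark a non-negligible scanner effect
```

For this toy feature the scanner spread is small relative to the patient spread, so it would not be flagged; in the study, 22 of 49 (NSCLC) and 28 of 49 (HNSCC) features exceeded the 0.2 threshold.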

  7. Quantitative profiling of O-glycans by electrospray ionization- and matrix-assisted laser desorption ionization-time-of-flight-mass spectrometry after in-gel derivatization with isotope-coded 1-phenyl-3-methyl-5-pyrazolone.

    PubMed

    Sić, Siniša; Maier, Norbert M; Rizzi, Andreas M

    2016-09-07

    The potential and benefits of isotope-coded labeling in the context of MS-based glycan profiling are evaluated, focusing on the analysis of O-glycans. For this purpose, a derivatization strategy using d0/d5-1-phenyl-3-methyl-5-pyrazolone (PMP) is employed, allowing O-glycan release and derivatization to be achieved in one single step. The paper demonstrates that this release and derivatization reaction can also be carried out in-gel with only marginal loss in sensitivity compared to in-solution derivatization. Such an effective in-gel reaction allows this release/labeling method to be extended to glycoprotein/glycoform samples pre-separated by gel electrophoresis without the need to extract the proteins/digested peptides from the gel. With highly O-glycosylated proteins (e.g. mucins), LODs in the range of 0.4 μg glycoprotein (100 fmol) loaded onto the electrophoresis gel can be attained; with minor glycosylated proteins (like IgAs, FVII, FIX) the LODs were in the range of 80-100 μg (250 pmol-1.5 nmol) glycoprotein loaded onto the gel. As a second aspect, the potential of isotope-coded labeling as an internal standardization strategy for the reliable determination of quantitative glycan profiles via MALDI-MS is investigated. Towards this goal, a number of established and emerging MALDI matrices were tested for PMP-glycan quantitation, and their performance is compared with that of ESI-based measurements. The crystalline matrix 2,6-dihydroxyacetophenone (DHAP) and the ionic liquid matrix N,N-diisopropyl-ethyl-ammonium 2,4,6-trihydroxyacetophenone (DIEA-THAP) showed potential for MALDI-based quantitation of PMP-labeled O-glycans. We also provide a comprehensive overview of the performance of MS-based glycan quantitation approaches by comparing sensitivity, LOD, accuracy and repeatability data obtained with RP-HPLC-ESI-MS, stand-alone nano-ESI-MS with a spray-nozzle chip, and MALDI-MS. 
Finally, the suitability of the isotope-coded PMP labeling strategy for O-glycan profiling of biologically important proteins is demonstrated by comparative analysis of IgA immunoglobulins and two coagulation factors. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. An MRM-Based Cytokeratin Marker Assay as a Tool for Cancer Studies: Application to Lung Cancer Pleural Effusions.

    PubMed

    Perzanowska, Anna; Fatalska, Agnieszka; Wojtas, Grzegorz; Lewandowicz, Andrzej; Michalak, Agata; Krasowski, Grzegorz; Borchers, Christoph H; Dadlez, Michal; Domanski, Dominik

    2018-03-01

    The goal of this work was to develop an LC-MRM assay for the quantitative analysis of a set of established and diagnostically important cytokeratin (CK) markers used in cancer diagnosis, prognosis, and therapy monitoring. Second, the potential of this assay in lung cancer diagnosis through pleural effusion (PE) analysis was examined. A multiplexed MRM assay was developed for 17 CKs and their select caspase-cleaved fragments. Isotope-labeled standard peptides were used for high assay specificity and absolute peptide quantitation, with robust standard-flow LC coupled to a latest-generation triple-quadrupole instrument for high sensitivity. The potential clinical applicability was demonstrated by the analysis of 118 PE samples. The MRM assay was evaluated for endogenous detection, linearity, precision, upper and lower limits of quantification, selectivity, reproducibility and peptide stability, and is generally applicable to any epithelial cancer study. A set of 118 patients with known pathologies allowed us to define the range of CK levels in clinical PE samples. Specific CKs were able to differentiate cancer-related PEs from those caused by benign ailments. In addition, they made it possible to differentiate between PEs from subjects with small cell lung cancer versus non-small cell lung carcinoma, and to further differentiate the latter into its two subtypes, adenocarcinoma and squamous cell carcinoma. An MRM-based CK assay for carcinoma studies can differentiate between the three lung cancer histological types using less-invasive PE sampling, providing potential therapy-guiding information on patients who are inoperable. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Periictal activity in cooled asphyxiated neonates with seizures.

    PubMed

    Major, Philippe; Lortie, Anne; Dehaes, Mathieu; Lodygensky, Gregory Anton; Gallagher, Anne; Carmant, Lionel; Birca, Ala

    2017-04-01

    Seizures are common in critically ill neonates. Both seizures and antiepileptic treatments may lead to short-term complications and worsen the outcomes. Predicting the risk of seizure reoccurrence could enable individualized treatment regimens and better outcomes. We aimed to identify EEG signatures of seizure reoccurrence by investigating periictal electrographic features and spectral power characteristics in hypothermic neonates with hypoxic-ischemic encephalopathy (HIE) with or without reoccurrence of seizures on rewarming. We recruited five consecutive HIE neonates, submitted to continuous EEG monitoring, with high seizure burden (>20% per hour) while undergoing therapeutic hypothermia. Two of them had reoccurrence of seizures on rewarming. We performed quantitative analysis of fifteen artifact-free consecutive seizures to assess spectral power changes between the interictal, preictal and ictal periods, separately for each patient. Visual analysis allowed description of electrographic features associated with ictal events. Every patient demonstrated a significant increase in overall spectral power from the interictal to preictal and ictal periods (p<0.01). The alpha power increase was more pronounced in the two patients with reoccurrence of seizures on rewarming and was significant when comparing both interictal-to-preictal and interictal-to-ictal periods. This alpha activity increase could also be appreciated using visual analysis and distinguished neonates with and without seizure reoccurrence. This distinct alpha activity preceding ictal onset could represent a biomarker of propensity for seizure reoccurrence in neonates. Future studies should be performed to confirm whether quantitative periictal characteristics and electrographic features allow prediction of the risk of seizure reoccurrence in HIE neonates and other critically ill patients. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
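The kind of band-limited spectral power comparison described above (e.g. alpha power across interictal, preictal and ictal periods) can be sketched with a plain DFT; the synthetic one-second epoch, sampling rate, and band edges are illustrative assumptions, not the study's EEG pipeline:

```python
import math

def band_power(signal, fs, lo, hi):
    """Total DFT power of a real signal within [lo, hi] Hz."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2 + 1):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(x * math.cos(2 * math.pi * k * t / n)
                     for t, x in enumerate(signal))
            im = sum(x * math.sin(2 * math.pi * k * t / n)
                     for t, x in enumerate(signal))
            total += (re * re + im * im) / (n * n)
    return total

# Synthetic 1-s "EEG" epoch at 64 Hz: a 10 Hz (alpha) tone plus a
# weaker 3 Hz (delta) tone.
fs = 64
epoch = [math.sin(2 * math.pi * 10 * t / fs)
         + 0.5 * math.sin(2 * math.pi * 3 * t / fs)
         for t in range(fs)]
alpha = band_power(epoch, fs, 8.0, 13.0)
delta = band_power(epoch, fs, 1.0, 4.0)  # alpha dominates for this epoch
```

Tracking such band powers across interictal, preictal, and ictal windows is the quantitative counterpart of the alpha-activity rise the authors also saw on visual review.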

  10. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
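The pseudometric construction can be illustrated with one simple "output function": reduce each map to a normalized density histogram and take the L1 distance between histograms. Distinct maps can then sit at distance zero, which is exactly what makes this a pseudometric rather than a metric. The histogram signature and toy column densities below are illustrative, not the authors' actual set of output functions:

```python
def density_signature(column_density, bins=8, lo=0.0, hi=1.0):
    """Normalized histogram of column-density values: a toy stand-in for
    the 'output function' measuring the distribution of density."""
    counts = [0] * bins
    for v in column_density:
        k = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        counts[k] += 1
    n = len(column_density)
    return [c / n for c in counts]

def pseudometric(map_a, map_b):
    """L1 distance between density signatures. Distinct maps with
    identical signatures are at distance 0 (pseudometric, not metric)."""
    sa, sb = density_signature(map_a), density_signature(map_b)
    return sum(abs(a - b) for a, b in zip(sa, sb))

# Two toy 'maps' (flattened column densities): smooth vs. clumpy.
smooth = [0.1, 0.12, 0.11, 0.13, 0.1, 0.12, 0.11, 0.1]
clumpy = [0.05, 0.05, 0.05, 0.9, 0.85, 0.05, 0.05, 0.05]
d = pseudometric(smooth, clumpy)  # > 0: the clouds differ in this metric
```

Ordering maps by their pairwise distances to a reference map is one way such a scheme can rank clouds by "complexity", in the spirit of the classification described above.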

  11. Development of a method for urine bikunin/urinary trypsin inhibitor (UTI) quantitation and structural characterization: Application to type 1 and type 2 diabetes.

    PubMed

    Lepedda, Antonio Junior; Nieddu, Gabriele; Rocchiccioli, Silvia; Fresu, Pietro; De Muro, Pierina; Formato, Marilena

    2013-12-01

    Bikunin is a plasma proteinase inhibitor often associated with inflammatory conditions. It has a half-life of a few minutes and is rapidly excreted into urine as urinary trypsin inhibitor (UTI). UTI levels are usually low in healthy individuals but can increase up to tenfold in both acute and chronic inflammatory diseases. This article describes a sensitive method for both direct UTI quantitation and structural characterization. UTI purification was performed by anion exchange micro-chromatography followed by SDS-PAGE. A calibration curve for protein quantitation was set up using a purified UTI fraction. UTI identification and structural characterization were performed by nano-LC-MS/MS analysis. The method was applied to urine samples from 9 patients with type 1 diabetes, 11 patients with type 2 diabetes, and 28 healthy controls, matched for age and sex with patients, revealing higher UTI levels in both patient groups with respect to controls (p < 0.001 and p = 0.001, respectively). Spearman's correlation tests highlighted no association between UTI levels and age in each group tested. Owing to its high sensitivity and specificity, the described method allows UTI quantitation from very low quantities of specimen. Furthermore, as UTI concentration is normalized to creatinine level, the analysis can also be performed on randomly collected urine samples. Finally, MS/MS analysis opens the possibility of characterizing PTM sites potentially able to affect UTI localization, function, and pathophysiological activity. Preliminary results suggest that UTI levels could represent a useful marker of chronic inflammatory conditions in type 1 and 2 diabetes. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Multifactorial Optimization of Contrast-Enhanced Nanofocus Computed Tomography for Quantitative Analysis of Neo-Tissue Formation in Tissue Engineering Constructs.

    PubMed

    Sonnaert, Maarten; Kerckhofs, Greet; Papantoniou, Ioannis; Van Vlierberghe, Sandra; Boterberg, Veerle; Dubruel, Peter; Luyten, Frank P; Schrooten, Jan; Geris, Liesbet

    2015-01-01

    To progress the fields of tissue engineering (TE) and regenerative medicine, the development of quantitative methods for non-invasive three-dimensional characterization of engineered constructs (i.e. cells/tissue combined with scaffolds) becomes essential. In this study, we have defined the optimal staining conditions for contrast-enhanced nanofocus computed tomography for three-dimensional visualization and quantitative analysis of in vitro engineered neo-tissue (i.e. extracellular matrix containing cells) in perfusion bioreactor-developed Ti6Al4V constructs. A fractional factorial 'design of experiments' approach was used to elucidate the influence of the staining time and concentration of two contrast agents (Hexabrix and phosphotungstic acid) and the neo-tissue volume on the image contrast and dataset quality. Additionally, the neo-tissue shrinkage that was induced by phosphotungstic acid staining was quantified to determine the operating window within which this contrast agent can be accurately applied. For Hexabrix the staining concentration was the main parameter influencing image contrast and dataset quality. Using phosphotungstic acid the staining concentration had a significant influence on the image contrast, while both staining concentration and neo-tissue volume had an influence on the dataset quality. The use of high concentrations of phosphotungstic acid did, however, introduce significant shrinkage of the neo-tissue, indicating that, despite sub-optimal image contrast, low concentrations of this staining agent should be used to enable quantitative analysis. To conclude, design of experiments allowed us to define the optimal staining conditions for contrast-enhanced nanofocus computed tomography to be used as a routine screening tool for neo-tissue formation in Ti6Al4V constructs, transforming it into a robust three-dimensional quality control methodology.

  13. Quantitative trait loci and metabolic pathways

    PubMed Central

    McMullen, M. D.; Byrne, P. F.; Snook, M. E.; Wiseman, B. R.; Lee, E. A.; Widstrom, N. W.; Coe, E. H.

    1998-01-01

    The interpretation of quantitative trait locus (QTL) studies is limited by the lack of information on metabolic pathways leading to most economic traits. Inferences about the roles of the underlying genes within a pathway or the nature of their interaction with other loci are generally not possible. An exception is resistance to the corn earworm Helicoverpa zea (Boddie) in maize (Zea mays L.) because of maysin, a C-glycosyl flavone synthesized in silks via a branch of the well-characterized flavonoid pathway. Our results using flavone synthesis as a model QTL system indicate: (i) the importance of regulatory loci as QTLs, (ii) the importance of interconnecting biochemical pathways on product levels, (iii) evidence for “channeling” of intermediates, allowing independent synthesis of related compounds, (iv) the utility of QTL analysis in clarifying the role of specific genes in a biochemical pathway, and (v) identification of a previously unknown locus on chromosome 9S affecting flavone level. A greater understanding of the genetic basis of maysin synthesis and associated corn earworm resistance should lead to improved breeding strategies. More broadly, the insights gained in relating a defined genetic and biochemical pathway affecting a quantitative trait should enhance interpretation of the biological basis of variation for other quantitative traits. PMID:9482823

  14. Quantitative IR microscopy and spectromics open the way to 3D digital pathology.

    PubMed

    Bobroff, Vladimir; Chen, Hsiang-Hsin; Delugin, Maylis; Javerzat, Sophie; Petibois, Cyril

    2017-04-01

    Currently, only mass spectrometry (MS) microscopy provides a quantitative analysis of the chemical content of tissue samples in 3D. Here, the reconstruction of a 3D quantitative chemical image of a biological tissue by FTIR spectro-microscopy is reported. An automated curve-fitting method is developed to extract all intense absorption bands constituting IR spectra. This innovation benefits from three critical features: (1) the correction of raw IR spectra to make them quantitatively comparable; (2) the automated and iterative data treatment that converts the IR-absorption spectrum into an IR-band spectrum; (3) the reconstruction of a 3D IR-band matrix (x, y, z for voxel position and a 4th dimension with all IR-band parameters). Spectromics, a new method for exploiting spectral data for tissue metadata reconstruction, is proposed to further translate the related chemical information in 3D into biochemical and anatomical tissue parameters. An example is given with oxidative stress distribution and the reconstruction of blood vessels in tissues. The requirements of IR microscopy instrumentation to make 3D digital histology a clinical routine technology are briefly discussed. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
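
    As an illustration of extracting band parameters from an absorption spectrum, here is a minimal numpy sketch that estimates a single band's amplitude, center, and width from its moments. This is a crude stand-in for the automated iterative curve fitting described above, and the 1650 cm⁻¹ band below is purely synthetic:

```python
import numpy as np

def gaussian(x, amp, center, width):
    """Gaussian absorption band model."""
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

def band_parameters(x, y):
    """Estimate (amplitude, center, width) of a single isolated band
    from its zeroth, first, and second moments (uniform grid assumed)."""
    dx = x[1] - x[0]
    area = y.sum() * dx
    center = (x * y).sum() * dx / area
    variance = ((x - center) ** 2 * y).sum() * dx / area
    width = np.sqrt(variance)
    amp = area / (width * np.sqrt(2 * np.pi))
    return amp, center, width

# Synthetic amide-I-like band on a wavenumber axis.
x = np.linspace(1500.0, 1800.0, 3001)
amp, center, width = band_parameters(x, gaussian(x, 1.0, 1650.0, 10.0))
```

    For overlapping bands a real pipeline would fit all bands jointly; the moment estimate only works when a band is isolated.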

  15. Using mixed-methods research to study the quality of life of coeliac women.

    PubMed

    Rodríguez Almagro, Julián; Hernández Martínez, Antonio; Solano Ruiz, María Carmen; Siles González, José

    2017-04-01

    To research the quality of life of Spanish women with coeliac disease. Women with coeliac disease express lower quality of life than men with coeliac disease. Explanatory sequential approach using mixed methods and with a gender perspective. The research was carried out between May and July 2015. In its quantitative stage, it aimed to determine the health-related quality of life in a representative sample (n = 1097) of Spanish adult women with coeliac disease using a specific questionnaire named Coeliac Disease-Quality of Life. In its qualitative phase, it aimed to describe the life experiences of women with coeliac disease by means of interviews (n = 19) with a semistructured script. Quantitative data were analysed using SPSS version 20 and presented as descriptive statistics. Qualitative data were analysed using directed content analysis. The quantitative process gave us values for the four aspects studied: dysphoria, disease limitations, health problems and inadequate treatment. These aspects allowed us to design the qualitative process, on which we based an interview, from which four larger categories emerged. These categories were feelings at diagnosis, limitations in day-to-day life, social perceptions of the disease and personal meanings of coeliac disease. Thus, both phases of our project are fully connected. There was a high level of congruence between quantitative scores and narratives. This study shows the strong points of a mixed-methods strategy in health sciences. The mixed-methods strategy gave us a wider view of the experience of women living with coeliac disease. In our case, performing the quality of life study in women with coeliac disease using a mixed methodology was a strength rather than a limitation, approaching the experience of being a woman with coeliac disease in Spain in two different but complementary ways. The quantitative and qualitative data allowed us to interpret the experiences of our participants. © 2016 John Wiley & Sons Ltd.

  16. Quantitative assessment of intermolecular interactions by atomic force microscopy imaging using copper oxide tips

    NASA Astrophysics Data System (ADS)

    Mönig, Harry; Amirjalayer, Saeed; Timmer, Alexander; Hu, Zhixin; Liu, Lacheng; Díaz Arado, Oscar; Cnudde, Marvin; Strassert, Cristian Alejandro; Ji, Wei; Rohlfing, Michael; Fuchs, Harald

    2018-05-01

    Atomic force microscopy is an impressive tool with which to directly resolve the bonding structure of organic compounds1-5. The methodology usually involves chemical passivation of the probe-tip termination by attaching single molecules or atoms such as CO or Xe (refs 1,6-9). However, these probe particles are only weakly connected to the metallic apex, which results in considerable dynamic deflection. This probe particle deflection leads to pronounced image distortions, systematic overestimation of bond lengths, and in some cases even spurious bond-like contrast features, thus inhibiting reliable data interpretation8-12. Recently, an alternative approach to tip passivation has been used in which slightly indenting a tip into oxidized copper substrates and subsequent contrast analysis allows for the verification of an oxygen-terminated Cu tip13-15. Here we show that, due to the covalently bound configuration of the terminal oxygen atom, this copper oxide tip (CuOx tip) has a high structural stability, allowing not only a quantitative determination of individual bond lengths and access to bond order effects, but also reliable intermolecular bond characterization. In particular, by removing the previous limitations of flexible probe particles, we are able to provide conclusive experimental evidence for an unusual intermolecular N-Au-N three-centre bond. Furthermore, we demonstrate that CuOx tips allow the characterization of the strength and configuration of individual hydrogen bonds within a molecular assembly.

  17. Large-scale label-free quantitative proteomics of the pea aphid-Buchnera symbiosis.

    PubMed

    Poliakov, Anton; Russell, Calum W; Ponnala, Lalit; Hoops, Harold J; Sun, Qi; Douglas, Angela E; van Wijk, Klaas J

    2011-06-01

    Many insects are nutritionally dependent on symbiotic microorganisms that have tiny genomes and are housed in specialized host cells called bacteriocytes. The obligate symbiosis between the pea aphid Acyrthosiphon pisum and the γ-proteobacterium Buchnera aphidicola (only 584 predicted proteins) is particularly amenable to molecular analysis because the genomes of both partners have been sequenced. To better define the symbiotic relationship between this aphid and Buchnera, we used large-scale, high-accuracy tandem mass spectrometry (nanoLC-LTQ-Orbitrap) to identify aphid and Buchnera proteins in the whole aphid body, purified bacteriocytes, isolated Buchnera cells and the residual bacteriocyte fraction. More than 1900 aphid and 400 Buchnera proteins were identified. All enzymes in amino acid metabolism annotated in the Buchnera genome were detected, reflecting the high (68%) coverage of the proteome and supporting the core function of Buchnera in the aphid symbiosis. Transporters mediating the transport of predicted metabolites were present in the bacteriocyte. Label-free spectral counting combined with hierarchical clustering allowed us to define the quantitative distribution of a subset of these proteins across both symbiotic partners, yielding no evidence for selective transfer of protein between the partners in either direction. This is the first quantitative proteome analysis of bacteriocyte symbiosis, providing a wealth of information about the molecular function of both the host cell and the bacterial symbiont.
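
    Label-free spectral counting is typically normalized for protein length so abundances are comparable across proteins; one common scheme is the normalized spectral abundance factor (NSAF). The paper's exact quantification scheme may differ, so treat this as a generic sketch with made-up counts:

```python
def nsaf(spectral_counts, lengths):
    """Normalized Spectral Abundance Factor per protein:
    (SpC / L) divided by the sum of SpC / L over all proteins,
    so values sum to 1 and longer proteins are not over-weighted."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Hypothetical counts for three proteins of different lengths:
# equal raw counts for the first two, but the shorter protein
# is relatively more abundant after length normalization.
values = nsaf([50, 50, 10], [500, 250, 100])
```

    NSAF values from different fractions (whole body, bacteriocyte, isolated Buchnera) can then be compared directly or fed into hierarchical clustering.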

  18. Simultaneous determination of eight major steroids from Polyporus umbellatus by high-performance liquid chromatography coupled with mass spectrometry detections.

    PubMed

    Zhao, Ying-yong; Cheng, Xian-long; Zhang, Yongmin; Zhao, Ye; Lin, Rui-chao; Sun, Wen-ji

    2010-02-01

    Polyporus umbellatus is a widely used diuretic herbal medicine. In this study, a high-performance liquid chromatography coupled with atmospheric pressure chemical ionization-mass spectrometric detection (HPLC-APCI-MS) method was developed for qualitative and quantitative analysis of steroids, as well as for the quality control of Polyporus umbellatus. The selectivity, reproducibility and sensitivity were compared with HPLC with photodiode array detection and evaporative light scattering detection (ELSD). Selective ion monitoring in positive mode was used for qualitative and quantitative analysis of eight major components and beta-ecdysterone was used as the internal standard. Limits of detection and quantification fell in the ranges 7-21 and 18-63 ng/mL for the eight analytes with an injection of 10 microL samples, and all calibration curves showed good linear regression (r(2) > 0.9919) within the test range. The quantitative results demonstrated that samples from different localities showed different qualities. Advantages, in comparison with conventional HPLC-diode array detection and HPLC-ELSD, are that reliable identification of target compounds could be achieved by accurate mass measurements along with characteristic retention time, and the great enhancement in selectivity and sensitivity allows identification and quantification of low levels of constituents in complex Polyporus umbellatus matrices. (c) 2009 John Wiley & Sons, Ltd.
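
    The calibration-curve step (least-squares regression of detector response on concentration, with r² as the linearity check, then back-calculation of unknowns) can be sketched as follows. The numbers are illustrative, not the paper's data:

```python
def linear_calibration(conc, resp):
    """Least-squares fit resp = slope * conc + intercept.
    Returns (slope, intercept, r_squared)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(resp) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    sxx = sum((x - mx) ** 2 for x in conc)
    syy = sum((y - my) ** 2 for y in resp)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy * sxy / (sxx * syy)
    return slope, intercept, r2

def quantify(response, slope, intercept):
    """Back-calculate the concentration of an unknown from its response."""
    return (response - intercept) / slope

# Illustrative standards: concentrations vs. peak areas.
slope, intercept, r2 = linear_calibration([1.0, 2.0, 3.0, 4.0],
                                          [3.0, 5.0, 7.0, 9.0])
unknown = quantify(6.0, slope, intercept)
```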

  19. Contrast-enhanced magnetic resonance imaging of pulmonary lesions: description of a technique aiming clinical practice.

    PubMed

    Koenigkam-Santos, Marcel; Optazaite, Elzbieta; Sommer, Gregor; Safi, Seyer; Heussel, Claus Peter; Kauczor, Hans-Ulrich; Puderbach, Michael

    2015-01-01

    To propose a technique for the evaluation of pulmonary lesions using contrast-enhanced MRI; to assess morphological patterns of enhancement and correlate quantitative analysis with histopathology. Thirty-six patients were prospectively studied. Volumetric-interpolated T1W images were obtained during consecutive breath holds after bolus-triggered contrast injection. Volume coverage of the first three acquisitions was limited (higher temporal resolution), and the last acquisition was obtained at the 4th minute. Two radiologists individually evaluated the patterns of enhancement. Region-of-interest-based signal intensity (SI)-time curves were created to assess quantitative parameters. Readers agreed moderately to substantially concerning lesions' enhancement pattern. SI-time curves could be created for all lesions. In comparison to benign lesions, malignant lesions showed higher values of maximum enhancement, early peak, slope and 4th-minute enhancement. Early peak >15% showed 100% sensitivity to detect malignancy; maximum enhancement >40% showed 100% specificity. The proposed technique is robust, simple to perform and can be applied in a clinical scenario. It allows visual evaluation of enhancement pattern/progression together with creation of SI-time curves and assessment of derived quantitative parameters. Perfusion analysis was highly sensitive to detect malignancy, in accordance with what is recommended by most recent guidelines on imaging evaluation of pulmonary lesions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
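
    The quantitative parameters named above can be derived from a region-of-interest SI-time curve roughly as follows. The exact definitions used in the study (e.g. the early-peak window) are not stated here, so these formulas are assumptions for illustration only:

```python
def enhancement_curve(si, baseline=None):
    """Relative enhancement (%) of a signal-intensity time series,
    taking the first sample as baseline unless one is supplied."""
    base = si[0] if baseline is None else baseline
    return [100.0 * (s - base) / base for s in si]

def perfusion_metrics(times, si):
    """Maximum enhancement, steepest upslope, and final enhancement
    from paired acquisition times (s) and signal intensities."""
    enh = enhancement_curve(si)
    slopes = [(enh[i + 1] - enh[i]) / (times[i + 1] - times[i])
              for i in range(len(si) - 1)]
    return {"max_enhancement": max(enh),
            "max_slope": max(slopes),
            "final_enhancement": enh[-1]}

# Hypothetical lesion ROI: three early acquisitions plus the 4th-minute one.
metrics = perfusion_metrics([0.0, 30.0, 60.0, 240.0],
                            [100.0, 150.0, 160.0, 140.0])
```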

  20. Quantitative disentanglement of coherent and incoherent laser-induced surface deformations by time-resolved x-ray reflectivity

    NASA Astrophysics Data System (ADS)

    Sander, M.; Pudell, J.-E.; Herzog, M.; Bargheer, M.; Bauer, R.; Besse, V.; Temnov, V.; Gaal, P.

    2017-12-01

    We present time-resolved x-ray reflectivity measurements on laser excited coherent and incoherent surface deformations of thin metallic films. Based on a kinematical diffraction model, we derive the surface amplitude from the diffracted x-ray intensity and resolve transient surface excursions with sub-Å spatial precision and 70 ps temporal resolution. The analysis allows for decomposition of the surface amplitude into multiple coherent acoustic modes and a substantial contribution from incoherent phonons which constitute the sample heating.

  1. Inertial impaction air sampling device

    DOEpatents

    Dewhurst, Katharine H.

    1990-01-01

    An inertial impactor to be used in an air sampling device for the collection of respirable-size particles in ambient air. The device may include a graphite furnace as the impaction substrate in a small, portable, direct-analysis structure that gives immediate results and is totally self-contained, allowing for remote and/or personal sampling. The graphite furnace collects suspended particles transported through the housing by the air flow system, and these particles may be analyzed for elements, quantitatively and qualitatively, by atomic absorption spectrophotometry.

  2. Development of a flash, bang, and smoke simulation of a shell burst

    NASA Technical Reports Server (NTRS)

    Williamson, F. R.; Kinney, J. F.; Wallace, T. V.

    1982-01-01

    A large number of experiments (cue test firings) were performed to define the cue concepts and packaging configurations. A total of 344 of these experiments were recorded with instrumentation photography to allow a quantitative analysis of the smoke cloud as a function of time. These analyses were predominantly made using a short test site. Supplementary long-range visibility tests were conducted to ensure the required 3 kilometer visibility of the smoke signature.

  3. Surface colour photometry of galaxies with Schmidt telescopes.

    NASA Technical Reports Server (NTRS)

    Wray, J. D.

    1972-01-01

    A method is described which owes its practicality to the capability of Schmidt telescopes to record a number of galaxy images on a single plate and to the existence of high speed computer controlled area-scanning precision microdensitometers such as the Photometric Data Systems model 1010. The method of analysis results in quantitative color-index information which is displayed in a manner that allows any user to effectively study the morphological properties of the distribution of color-index in galaxies.

  4. Inertial impaction air sampling device

    DOEpatents

    Dewhurst, K.H.

    1987-12-10

    An inertial impactor to be used in an air sampling device for the collection of respirable-size particles in ambient air. The device may include a graphite furnace as the impaction substrate in a small, portable, direct-analysis structure that gives immediate results and is totally self-contained, allowing for remote and/or personal sampling. The graphite furnace collects suspended particles transported through the housing by the air flow system, and these particles may be analyzed for elements, quantitatively and qualitatively, by atomic absorption spectrophotometry. 3 figs.

  5. A Quantitative Study Identifying Political Strategies Used by Principals of Dual Language Programs

    ERIC Educational Resources Information Center

    Girard, Guadalupe

    2017-01-01

    Purpose. The purpose of this quantitative study was to identify the external and internal political strategies used by principals that allow them to successfully navigate the political environment surrounding dual language programs. Methodology. This quantitative study used descriptive research to collect, analyze, and report data that identified…

  6. Fast Quantitative Analysis Of Museum Objects Using Laser-Induced Breakdown Spectroscopy And Multiple Regression Algorithms

    NASA Astrophysics Data System (ADS)

    Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.

    2009-09-01

    The recent development of mobile instrumentation, specifically devoted to in situ analysis and study of museum objects, allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches that guarantee prompt analysis of the results obtained. In this communication, we present and discuss the advantages of statistical analytical methods, such as Partial Least Squares (PLS) multiple regression algorithms, over the classical calibration curve approach. PLS algorithms provide the information on the composition of the objects under study in real time; this feature of the method, compared to traditional off-line analysis of the data, is extremely useful for optimizing the measurement times and the number of points associated with the analysis. In fact, the real-time availability of compositional information makes it possible to concentrate attention on the most 'interesting' parts of the object, without over-sampling zones that would not provide useful information for scholars or conservators. Some examples of applications of this method will be presented, including studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
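
    A minimal NIPALS-style PLS1 regression, of the kind contrasted above with univariate calibration curves, can be written in numpy as below. This is a generic sketch on synthetic data (not LIBS spectra); with a full set of components and an exactly linear relation it reduces to ordinary least squares:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """NIPALS PLS1. Returns (coef, x_mean, y_mean) such that
    predictions are (X_new - x_mean) @ coef + y_mean."""
    xm, ym = X.mean(axis=0), y.mean()
    Xk, yk = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                 # weight vector
        w = w / np.linalg.norm(w)
        t = Xk @ w                    # scores
        tt = t @ t
        p = Xk.T @ t / tt             # X loadings
        q = (yk @ t) / tt             # y loading
        Xk = Xk - np.outer(t, p)      # deflate
        yk = yk - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    coef = W @ np.linalg.solve(P.T @ W, Q)
    return coef, xm, ym

def pls1_predict(X_new, coef, xm, ym):
    return (X_new - xm) @ coef + ym

# Synthetic "spectra" (30 samples, 5 channels) with a known linear response.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0])
coef, xm, ym = pls1_fit(X, y, n_components=5)
pred = pls1_predict(X, coef, xm, ym)
```

    In practice far fewer components than channels are used, which is where PLS gains robustness over per-line calibration curves.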

  7. Quantitative Functional Imaging Using Dynamic Positron Computed Tomography and Rapid Parameter Estimation Techniques

    NASA Astrophysics Data System (ADS)

    Koeppe, Robert Allen

    Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three-dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass-specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. (18)F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and the tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements. The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations were compared to those predicted from the expired air and venous blood samples. The glucose analog (18)F-3-deoxy-3-fluoro-D-glucose (3-FDG) was used for quantitating the membrane transport rate of glucose. The measured data indicated that the phosphorylation rate of 3-FDG was low enough to allow accurate estimation of the transport rate using a two-compartment model.
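
    The two-compartment (one-tissue) model described above has the form dCt/dt = K1*Ca(t) - k2*Ct(t), where K1 relates to flow and K1/k2 gives the tissue:blood partition coefficient for a freely diffusible tracer. A minimal forward-Euler simulation, with illustrative parameter values not taken from the thesis:

```python
def simulate_one_tissue(times, ca, K1, k2):
    """Forward-Euler integration of dCt/dt = K1*Ca(t) - k2*Ct(t),
    starting from zero tissue concentration."""
    ct = [0.0]
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        ct.append(ct[-1] + dt * (K1 * ca[i - 1] - k2 * ct[-1]))
    return ct

# With a constant arterial input of 1.0, tissue activity approaches the
# equilibrium value K1/k2 (the partition coefficient times the input level).
times = [0.1 * i for i in range(2001)]   # 0 to 200 in steps of 0.1
ca = [1.0] * len(times)
ct = simulate_one_tissue(times, ca, K1=0.5, k2=0.1)
```

    Parameter estimation then amounts to choosing K1 and k2 so that the simulated curve matches the measured pixel time-activity curve.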

  8. Genevar: a database and Java application for the analysis and visualization of SNP-gene associations in eQTL studies.

    PubMed

    Yang, Tsun-Po; Beazley, Claude; Montgomery, Stephen B; Dimas, Antigone S; Gutierrez-Arcelus, Maria; Stranger, Barbara E; Deloukas, Panos; Dermitzakis, Emmanouil T

    2010-10-01

    Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets, and provides analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. http://www.sanger.ac.uk/resources/software/genevar.

  9. High Throughput Protein Quantitation using MRM Viewer Software and Dynamic MRM on a Triple Quadruple Mass Spectrometer

    PubMed Central

    Miller, C.; Waddell, K.; Tang, N.

    2010-01-01

    RP-122 Peptide quantitation using Multiple Reaction Monitoring (MRM) has been established as an important methodology for biomarker verification and validation. This requires high throughput combined with high sensitivity to analyze potentially thousands of target peptides in each sample. Dynamic MRM allows the system to acquire only the required MRMs of each peptide during a retention window corresponding to when that peptide is eluting. This reduces the number of concurrent MRMs and therefore improves quantitation and sensitivity. MRM Selector allows the user to generate an MRM transition list with retention time information from discovery data obtained on a QTOF MS system. This list can be directly imported into the triple quadrupole acquisition software. However, situations can exist where (a) the list contains more MRM transitions than allowable under the ideal acquisition conditions chosen (allowing for cycle time and chromatography conditions), or (b) too many transitions fall in a certain retention time region, which would result in an unacceptably low dwell time and cycle time. A new tool, MRM Viewer, has been developed to help users automatically generate multiple dynamic MRM methods from a single MRM list. In this study, a list of 3293 MRM transitions from a human plasma sample was compiled. A single dynamic MRM method with 3293 transitions results in a minimum dwell time of 2.18 ms. Using MRM Viewer, we can generate three dynamic MRM methods with a minimum dwell time of 20 ms, which can give better quality MRM quantitation. This tool facilitates both high throughput and high sensitivity for MRM quantitation.
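
    The dwell-time arithmetic behind dynamic MRM can be sketched as follows: for a fixed cycle time, the minimum dwell time is set by the largest number of transitions whose retention windows overlap, so splitting one method into several raises the achievable dwell. A small sweep-line illustration with hypothetical windows, not the plasma transition list:

```python
def max_concurrent(windows):
    """Peak number of simultaneously open retention windows,
    given as (start, end) pairs; computed by a sweep line."""
    events = sorted([(s, +1) for s, e in windows] +
                    [(e, -1) for s, e in windows],
                    key=lambda ev: (ev[0], ev[1]))  # close before open on ties
    peak = count = 0
    for _, step in events:
        count += step
        peak = max(peak, count)
    return peak

def min_dwell_ms(cycle_time_ms, windows):
    """Minimum per-transition dwell time for one dynamic MRM method."""
    return cycle_time_ms / max_concurrent(windows)

# Hypothetical retention windows (minutes): three overlap around t = 4-6.
windows = [(0, 10), (2, 8), (4, 6), (12, 20)]
peak = max_concurrent(windows)
dwell = min_dwell_ms(600.0, windows)
```

    Partitioning the list so that fewer windows overlap within each method is exactly what lifts the minimum dwell from the 2.18 ms of a single method toward the 20 ms quoted above.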

  10. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography.

    PubMed

    Venhuizen, Freerk G; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I

    2018-04-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance, reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936 for the tasks of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies.
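
    The Dice coefficient used to score the segmentation is twice the overlap of two binary masks divided by the sum of their sizes. A minimal numpy sketch:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice overlap of two binary masks: 1.0 for identical masks
    (including the empty/empty case), 0.0 for disjoint ones."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / denom
```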

  11. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography

    PubMed Central

    Venhuizen, Freerk G.; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2018-01-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance, reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936 for the tasks of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies. PMID:29675301

  12. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

    Early detection and treatment of lung cancer is one of the most effective means of reducing cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system allow respiratory kinetics to be obtained with a flat panel detector (FPD), an extension of chest X-ray radiography. With these changes, functional evaluation of respiratory kinetics in the chest has become available, and its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for detecting lung nodules and evaluating their quantitative kinetics. Breathing chest radiographs obtained by a modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after a breath synchronization process utilizing diaphragmatic analysis of the vector movement. An artificial neural network used to analyze the density patterns detected the true nodules by analyzing these static images and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, both showed sufficient detection capability and kinetic imaging function without statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.
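
    At its core, sequential temporal subtraction is pixel-wise differencing of consecutive frames, which suppresses static anatomy and highlights moving structures; thresholding the difference then yields candidate regions. A toy numpy sketch (the actual algorithm adds breath synchronization, morphologic enhancement, and neural-network classification):

```python
import numpy as np

def temporal_subtraction(frames):
    """Pixel-wise differences of consecutive frames in a sequence."""
    return [frames[i + 1] - frames[i] for i in range(len(frames) - 1)]

def moving_pixels(diff, threshold):
    """Binary mask of pixels whose intensity changed by more than threshold."""
    return np.abs(diff) > threshold

# Two tiny 4x4 'radiographs': static background plus a bright spot
# that moves one pixel between frames.
f1 = np.zeros((4, 4)); f1[1, 1] = 10.0
f2 = np.zeros((4, 4)); f2[1, 2] = 10.0
diff = temporal_subtraction([f1, f2])[0]
mask = moving_pixels(diff, threshold=5.0)
```

    The spot's old and new positions both light up in the mask, which is how a moving nodule's kinetic track can be traced frame to frame.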

  13. Global Relative Quantification with Liquid Chromatography–Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF)—Cross–validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences*

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses besides the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI can process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530

  14. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses besides the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI can process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  15. MONALISA for stochastic simulations of Petri net models of biochemical systems.

    PubMed

    Balazki, Pavel; Lindauer, Klaus; Einloft, Jens; Ackermann, Jörg; Koch, Ina

    2015-07-10

    The concept of Petri nets (PN) is widely used in systems biology and allows the modeling of complex biochemical systems such as metabolic systems, signal transduction pathways, and gene expression networks. In particular, PNs allow topological analysis based on structural properties, which is important and useful when quantitative (kinetic) data are incomplete or unknown. When the kinetic parameters are known, simulating the time evolution of such models can help to study the dynamic behavior of the underlying system. If the number of involved entities (molecules) is low, a stochastic simulation should be preferred over the classical deterministic approach of solving ordinary differential equations. The Stochastic Simulation Algorithm (SSA) is a common method for such simulations. The combination of qualitative and semi-quantitative PN modeling with stochastic analysis techniques provides a valuable approach in the field of systems biology. Here, we describe the implementation of stochastic analysis in a PN environment. We extended MONALISA, an open-source software tool for the creation, visualization, and analysis of PNs, with several stochastic simulation methods. The simulation module offers four simulation modes, among them a stochastic mode with constant firing rates and Gillespie's algorithm in both exact and approximate versions. The simulator is operated through a user-friendly graphical interface and accepts input data such as concentrations and reaction rate constants, which are common parameters in the biological context. The key features of the simulation module are visualization of simulations, interactive plotting, export of results to a text file, mathematical expressions for describing simulation parameters, and up to 500 parallel simulations of the same parameter set. To illustrate the method, we discuss a model of insulin receptor recycling as a case study.
    We present software that combines the modeling power of Petri nets with stochastic simulation of dynamic processes in a user-friendly environment supported by an intuitive graphical interface. The program offers a valuable alternative to modeling with ordinary differential equations, especially when simulating single-cell experiments with low molecule counts. The ability to use mathematical expressions provides additional flexibility in describing the simulation parameters. The open-source distribution allows further extensions by third-party developers. The software is cross-platform and is licensed under the Artistic License 2.0.
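    The exact SSA variant mentioned above, Gillespie's direct method, can be sketched generically. This is an illustrative Python version with a hypothetical reaction encoding (a propensity function plus a state-change map per reaction), not MONALISA's own implementation:

```python
import random

def gillespie_ssa(counts, reactions, rate_constants, t_max, seed=1):
    """Exact stochastic simulation (Gillespie's direct method) of a
    well-mixed reaction system.  Each reaction is a pair
    (propensity_fn, state_change_dict)."""
    rng = random.Random(seed)
    t = 0.0
    trajectory = [(t, counts.copy())]
    while t < t_max:
        # propensity of each reaction in the current state
        props = [k * f(counts) for (f, _), k in zip(reactions, rate_constants)]
        a0 = sum(props)
        if a0 == 0.0:                      # nothing can fire any more
            break
        t += rng.expovariate(a0)           # waiting time ~ Exp(a0)
        r = rng.uniform(0.0, a0)           # pick the firing reaction,
        for (_, change), a in zip(reactions, props):
            r -= a                         # weighted by its propensity
            if r <= 0.0:
                for species, delta in change.items():
                    counts[species] += delta
                break
        trajectory.append((t, counts.copy()))
    return trajectory

# Example: irreversible conversion A -> B with mass-action propensity k*A.
traj = gillespie_ssa({"A": 50, "B": 0},
                     [(lambda c: c["A"], {"A": -1, "B": +1})],
                     rate_constants=[1.0], t_max=1e6, seed=7)
```

    With a fixed seed the run is reproducible; the approximate SSA variants mentioned in the abstract trade this exactness for speed on large models.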

  16. Label-free tissue scanner for colorectal cancer screening

    NASA Astrophysics Data System (ADS)

    Kandel, Mikhail E.; Sridharan, Shamira; Liang, Jon; Luo, Zelun; Han, Kevin; Macias, Virgilia; Shah, Anish; Patel, Roshan; Tangella, Krishnarao; Kajdacsy-Balla, Andre; Guzman, Grace; Popescu, Gabriel

    2017-06-01

    The current practice of surgical pathology relies on external contrast agents to reveal tissue architecture, which is then qualitatively examined by a trained pathologist. Diagnosis is based on comparison with standardized but empirical, qualitative assessments of limited objectivity. We propose an approach to pathology based on interferometric imaging of unstained biopsies, which provides unique capabilities for quantitative diagnosis and automation. We developed a label-free tissue scanner based on quantitative phase imaging, which maps the optical path length at each point in the field of view and thus yields images that are sensitive to the nanoscale tissue architecture. Unlike the analysis of stained tissue, which is qualitative in nature and affected by color balance, staining strength, and imaging conditions, optical path length measurements are intrinsically quantitative, i.e., images can be compared across different instruments and clinical sites. These critical features allow us to automate the diagnostic process. We paired our interferometric optical system with highly parallelized, dedicated software algorithms for data acquisition, allowing us to image at a throughput comparable to that of commercial tissue scanners while maintaining nanoscale sensitivity to morphology. Based on the measured phase information, we implemented software tools for autofocusing during imaging, as well as image archiving and data access. To illustrate the potential of our technology for large-volume pathology screening, we established an intrinsic marker for colorectal disease that detects tissue with dysplasia or colorectal cancer and flags specific areas for further examination, potentially improving the efficiency of existing pathology workflows.
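    The quantity underlying quantitative phase imaging is the optical path length, obtained from the measured phase as OPL = φλ/(2π). A minimal sketch of that conversion (the 550 nm wavelength is an illustrative default, not the instrument's actual source):

```python
import math

def phase_to_opl_nm(phase_map_rad, wavelength_nm=550.0):
    """Convert a quantitative-phase image (radians) into an optical
    path length map in nm: OPL = phi * wavelength / (2*pi)."""
    scale = wavelength_nm / (2.0 * math.pi)
    return [[phi * scale for phi in row] for row in phase_map_rad]
```

    Because OPL is expressed in physical length units, maps produced this way can be compared across instruments, which is the property the abstract exploits for automated diagnosis.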

  17. A quantitative PCR assay for the detection and quantification of Babesia bovis and B. bigemina.

    PubMed

    Buling, A; Criado-Fornelio, A; Asenzo, G; Benitez, D; Barba-Carretero, J C; Florin-Christensen, M

    2007-06-20

    The haemoparasites Babesia bovis and Babesia bigemina affect cattle over vast areas of the tropical and temperate parts of the world. Microscopic examination of blood smears allows the detection of clinical cases of babesiosis, but this procedure lacks sensitivity when parasitaemia levels are low. In addition, differentiating between similar haemoparasites can be very difficult. Molecular diagnostic procedures can, however, overcome these problems. This paper reports a quantitative PCR (qPCR) assay using SYBR Green. Based on the amplification of a small fragment of the cytochrome b gene, the method shows both high sensitivity and specificity, and allows quantification of parasite DNA. In tests, reproducible quantitative results were obtained over the range of 0.1 ng to 0.1 fg of parasite DNA. Melting curve analysis differentiated between B. bovis and B. bigemina. To assess the performance of the new qPCR procedure, it was used to screen 40 cows and 80 horses for babesiosis. B. bigemina was detected in five cows (three of which were also found positive by standard PCR techniques targeting the 18S rRNA gene). In addition, the proposed method detected B. bovis in one horse and B. bigemina in two horses, while none were found positive by standard ribosomal PCR. The sequences of the B. bigemina cytochrome b and 18S rRNA genes were completely conserved in isolates from Spain and Argentina, while those of B. bovis showed moderate polymorphism.
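    Quantification in SYBR Green qPCR assays of this kind typically relies on a log-linear standard curve relating the threshold cycle (Ct) to the starting template amount. A generic sketch of that calculation, not the authors' analysis pipeline:

```python
import math

def fit_standard_curve(log10_quantity, ct):
    """Ordinary least-squares fit of the qPCR standard curve
    Ct = slope * log10(quantity) + intercept.
    Returns (slope, intercept, amplification_efficiency)."""
    n = len(ct)
    mx = sum(log10_quantity) / n
    my = sum(ct) / n
    sxx = sum((x - mx) ** 2 for x in log10_quantity)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_quantity, ct))
    slope = sxy / sxx
    intercept = my - slope * mx
    # efficiency of 1.0 corresponds to perfect doubling per cycle
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Interpolate an unknown sample's starting quantity from its Ct."""
    return 10.0 ** ((ct - intercept) / slope)
```

    A slope near -3.32 (i.e., -1/log10(2)) indicates near-perfect amplification efficiency, the usual acceptance criterion for such dilution series.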

  18. Changes of the elemental distributions in marine diatoms as a reporter of sample preparation artefacts. A nuclear microscopy application

    NASA Astrophysics Data System (ADS)

    Godinho, R. M.; Cabrita, M. T.; Alves, L. C.; Pinheiro, T.

    2015-04-01

    Studies of the elemental composition of whole marine diatom cells are of high interest, as they constitute a direct measurement of environmental change and allow the consequences of anthropogenic alterations for organisms, ecosystems, and global marine geochemical cycles to be anticipated. Nuclear microscopy is a powerful tool that allows direct measurement of whole cells, giving qualitative imaging of elemental distribution and quantitative determination of intracellular concentrations. Major obstacles to the analysis of marine microalgae are the high salinity of the medium and the recurrent presence of extracellular exudates, produced by the algae to maintain colonies, in natural media and in vitro. The objective of this paper was to optimize the sample preparation of marine unicellular algae for elemental analysis with nuclear microscopy, allowing further studies of the cellular response to metals. Primary cultures of Coscinodiscus wailesii maintained in vitro were used to optimize protocols for elemental analysis with nuclear microscopy techniques. Adequate cell preparation procedures to isolate the cells from media components and exudates were established. The use of chemical agents proved inappropriate both for elemental determination and for intracellular morphological analysis. The assessment of morphology and elemental partitioning in cell compartments obtained with nuclear microscopy techniques made it possible to infer their function in the natural environment and imbalances under exposure conditions. Metal exposure affected the morphology and internal elemental distribution of C. wailesii.

  19. Analysis of the clonal repertoire of gene-corrected cells in gene therapy.

    PubMed

    Paruzynski, Anna; Glimm, Hanno; Schmidt, Manfred; Kalle, Christof von

    2012-01-01

    Gene therapy-based clinical phase I/II studies using integrating retroviral vectors have successfully treated several monogenic inherited diseases. However, as the efficiency of this therapy increased, severe side effects occurred in various gene therapy trials. In all cases, integration of the vector close to or within a proto-oncogene contributed substantially to the development of the malignancies. Thus, in-depth analysis of integration site patterns is of high importance to uncover potential clonal outgrowth and to assess the safety of gene transfer vectors and gene therapy protocols. Standard linear amplification-mediated PCR and nonrestrictive LAM-PCR (nrLAM-PCR), in combination with high-throughput sequencing, provide technologies that allow comprehensive analysis of the clonal repertoire of gene-corrected cells and early assessment of the safety of the vector system at the molecular level. They enable clarification of the biological consequences of the vector system for the fate of the transduced cell. Furthermore, downstream real-time PCR allows quantitative estimation of the clonality of individual cells and their clonal progeny. Here, we present a guideline that should allow researchers to perform comprehensive integration site analysis in preclinical and clinical studies.
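    Once integration sites and their read (or real-time PCR) counts are available, estimating per-clone contributions and overall repertoire diversity is straightforward. A hypothetical post-processing sketch, not part of the nrLAM-PCR protocol itself:

```python
import math

def clonal_contributions(read_counts):
    """Relative contribution of each gene-corrected clone (keyed by its
    integration site) and the Shannon diversity of the repertoire.
    read_counts maps integration site -> sequence read count."""
    total = sum(read_counts.values())
    fractions = {site: n / total for site, n in read_counts.items()}
    diversity = -sum(p * math.log(p) for p in fractions.values() if p > 0)
    return fractions, diversity
```

    A single clone expanding toward a contribution near 1.0 (diversity near 0) is the kind of clonal outgrowth signal the abstract says integration site analysis must uncover.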

  20. Intracellular O2 sensing probe based on cell-penetrating phosphorescent nanoparticles.

    PubMed

    Fercher, Andreas; Borisov, Sergey M; Zhdanov, Alexander V; Klimant, Ingo; Papkovsky, Dmitri B

    2011-07-26

    A new intracellular O2 (icO2) sensing probe is presented, which comprises a nanoparticle (NP) formulation of the cationic polymer Eudragit RL-100 and the hydrophobic phosphorescent dye Pt(II)-tetrakis(pentafluorophenyl)porphyrin (PtPFPP). Cell loading was investigated in detail using a time-resolved fluorescence (TR-F) plate reader set-up, in particular the effects of probe concentration, loading time, serum content of the medium, cell type, and cell density. The use of a fluorescent analogue of the probe, in conjunction with confocal microscopy and flow cytometry analysis, revealed that cellular uptake of the NPs is driven by nonspecific energy-dependent endocytosis and that the probe localizes inside the cell close to the nucleus. The probe was calibrated in a biological environment, which allowed the conversion of measured phosphorescence lifetime signals into icO2 concentrations (μM). Its analytical performance in icO2 sensing experiments was demonstrated by monitoring the metabolic responses of mouse embryonic fibroblast cells under ambient and hypoxic macroenvironments. The NP probe generated stable and reproducible signals in different types of mammalian cells and robust responses to their metabolic stimulation, thus allowing accurate quantitative analysis. Its high brightness and photostability allow its use in screening experiments with cell populations on a commercial TR-F reader, and for single-cell analysis on a fluorescence microscope.
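    Lifetime-to-O2 conversion in phosphorescent probes of this kind commonly follows the Stern-Volmer relation tau0/tau = 1 + Ksv*[O2]. A minimal sketch of the inversion; the default tau0 and Ksv below are illustrative placeholders, not the calibration constants reported for the PtPFPP nanoparticle probe:

```python
def o2_from_lifetime(tau_us, tau0_us=57.0, ksv_per_um=0.04):
    """Invert the Stern-Volmer relation tau0/tau = 1 + Ksv*[O2] to get
    the O2 concentration (uM) from a measured phosphorescence lifetime.
    tau0_us is the unquenched lifetime; ksv_per_um is the quenching
    constant.  Both defaults are hypothetical example values."""
    return (tau0_us / tau_us - 1.0) / ksv_per_um
```

    Shorter lifetimes map to higher O2 concentrations, which is why lifetime (rather than intensity) readout makes the measurement robust to probe loading and photobleaching.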
