Quantitative aspects of inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Bulska, Ewa; Wagner, Barbara
2016-10-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, and industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. These include the use of pure standards, matrix-matched standards, or relevant certified reference materials, ensuring traceability of the reported results. This review critically evaluates the advantages and limitations of the different calibration approaches used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
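The simplest of these calibration approaches, external calibration against pure standards, reduces to fitting a straight line of signal versus concentration and inverting it for unknowns. A minimal sketch in Python (all intensities and concentrations are hypothetical, not taken from the review):

```python
import numpy as np

# Hypothetical external-calibration data for one isotope:
# signals (counts/s) measured for pure aqueous standards.
conc_std = np.array([0.0, 1.0, 5.0, 10.0, 20.0])            # ng/mL
signal_std = np.array([50., 1050., 5050., 10050., 20050.])  # counts/s

# Fit the calibration line: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc_std, signal_std, 1)

def quantify(sample_signal):
    """Convert a sample signal to concentration via the calibration line."""
    return (sample_signal - intercept) / slope

print(round(quantify(8050.0), 3))  # this signal falls at 8 ng/mL on the line
```

Matrix-matched standards and standard additions follow the same arithmetic; what changes is the solution in which the standards are prepared.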
Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture-based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analyses by this method require highly...
Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.
Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A
2018-01-08
For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.
NASA Technical Reports Server (NTRS)
Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.
1999-01-01
Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.
Domanski, Dominik; Murphy, Leigh C.; Borchers, Christoph H.
2010-01-01
We have developed a phosphatase-based phosphopeptide quantitation (PPQ) method for determining phosphorylation stoichiometry in complex biological samples. This PPQ method is based on enzymatic dephosphorylation, combined with specific and accurate peptide identification and quantification by multiple reaction monitoring (MRM) detection with stable-isotope-labeled standard peptides. In contrast with the classical MRM methods for the quantitation of phosphorylation stoichiometry, the PPQ-MRM method needs only one non-phosphorylated SIS (stable isotope-coded standard) and two analyses (one for the untreated and one for the phosphatase-treated sample), from which the expression and modification levels can accurately be determined. From these analyses, the % phosphorylation can be determined. In this manuscript, we compare the PPQ-MRM method with an MRM method without phosphatase, and demonstrate the application of these methods to the detection and quantitation of phosphorylation of the classic phosphorylated breast cancer biomarkers (ERα and HER2), and for phosphorylated RAF and ERK1, which also contain phosphorylation sites with important biological implications. Using synthetic peptides spiked into a complex protein digest, we were able to use our PPQ-MRM method to accurately determine the total phosphorylation stoichiometry on specific peptides, as well as the absolute amount of the peptide and phosphopeptide present. Analyses of samples containing ERα protein revealed that the PPQ-MRM is capable of determining phosphorylation stoichiometry in proteins from cell lines, and is in good agreement with determinations obtained using the direct MRM approach in terms of phosphorylation and total protein amount. PMID:20524616
Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst
NASA Astrophysics Data System (ADS)
Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina
2015-03-01
In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.
METHODS TO CLASSIFY ENVIRONMENTAL SAMPLES BASED ON MOLD ANALYSES BY QPCR
Quantitative PCR (QPCR) analysis of molds in indoor environmental samples produces highly accurate speciation and enumeration data. In a number of studies, eighty of the most common or potentially problematic indoor molds were identified and quantified in dust samples from homes...
Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.
Hamlet, Stephen M
2010-01-01
The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years, however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibody with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10^2 (using rtPCR) to 10^4 (using ELISA) periodontopathogens in dental plaque samples.
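For context, absolute quantitation by rtPCR is conventionally done against a standard curve relating the quantification cycle (Cq) to log10 copy number; a slope of about -3.32 corresponds to 100% amplification efficiency. A minimal sketch with hypothetical curve parameters:

```python
# Hypothetical rtPCR standard curve: Cq = intercept + slope * log10(copies).
# A slope of -3.322 corresponds to a perfectly efficient (100%) reaction.
slope, intercept = -3.322, 38.0

def copies_from_cq(cq):
    """Invert the standard curve to recover target copy number."""
    return 10 ** ((cq - intercept) / slope)

# A plaque sample with Cq = 31.356 maps to ~10^2 target copies,
# on the order of the rtPCR quantitation limit cited above.
print(round(copies_from_cq(31.356)))
```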
Rastogi, L.; Dash, K.; Arunachalam, J.
2013-01-01
The quantitative analysis of glutathione (GSH) is important in fields such as medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach for its quantitation, and calibration against an appropriately characterized certified reference material (CRM) for sulfur provides a methodology for the certification of GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. For ion chromatography (IC) measurements, the sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide. The measurement of sulfur by ICP-OES and IC (as sulfate) using the "high performance" methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U), expressed at the 95% confidence interval, varied from 0.1% to 0.3% for the ICP-OES analyses and from 0.2% to 1.2% for IC. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
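The traceability chain rests on simple molar stoichiometry: each GSH molecule (C10H17N3O6S) carries exactly one sulfur atom, so a sulfur measurement converts directly to a GSH quantity. A sketch of that conversion (the measured sulfur value is hypothetical):

```python
# Molar masses (g/mol); GSH contains exactly one sulfur atom, so moles of
# measured sulfur equal moles of GSH in the purified material.
M_GSH = 307.32   # glutathione, C10H17N3O6S
M_S = 32.06      # sulfur

sulfur_measured_mg = 10.43              # hypothetical ICP-OES result
gsh_mmol = sulfur_measured_mg / M_S     # 1:1 mole ratio S : GSH
gsh_mg = gsh_mmol * M_GSH               # mass of GSH implied by the sulfur
print(gsh_mg)
```

An uncertainty budget for the certified value would propagate the sulfur measurement uncertainty through this same ratio.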
Application of the accurate mass and time tag approach in studies of the human blood lipidome
Ding, Jie; Sorensen, Christina M.; Jaitly, Navdeep; Jiang, Hongliang; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Metz, Thomas O.
2008-01-01
We report a preliminary demonstration of the accurate mass and time (AMT) tag approach for lipidomics. Initial data-dependent LC-MS/MS analyses of human plasma, erythrocyte, and lymphocyte lipids were performed in order to identify lipid molecular species in conjunction with complementary accurate mass and isotopic distribution information. Identified lipids were used to populate initial lipid AMT tag databases containing 250 and 45 entries for those species detected in positive and negative electrospray ionization (ESI) modes, respectively. The positive ESI database was then utilized to identify human plasma, erythrocyte, and lymphocyte lipids in high-throughput LC-MS analyses based on the AMT tag approach. We were able to define the lipid profiles of human plasma, erythrocytes, and lymphocytes based on qualitative and quantitative differences in lipid abundance. PMID:18502191
Impact of TRMM and SSM/I Rainfall Assimilation on Global Analysis and QPF
NASA Technical Reports Server (NTRS)
Hou, Arthur; Zhang, Sara; Reale, Oreste
2002-01-01
Evaluation of QPF skills requires quantitatively accurate precipitation analyses. We show that assimilation of surface rain rates derived from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager and Special Sensor Microwave/Imager (SSM/I) improves quantitative precipitation estimates (QPE) and many aspects of global analyses. Short-range forecasts initialized with analyses with satellite rainfall data generally yield significantly higher QPF threat scores and better storm track predictions. These results were obtained using a variational procedure that minimizes the difference between the observed and model rain rates by correcting the moist physics tendency of the forecast model over a 6h assimilation window. In two case studies of Hurricanes Bonnie and Floyd, synoptic analysis shows that this procedure produces initial conditions with better-defined tropical storm features and stronger precipitation intensity associated with the storm.
Wang, Peng; Zhang, Cheng; Liu, Hong-Wen; Xiong, Mengyi; Yin, Sheng-Yan; Yang, Yue; Hu, Xiao-Xiao; Yin, Xia; Zhang, Xiao-Bing; Tan, Weihong
2017-12-01
Fluorescence quantitative analyses for vital biomolecules are in great demand in biomedical science owing to their unique detection advantages with rapid, sensitive, non-damaging and specific identification. However, available fluorescence strategies for quantitative detection are usually hard to design and achieve. Inspired by supramolecular chemistry, a two-photon-excited fluorescent supramolecular nanoplatform ( TPSNP ) was designed for quantitative analysis with three parts: host molecules (β-CD polymers), a guest fluorophore of sensing probes (Np-Ad) and a guest internal reference (NpRh-Ad). In this strategy, the TPSNP possesses the merits of (i) improved water-solubility and biocompatibility; (ii) increased tissue penetration depth for bioimaging by two-photon excitation; (iii) quantitative and tunable assembly of functional guest molecules to obtain optimized detection conditions; (iv) a common approach to avoid the limitation of complicated design by adjustment of sensing probes; and (v) accurate quantitative analysis by virtue of reference molecules. As a proof-of-concept, we utilized the two-photon fluorescent probe NHS-Ad-based TPSNP-1 to realize accurate quantitative analysis of hydrogen sulfide (H 2 S), with high sensitivity and good selectivity in live cells, deep tissues and ex vivo -dissected organs, suggesting that the TPSNP is an ideal quantitative indicator for clinical samples. What's more, TPSNP will pave the way for designing and preparing advanced supramolecular sensors for biosensing and biomedicine.
Stereotypes and Representations of Aging in the Media
ERIC Educational Resources Information Center
Mason, Susan E.; Darnell, Emily A.; Prifti, Krisiola
2010-01-01
How are older adults presented in print and in the electronic media? Are they underrepresented? Are they accurately portrayed? Based on our examination of several forms of media over a four-month period, we discuss the role of the media in shaping our views on aging. Quantitative and qualitative analyses reveal that media representations often…
A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.
Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R
2011-10-01
It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
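The qualitative behavior the stepping-stone model predicts at a colony edge, coarsening of allelic sectors and decay of local diversity, can be reproduced with a toy one-dimensional simulation (all parameters hypothetical, not fitted to the paper's data):

```python
import random

# Toy 1D stepping-stone sketch: each deme at the frontier resamples its
# allele from itself or a neighbour every generation, so allelic sectors
# coarsen and local diversity at the periphery decays over time.
random.seed(0)
L = 200
demes = [i % 2 for i in range(L)]   # two neutral alleles, initially mixed

def step(demes):
    """One generation: each deme copies a random neighbour (or itself)."""
    return [demes[(i + random.choice((-1, 0, 1))) % L] for i in range(L)]

def boundary_density(demes):
    """Fraction of neighbouring deme pairs holding different alleles."""
    return sum(demes[i] != demes[(i + 1) % L] for i in range(L)) / L

d0 = boundary_density(demes)        # 1.0 for the alternating start
for _ in range(100):
    demes = step(demes)
print(d0, boundary_density(demes))  # diversity decays as sectors coarsen
```

In the paper's workflow, the measured diversity at the periphery plays the role of this boundary density and is used to fix the model parameters.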
Impact of immersion oils and mounting media on the confocal imaging of dendritic spines
Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.
2015-01-01
Background: Structural plasticity, such as changes in dendritic spine morphology and density, reflects changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method: Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results: Both the choice of mounting medium and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method: Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion: Collectively, these data suggest that choosing the appropriate immersion oil and mounting medium is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477
ERIC Educational Resources Information Center
Mayhew, Matthew J.; Simonoff, Jeffrey S.
2015-01-01
The purpose of this paper is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, multi-raced independent variables in higher education research. Not only may effect coding enable researchers to get closer to respondents' original intentions, it allows for more accurate analyses of all race…
Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong
2015-06-01
With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study describes qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 using a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study supplies a powerful, very simple, and accurate detection strategy for the unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid. Copyright © 2014 Elsevier Ltd. All rights reserved.
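The repeatability figures quoted above (standard deviation and relative standard deviation) are computed in the standard way from replicate measurements; a sketch with hypothetical Cq replicates, not values from the study:

```python
import statistics

# Hypothetical Cq values from five replicate quantitative PCR runs.
cq_replicates = [25.10, 25.15, 25.05, 25.12, 25.08]

# Sample standard deviation and relative standard deviation (% of mean),
# the two repeatability metrics reported for the MON71800 assay.
sd = statistics.stdev(cq_replicates)
rsd_pct = 100.0 * sd / statistics.mean(cq_replicates)
print(round(sd, 3), round(rsd_pct, 2))
```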
Digital PCR Quantitation of Muscle Mitochondrial DNA: Age, Fiber Type, and Mutation-Induced Changes.
Herbst, Allen; Widjaja, Kevin; Nguy, Beatrice; Lushaj, Entela B; Moore, Timothy M; Hevener, Andrea L; McKenzie, Debbie; Aiken, Judd M; Wanagat, Jonathan
2017-10-01
Definitive quantitation of mitochondrial DNA (mtDNA) and mtDNA deletion mutation abundances would help clarify the role of mtDNA instability in aging. To more accurately quantify mtDNA, we applied the emerging technique of digital polymerase chain reaction to individual muscle fibers and muscle homogenates from aged rodents. Individual fiber mtDNA content correlated with fiber type and decreased with age. We adapted a digital polymerase chain reaction deletion assay that was accurate in mixing experiments to a mutation frequency of 0.03% and quantitated an age-induced increase in deletion frequency from rat muscle homogenates. Importantly, the deletion frequency measured in muscle homogenates strongly correlated with electron transport chain-deficient fiber abundance determined by histochemical analyses. These data clarify the temporal accumulation of mtDNA deletions that lead to electron chain-deficient fibers, a process culminating in muscle fiber loss. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
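Digital PCR quantitation rests on Poisson statistics: targets distribute randomly across thousands of partitions, so the average copies per partition is recovered from the fraction of partitions that stay negative. A sketch of a deletion-frequency calculation of the kind described above, with hypothetical partition counts:

```python
import math

# Poisson correction: if a fraction f of partitions is negative, the mean
# copies per partition is lambda = -ln(f).
def copies_per_partition(negative, total):
    return -math.log(negative / total)

total = 20000  # hypothetical partition count per reaction
wt_lam = copies_per_partition(negative=2707, total=total)    # wild-type mtDNA
del_lam = copies_per_partition(negative=19994, total=total)  # deletion assay

# Deletion frequency as a percentage of total mtDNA molecules.
deletion_freq_pct = 100.0 * del_lam / (del_lam + wt_lam)
print(round(deletion_freq_pct, 3))
```

With these illustrative counts the frequency lands near 0.015%, i.e. below the 0.03% mixing-experiment sensitivity quoted in the abstract.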
Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J
2015-10-01
Automated high performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between carriers and non-carriers, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed. © 2015 John Wiley & Sons Ltd.
The study of forensic toxicology should not be neglected in Japanese universities.
Ishihara, Kenji; Yajima, Daisuke; Abe, Hiroko; Nagasawa, Sayaka; Nara, Akina; Iwase, Hirotaro
2015-04-01
Forensic toxicology is aimed at identifying the relationship between drugs or poisons and the cause of death or a crime. In the authors' toxicology laboratory at Chiba University, almost every body is analyzed for drugs and poisons. A simple inspection kit is used in an initial attempt to detect drug abuse, and a mass spectrometer is used to perform highly accurate screening. When a poison is detected, quantitative analyses are required. A recent topic of interest is new psychoactive substances (NPS). Although NPS-related deaths may be decreasing, NPS use as a cause of death is difficult to ascertain. Forensic institutes have recently begun to perform drug and poison tests on corpses. However, this approach presents several problems, as discussed here. The hope is that highly accurate analyses of drugs and poisons will be performed throughout the country.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collee, R.; Govaerts, J.; Winand, L.
1959-10-31
A brief resume of the classical methods of quantitative determination of thorium in ores and thoriferous products is given to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the utilization of the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements is discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or smaller. (J.S.R.)
Analyser-based phase contrast image reconstruction using geometrical optics.
Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A
2007-07-21
Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 µm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
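A symmetric Pearson type VII profile of the kind fitted to the rocking curve interpolates between Lorentzian (shape factor m = 1) and near-Gaussian (large m) shapes. A minimal sketch of the profile in one common parametrization (the parameter values are illustrative, not from the paper):

```python
# Symmetric Pearson VII profile:
#   f(x) = A * [1 + ((x - x0)/w)^2 * (2^(1/m) - 1)]^(-m)
# In this parametrization w is the half width at half maximum for any m;
# m = 1 gives a Lorentzian and m -> infinity approaches a Gaussian.
def pearson_vii(x, A, x0, w, m):
    return A * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# Illustrative rocking-curve-scale parameters (radians).
A, x0, w, m = 1.0, 0.0, 5.0e-6, 1.8
print(pearson_vii(x0, A, x0, w, m))        # peak value A
print(pearson_vii(x0 + w, A, x0, w, m))    # half maximum, A/2
```

In practice A, x0, w, and m would be obtained by least-squares fitting of this function to the measured rocking-curve points.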
Matsunaga, Hiroko; Goto, Mari; Arikawa, Koji; Shirai, Masataka; Tsunoda, Hiroyuki; Huang, Huan; Kambara, Hideki
2015-02-15
Analyses of gene expression in single cells are important for understanding detailed biological phenomena. Here, a highly sensitive and accurate sequencing-based method (called "bead-seq") is proposed for obtaining the whole gene expression profile of a single cell. A key feature of the method is the use of a complementary DNA (cDNA) library on magnetic beads, which enables adding washing steps to remove residual reagents during sample preparation. With these washing steps, subsequent steps can be carried out under optimal conditions without losing cDNAs. Careful evaluation of error sources showed that the first several steps of the workflow are the critical ones. It is demonstrated that bead-seq is superior to conventional methods for single-cell gene expression analyses in terms of reproducibility, quantitative accuracy, and biases introduced during the sample preparation and sequencing processes. Copyright © 2014 Elsevier Inc. All rights reserved.
Lo, Andy; Weiner, Joel H; Li, Liang
2013-09-17
Due to limited sample amounts, instrument time considerations, and reagent costs, only a small number of replicate experiments are typically performed for quantitative proteome analyses. Generation of reproducible data that can be readily assessed for consistency within a small number of datasets is critical for accurate quantification. We report our investigation of a strategy using reciprocal isotope labeling of two comparative samples as a tool for determining proteome changes. Reciprocal labeling was evaluated to determine the internal consistency of quantified proteome changes from Escherichia coli grown under aerobic and anaerobic conditions. Qualitatively, the peptide overlap between replicate analyses of the same sample and reverse labeled samples were found to be within 8%. Quantitatively, reciprocal analyses showed only a slight increase in average overall inconsistency when compared with replicate analyses (1.29 vs. 1.24-fold difference). Most importantly, reverse labeling was successfully used to identify spurious values resulting from incorrect peptide identifications and poor peak fitting. After removal of 5% of the peptide data with low reproducibility, a total of 275 differentially expressed proteins (>1.50-fold difference) were consistently identified and were then subjected to bioinformatics analysis. General considerations and guidelines for reciprocal labeling experimental design and biological significance of obtained results are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
GLS-Finder: A Platform for Fast Profiling of Glucosinolates in Brassica Vegetables.
Sun, Jianghao; Zhang, Mengliang; Chen, Pei
2016-06-01
Mass spectrometry combined with related tandem techniques has become the most popular method for characterizing plant secondary metabolites. We introduce a new strategy for fast profiling of glucosinolates, a class of important compounds in Brassica vegetables, based on in-database searching, study of mass fragmentation behavior, and formula prediction. A MATLAB script-based expert system computer program, "GLS-Finder", was developed. It is capable of qualitative and semi-quantitative analyses of glucosinolates in samples using data generated by ultrahigh-performance liquid chromatography-high-resolution accurate mass spectrometry with multi-stage mass fragmentation (UHPLC-HRAM/MS(n)). A suite of bioinformatic tools was integrated into GLS-Finder to perform raw data deconvolution, peak alignment, putative glucosinolate assignment, semi-quantitation, and unsupervised principal component analysis (PCA). GLS-Finder was successfully applied to identify intact glucosinolates in 49 Brassica vegetable samples commonly consumed in the United States. We believe this work introduces a new approach to fast data processing and interpretation for qualitative and quantitative analyses of glucosinolates, with greatly improved efficiency compared to manual identification.
Zaritsky, Assaf; Natan, Sari; Horev, Judith; Hecht, Inbal; Wolf, Lior; Ben-Jacob, Eshel; Tsarfaty, Ilan
2011-01-01
Confocal microscopy analysis of fluorescence and morphology is becoming the standard tool in cell biology and molecular imaging. Accurate quantification algorithms are required to enhance the understanding of different biological phenomena. We present a novel approach based on image segmentation of multi-cellular regions in bright field images, demonstrating enhanced quantitative analyses and better understanding of cell motility. We present MultiCellSeg, a segmentation algorithm that separates multi-cellular regions from background in bright field images, based on classification of local patches within an image: a cascade of Support Vector Machines (SVMs) is applied using basic image features. Post-processing includes additional classification and graph-cut segmentation to reclassify erroneous regions and refine the segmentation. This approach leads to a parameter-free and robust algorithm. Comparison to an alternative algorithm on wound healing assay images demonstrates its superiority. The proposed approach was used to evaluate common cell migration models such as the wound healing and scatter assays. It was applied to quantify the acceleration effect of hepatocyte growth factor/scatter factor (HGF/SF) on healing rate in a time-lapse confocal microscopy wound healing assay, and demonstrated that the healing rate is linear in both treated and untreated cells and that HGF/SF accelerates the healing rate by approximately two-fold. A novel fully automated, accurate, zero-parameter method to classify and score scatter-assay images was developed and demonstrated that multi-cellular texture is an excellent descriptor for measuring HGF/SF-induced cell scattering. We show that exploiting textural information from differential interference contrast (DIC) images at the multi-cellular level can prove beneficial for the analyses of wound healing and scatter assays.
The proposed approach is generic and can be used alone or alongside traditional fluorescence single-cell processing to perform objective, accurate quantitative analyses for various biological applications. PMID:22096600
1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.
Dagnino, Denise; Schripsema, Jan
2005-08-01
A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices on the order of a million dollars per gram), of which generally only milligram quantities or less are available, this step in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
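The internal-standard arithmetic underlying qNMR quantification of this kind can be sketched as follows; the integrals, proton counts, and standard concentration below are hypothetical, chosen only to illustrate the calculation.

```python
def qnmr_concentration(i_analyte, n_analyte, i_std, n_std, conc_std):
    """Molar concentration of an analyte from 1H NMR peak integrals relative
    to an internal standard of known concentration.
    i_* are integrated peak areas; n_* are protons contributing to each peak."""
    return (i_analyte / n_analyte) / (i_std / n_std) * conc_std

# Invented example: a 1-proton analyte signal against a 9-proton standard peak
print(qnmr_concentration(i_analyte=0.35, n_analyte=1, i_std=1.0, n_std=9,
                         conc_std=1.0e-3))
```

Because the integral per proton is compared directly, no response factor is needed, which is what makes 1H NMR attractive for calibrating scarce toxin standards.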
Development of Tripropellant CFD Design Code
NASA Technical Reports Server (NTRS)
Farmer, Richard C.; Cheng, Gary C.; Anderson, Peter G.
1998-01-01
A CFD design code for tripropellants, such as GO2/H2/RP-1, has been developed to predict the local mixing of multiple propellant streams as they are injected into a rocket motor. The code utilizes real-fluid properties to account for the mixing and finite-rate combustion processes which occur near an injector faceplate; thus the analysis serves as a multi-phase homogeneous spray combustion model. Proper accounting of the combustion allows accurate gas-side temperature predictions, which are essential for accurate wall heating analyses. The complex secondary flows which are predicted to occur near a faceplate cannot be quantitatively predicted by less accurate methodology. Test cases have been simulated to describe an axisymmetric tripropellant coaxial injector and a 3-dimensional RP-1/LO2 impinger injector system. The analysis has been shown to realistically describe such injector combustion flowfields. The code is also valuable for designing meaningful future experiments by determining the critical location and type of measurements needed.
NASA Astrophysics Data System (ADS)
Buss, S.; Wernli, H.; Peter, T.; Kivi, R.; Bui, T. P.; Kleinböhl, A.; Schiller, C.
Stratospheric winter temperatures play a key role in the chain of microphysical and chemical processes that lead to the formation of polar stratospheric clouds (PSCs), chlorine activation and eventually to stratospheric ozone depletion. Here the temperature conditions during the Arctic winters 1999/2000 and 2000/2001 are quantitatively investigated using observed profiles of water vapour and nitric acid, temperatures from high-resolution radiosondes and aircraft observations, global ECMWF and UKMO analyses, and mesoscale model simulations over Scandinavia and Greenland. The ECMWF model resolves parts of the gravity wave activity and generally agrees well with the observations. However, for the very cold temperatures near the ice frost point the ECMWF analyses have a warm bias of 1-6 K compared to radiosondes. For the mesoscale model HRM, this bias is generally reduced due to a more accurate representation of gravity waves. Quantitative estimates of the impact of the mesoscale temperature perturbations indicate that over Scandinavia and Greenland the wave-induced stratospheric cooling (as simulated by the HRM) only moderately affects the estimated chlorine activation and homogeneous NAT particle formation, but strongly enhances the potential for ice formation.
Complex and dynamic landscape of RNA polyadenylation revealed by PAS-Seq
Shepard, Peter J.; Choi, Eun-A; Lu, Jente; Flanagan, Lisa A.; Hertel, Klemens J.; Shi, Yongsheng
2011-01-01
Alternative polyadenylation (APA) of mRNAs has emerged as an important mechanism for post-transcriptional gene regulation in higher eukaryotes. Although microarrays have recently been used to characterize APA globally, they have a number of serious limitations that prevent comprehensive and highly quantitative analysis. To better characterize APA and its regulation, we have developed a deep sequencing-based method called Poly(A) Site Sequencing (PAS-Seq) for quantitatively profiling RNA polyadenylation at the transcriptome level. PAS-Seq not only accurately and comprehensively identifies poly(A) junctions in mRNAs and noncoding RNAs, but also provides quantitative information on the relative abundance of polyadenylated RNAs. PAS-Seq analyses of human and mouse transcriptomes showed that 40%–50% of all expressed genes produce alternatively polyadenylated mRNAs. Furthermore, our study detected evolutionarily conserved polyadenylation of histone mRNAs and revealed novel features of mitochondrial RNA polyadenylation. Finally, PAS-Seq analyses of mouse embryonic stem (ES) cells, neural stem/progenitor (NSP) cells, and neurons not only identified more poly(A) sites than are found in the entire mouse EST database, but also detected significant changes in the global APA profile that lead to lengthening of 3′ untranslated regions (UTRs) in many mRNAs during stem cell differentiation. Together, our PAS-Seq analyses revealed a complex landscape of RNA polyadenylation in mammalian cells and the dynamic regulation of APA during stem cell differentiation. PMID:21343387
Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS
NASA Technical Reports Server (NTRS)
Long, S. M.; Grosfils, E. B.
2005-01-01
Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
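The conversion described above can be sketched as follows. The Muhleman backscatter function is the standard empirical law for Venus; the DN scale and offset, however, are illustrative placeholders, not the actual Magellan calibration constants.

```python
import math

def muhleman_sigma0(theta_deg):
    """Muhleman-law backscatter coefficient for Venus at incidence angle theta."""
    t = math.radians(theta_deg)
    return 0.0118 * math.cos(t) / (math.sin(t) + 0.111 * math.cos(t)) ** 3

def dn_to_sigma0(dn, theta_deg, scale=0.2, offset=-20.0):
    """Convert an image DN back to a backscatter coefficient, assuming the DN
    encodes dB relative to the Muhleman value at the pixel's incidence angle.
    `scale` and `offset` are hypothetical, not the real calibration constants."""
    db_rel = dn * scale + offset
    return muhleman_sigma0(theta_deg) * 10.0 ** (db_rel / 10.0)
```

Applied per pixel, with the incidence angle derived from pixel latitude, this restores values on which between-unit statistics can legitimately be computed.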
Egorov, Evgeny S; Merzlyak, Ekaterina M; Shelenkov, Andrew A; Britanova, Olga V; Sharonov, George V; Staroverov, Dmitriy B; Bolotin, Dmitriy A; Davydov, Alexey N; Barsova, Ekaterina; Lebedev, Yuriy B; Shugay, Mikhail; Chudakov, Dmitriy M
2015-06-15
Emerging high-throughput sequencing methods for the analyses of complex structures of TCR and BCR repertoires provide a powerful impetus to adaptive immunity studies. However, there are still essential technical obstacles to performing a truly quantitative analysis. Specifically, it remains challenging to obtain comprehensive information on the clonal composition of small lymphocyte populations, such as Ag-specific, functional, or tissue-resident cell subsets isolated by sorting, microdissection, or fine needle aspirates. In this study, we report a robust approach based on unique molecular identifiers that allows profiling of Ag receptors for several hundred to several thousand lymphocytes while preserving qualitative and quantitative information on the clonal composition of the sample. We also describe several general features regarding data analysis with unique molecular identifiers that are critical for accurate counting of starting molecules in high-throughput sequencing applications. Copyright © 2015 by The American Association of Immunologists, Inc.
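UMI-based molecule counting of the kind described above can be sketched as follows. The clonotype labels and UMI sequences are invented for the example, and real pipelines additionally correct for UMI sequencing errors before collapsing.

```python
from collections import defaultdict

def count_molecules(reads):
    """Count starting cDNA molecules per clonotype by collapsing reads that
    share a unique molecular identifier (UMI). `reads` is an iterable of
    (clonotype, umi) pairs; PCR duplicates carry the same UMI and collapse."""
    umis = defaultdict(set)
    for clonotype, umi in reads:
        umis[clonotype].add(umi)
    return {c: len(s) for c, s in umis.items()}

reads = [("TRBV9-CASSF", "AACGT"), ("TRBV9-CASSF", "AACGT"),
         ("TRBV9-CASSF", "GGTTA"), ("TRBV5-CASSL", "CCATG")]
print(count_molecules(reads))  # → {'TRBV9-CASSF': 2, 'TRBV5-CASSL': 1}
```

Counting distinct UMIs rather than raw reads is what preserves quantitative clonal composition despite uneven PCR amplification.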
NASA Astrophysics Data System (ADS)
Cheng, Rita W. T.; Habib, Ayman F.; Frayne, Richard; Ronsky, Janet L.
2006-03-01
In-vivo quantitative assessments of joint conditions and health status can help to increase understanding of the pathology of osteoarthritis, a degenerative joint disease that affects a large population each year. Magnetic resonance imaging (MRI) provides a non-invasive and accurate means to assess and monitor joint properties, and has become widely used for diagnosis and biomechanics studies. Quantitative analyses and comparisons of MR datasets require accurate alignment of anatomical structures, thus image registration becomes a necessary procedure for these applications. This research focuses on developing a registration technique for MR knee joint surfaces to allow quantitative study of joint injuries and health status. It introduces a novel idea of translating techniques originally developed for geographic data in the field of photogrammetry and remote sensing to register 3D MR data. The proposed algorithm works with surfaces that are represented by randomly distributed points with no requirement of known correspondences. The algorithm performs matching locally by identifying corresponding surface elements, and solves for the transformation parameters relating the surfaces by minimizing normal distances between them. This technique was used in three applications to: 1) register temporal MR data to verify the feasibility of the algorithm to help monitor diseases, 2) quantify patellar movement with respect to the femur based on the transformation parameters, and 3) quantify changes in contact area locations between the patellar and femoral cartilage at different knee flexion angles. The results indicate accurate registration and the proposed algorithm can be applied for in-vivo study of joint injuries with MRI.
Standardized protocols for quality control of MRM-based plasma proteomic workflows.
Percy, Andrew J; Chambers, Andrew G; Smith, Derek S; Borchers, Christoph H
2013-01-04
Mass spectrometry (MS)-based proteomics is rapidly emerging as a viable technology for the identification and quantitation of proteins in biological samples, such as human plasma, the most complex yet most commonly employed biofluid in clinical analyses. The transition from a qualitative to a quantitative science is required if proteomics is to become a clinically useful technique. MS, however, has been criticized for a lack of reproducibility and interlaboratory transferability. Currently, the MS and plasma proteomics communities lack standardized protocols and reagents to ensure that high-quality quantitative data can be accurately and precisely reproduced by laboratories across the world using different MS technologies. Toward addressing this issue, we have developed standard protocols for multiple reaction monitoring (MRM)-based assays with customized isotopically labeled internal standards for quality control of the sample preparation workflow and the MS platform in quantitative plasma proteomic analyses. The development of reference standards and their application to a single MS platform is discussed herein, along with the results from intralaboratory tests. The tests highlighted the importance of the reference standards in assessing the efficiency and reproducibility of the entire bottom-up proteomic workflow, and revealed errors in sample preparation as well as performance deficits of the MS and LC systems. Such evaluations are necessary if MRM-based quantitative plasma proteomics is to be used in verifying and validating putative disease biomarkers across different research laboratories and eventually in clinical laboratories.
Pape, B E; Cary, P L; Clay, L C; Godolphin, W
1983-01-01
Pentobarbital serum concentrations associated with a high-dose therapeutic regimen were determined using EMIT immunoassay reagents. Replicate analyses of serum controls resulted in a within-assay coefficient of variation of 5.0% and a between-assay coefficient of variation of 10%. Regression analyses of 44 serum samples analyzed by this technique (y) against a reference procedure (x) yielded y = 0.98x + 3.6 (r = 0.98; x = ultraviolet spectroscopy) and y = 1.04x + 2.4 (r = 0.96; x = high-performance liquid chromatography). Clinical evaluation of the results indicates the immunoassay is sufficiently sensitive and selective for pentobarbital to allow accurate quantitation within the therapeutic range associated with high-dose therapy.
Hild, J; Gertz, C
1980-02-01
For the quantitative determination of preservatives in food, analyses were carried out by means of GLC, HPLC, and TLC according to the TAS method. Using the alkaline extract (for sample preparation see Part I), the preservatives can be analysed as the free acid or an appropriate ester on the same GLC column without any interference from coextractives. A fast and accurate HPLC determination can be achieved by direct injection of the alkaline extract. All preservatives were well separated and detected at wavelengths of 225 and 232 nm, respectively. As a quick test for qualitative estimation, the TLC (TAS) method is suggested and a suitable solvent system is proposed.
Hou, Zhifei; Sun, Guoxiang; Guo, Yong
2016-01-01
The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted by the LQPM based on chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities have been calculated, LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standards.
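A rough sketch of comparing a sample fingerprint against a reference fingerprint is given below. The two measures here (cosine of the peak-area vectors for compositional pattern, and a total-content ratio for overall amount) are illustrative stand-ins for, not the published LQPM formulas.

```python
import math

def fingerprint_similarities(sample, reference):
    """Compare two chromatographic fingerprints given as peak-area vectors.
    Returns (pattern similarity, total-content ratio): the cosine is blind
    to overall amount, while the ratio captures it."""
    dot = sum(s * r for s, r in zip(sample, reference))
    norm_s = math.sqrt(sum(s * s for s in sample))
    norm_r = math.sqrt(sum(r * r for r in reference))
    return dot / (norm_s * norm_r), sum(sample) / sum(reference)

# Invented peak areas for a test sample vs. a reference preparation
pattern, content = fingerprint_similarities([10.0, 5.0, 2.0], [9.5, 5.5, 2.1])
```

The split mirrors the abstract's point: a qualitative (pattern) similarity can remain high while the quantitative (content) measure exposes real differences between samples.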
A sensitivity analysis of regional and small watershed hydrologic models
NASA Technical Reports Server (NTRS)
Ambaruch, R.; Salomonson, V. V.; Simmons, J. W.
1975-01-01
Continuous simulation models of the hydrologic behavior of watersheds are important tools in several practical applications such as hydroelectric power planning, navigation, and flood control. Several recent studies have addressed the feasibility of using remote Earth observations as sources of input data for hydrologic models. The objective of the study reported here was to determine how accurate remotely sensed measurements must be to provide inputs to hydrologic models of watersheds within the tolerances needed for acceptably accurate synthesis of streamflow by the models. The study objective was achieved by performing a series of sensitivity analyses using continuous simulation models of three watersheds. The sensitivity analysis showed quantitatively how variations in each of 46 model inputs and parameters affect simulation accuracy with respect to five different performance indices.
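A one-at-a-time sensitivity analysis of the kind described can be sketched as follows; the toy runoff model and its parameter names are invented purely for illustration.

```python
def one_at_a_time_sensitivity(model, params, delta=0.05):
    """Perturb each input by a fractional `delta` and report the resulting
    fractional change in output, normalized by delta (relative sensitivity).
    `model` maps a parameter dict to a scalar such as simulated streamflow."""
    base = model(params)
    sensitivities = {}
    for name in params:
        perturbed = dict(params, **{name: params[name] * (1.0 + delta)})
        sensitivities[name] = (model(perturbed) - base) / (base * delta)
    return sensitivities

def toy_runoff(p):
    # Invented toy model: runoff = rainfall minus a simple infiltration loss.
    return p["rain"] - p["infil"] * p["soil"]

print(one_at_a_time_sensitivity(toy_runoff,
                                {"rain": 100.0, "infil": 0.4, "soil": 50.0}))
```

Ranking inputs by the magnitude of these sensitivities indicates which remotely sensed quantities must be measured most accurately.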
Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.
2011-01-01
Purpose: We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures: The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results: Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions: We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346
Liao, Yalin; Weber, Darren; Xu, Wei; Durbin-Johnson, Blythe P; Phinney, Brett S; Lönnerdal, Bo
2017-11-03
Whey proteins and caseins in breast milk provide bioactivities and also have different amino acid compositions. Accurate determination of these two major protein classes provides a better understanding of human milk composition and function, and further aids in developing improved infant formulas based on bovine whey proteins and caseins. In this study, we implemented an LC-MS/MS quantitative analysis based on iBAQ label-free quantitation to estimate absolute concentrations of α-casein, β-casein, and κ-casein in human milk samples (n = 88) collected between day 1 and day 360 postpartum. Total protein concentration ranged from 2.03 to 17.52 g/L, with a mean of 9.37 ± 3.65 g/L. Casein subunits ranged from 0.04 to 1.68 g/L (α-), 0.04 to 4.42 g/L (β-), and 0.10 to 1.72 g/L (κ-), with β-casein having the highest average concentration among the three subunits. The calculated whey/casein ratio ranged from 45:55 to 97:3. Linear regression analyses show significant decreases in total protein, β-casein, κ-casein, and total casein, and a significant increase in the whey/casein ratio over the course of lactation. Our study presents a novel and accurate quantitative analysis of human milk casein content, demonstrating a lower casein content than earlier believed, which has implications for improved infant formulas.
Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo
2017-08-04
Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
Some strategies for quantitative scanning Auger electron microscopy
NASA Technical Reports Server (NTRS)
Browning, R.; Peacock, D. C.; Prutton, M.
1985-01-01
The general applicability of power-law forms of the background in electron spectra is pointed out and exploited for background removal from under Auger peaks. This form of B(E) is found to be extremely sensitive to instrumental alignment and to fault-free construction, an observation which can be used to set up analyser configurations in an accurate way. Also, differences between N(E) and B(E) can be used to derive a spectrometer transmission function T(E). The questions of information density in an energy-analysing, spatially-resolving instrument are addressed after reliable instrumental characterization has been established. Strategies involving ratio histograms (showing the population distribution of the ratio of a pair of Auger peak heights), composition scatter diagrams and windowed imaging are discussed and illustrated.
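Power-law background removal of the kind described can be sketched as follows, assuming B(E) = A·E^(−m) is fitted by linear least squares in log-log space over a peak-free energy window; the window choice and spectrum values are illustrative.

```python
import math

def powerlaw_background(energies, counts, fit_window):
    """Fit B(E) = A * E**(-m) by least squares in log-log space over a
    peak-free energy window, then evaluate the background at every energy."""
    pts = [(math.log(e), math.log(c)) for e, c in zip(energies, counts)
           if fit_window[0] <= e <= fit_window[1]]
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # equals -m
    log_a = (sy - slope * sx) / n
    return [math.exp(log_a) * e ** slope for e in energies]
```

Subtracting the returned background from N(E) leaves the Auger peaks, from which peak heights for ratio histograms can then be taken.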
Landslide inventories: The essential part of seismic landslide hazard analyses
Harp, E.L.; Keefer, D.K.; Sato, H.P.; Yagi, H.
2011-01-01
A detailed and accurate landslide inventory is an essential part of seismic landslide hazard analysis. An ideal inventory would cover the entire area affected by an earthquake and include all of the landslides that are possible to detect, down to sizes of 1-5 m in length. The landslides must also be located accurately and mapped as polygons depicting their true shapes. Such mapped landslide distributions can then be used to perform seismic landslide hazard analysis and other quantitative analyses. Detailed inventory maps of landslides triggered by earthquakes began in the early 1960s with the use of aerial photography. In recent years, advances in technology have resulted in the accessibility of satellite imagery with sufficiently high resolution to identify and map all but the smallest of landslides triggered by a seismic event. With this ability to view any area of the globe, we can acquire imagery for any earthquake that triggers significant numbers of landslides. However, a common problem of incomplete coverage of the full distributions of landslides has emerged along with the advent of high-resolution satellite imagery. © 2010.
Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C
To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by the area under the receiver operating characteristic curve (AUC). Of the 377 included patients, 66% were male, median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.90), respectively. In vessel-based analyses the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative scoring is as accurate as CTA-CTP expert reading in detecting hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
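The combination of per-patient metrics into a single score, and its evaluation by AUC, can be sketched as follows; the logistic coefficients here are hypothetical, not those fitted in CORE320.

```python
import math

def combine_metrics(metrics, coefs, intercept):
    """Fold several quantitative CTP metrics into one probability-like score
    via a logistic model (coefficients here are hypothetical)."""
    z = intercept + sum(c * m for c, m in zip(coefs, metrics))
    return 1.0 / (1.0 + math.exp(-z))

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a diseased case outscores a non-diseased one."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Perfectly separated toy scores give the maximal AUC
print(auc([0.9, 0.8], [0.1, 0.2]))  # → 1.0
```

Because AUC depends only on the ranking of scores, any monotone rescaling of the combined metric leaves the reported diagnostic accuracy unchanged.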
A general method for bead-enhanced quantitation by flow cytometry
Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.
2009-01-01
Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software, and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that adding known quantities of polystyrene fluorescence standardization beads to samples permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
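The bead-based absolute-counting arithmetic behind SBEC-style methods can be sketched as follows; the event counts, bead number, and volume are invented for the example.

```python
def absolute_count(cell_events, bead_events, beads_added, sample_volume_ml):
    """Bead-enhanced absolute quantitation: the ratio of gated cell events to
    gated bead events scales the known number of beads spiked into the tube."""
    cells = cell_events * beads_added / bead_events
    return cells, cells / sample_volume_ml  # total cells, cells per mL

total, per_ml = absolute_count(cell_events=5000, bead_events=2500,
                               beads_added=10000, sample_volume_ml=0.1)
print(total)  # → 20000.0
```

Because cells and beads are acquired in the same run, the calculation is insensitive to how much of the sample the cytometer actually aspirates.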
Optimization of CT image reconstruction algorithms for the Lung Tissue Research Consortium (LTRC)
NASA Astrophysics Data System (ADS)
McCollough, Cynthia; Zhang, Jie; Bruesewitz, Michael; Bartholmai, Brian
2006-03-01
To create a repository of clinical data, CT images and tissue samples, and to more clearly understand the pathogenetic features of pulmonary fibrosis and emphysema, the National Heart, Lung, and Blood Institute (NHLBI) launched a cooperative effort known as the Lung Tissue Research Consortium (LTRC). The CT images for the LTRC effort must contain accurate CT numbers in order to characterize tissues, and must have high spatial resolution to show fine anatomic structures. This study was performed to optimize the CT image reconstruction algorithms to achieve these criteria. Quantitative analyses of phantom and clinical images were conducted. The ACR CT accreditation phantom, containing five regions of distinct CT attenuation (CT numbers of approximately -1000 HU, -80 HU, 0 HU, 130 HU and 900 HU) and a high-contrast spatial resolution test pattern, was scanned using CT systems from two manufacturers (General Electric (GE) Healthcare and Siemens Medical Solutions). Phantom images were reconstructed using all relevant reconstruction algorithms. Mean CT numbers and image noise (standard deviation) were measured and compared for the five materials. Clinical high-resolution chest CT images acquired on a GE CT system for a patient with diffuse lung disease were reconstructed using the BONE and STANDARD algorithms and evaluated by a thoracic radiologist in terms of image quality and disease extent. The clinical BONE images were processed with a 3 x 3 x 3 median filter to simulate a thicker slice reconstructed with a smoother algorithm, an approach traditionally shown to provide an accurate estimation of emphysema extent in the lungs. Using a threshold technique, the extent of emphysema (defined as the percentage of lung voxels having a CT number lower than -950 HU) was computed for the STANDARD, BONE, and filtered BONE images. The CT numbers measured in the ACR CT accreditation phantom images were accurate for all reconstruction kernels from both manufacturers.
As expected, visual evaluation of the spatial resolution bar patterns demonstrated that the BONE (GE) and B46f (Siemens) showed higher spatial resolution compared to the STANDARD (GE) or B30f (Siemens) reconstruction algorithms typically used for routine body CT imaging. Only the sharper images were deemed clinically acceptable for the evaluation of diffuse lung disease (e.g. emphysema). Quantitative analyses of the extent of emphysema in patient data showed the percent volumes above the -950 HU threshold as 9.4% for the BONE reconstruction, 5.9% for the STANDARD reconstruction, and 4.7% for the BONE filtered images. Contrary to the practice of using standard resolution CT images for the quantitation of diffuse lung disease, these data demonstrate that a single sharp reconstruction (BONE/B46f) should be used for both the qualitative and quantitative evaluation of diffuse lung disease. The sharper reconstruction images, which are required for diagnostic interpretation, provide accurate CT numbers over the range of -1000 to +900 HU and preserve the fidelity of small structures in the reconstructed images. A filtered version of the sharper images can be accurately substituted for images reconstructed with smoother kernels for comparison to previously published results.
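The threshold technique used above, emphysema extent as the percentage of lung voxels below -950 HU, reduces to a few lines. A minimal sketch with a toy stand-in for a segmented-lung CT volume:

```python
# Emphysema index: percent of lung voxels with CT number below -950 HU.
# The array is a toy stand-in for a segmented-lung CT volume.
import numpy as np

def emphysema_index(lung_hu, threshold=-950):
    """Percent of lung voxels below `threshold` HU."""
    lung_hu = np.asarray(lung_hu)
    return 100.0 * np.count_nonzero(lung_hu < threshold) / lung_hu.size

toy_volume = np.array([-980, -960, -940, -900, -700, -955, -300, -990])
print(f"{emphysema_index(toy_volume):.1f}%")  # 4 of 8 voxels -> 50.0%
```

Because the index depends on a hard HU threshold, reconstruction-kernel noise shifts it directly, which is why the study's BONE, STANDARD, and filtered-BONE percentages differ.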
Quantifying Golgi structure using EM: combining volume-SEM and stereology for higher throughput.
Ferguson, Sophie; Steyer, Anna M; Mayhew, Terry M; Schwab, Yannick; Lucocq, John Milton
2017-06-01
Investigating organelles such as the Golgi complex depends increasingly on high-throughput quantitative morphological analyses from multiple experimental or genetic conditions. Light microscopy (LM) has been an effective tool for screening but fails to reveal fine details of Golgi structures such as vesicles, tubules and cisternae. Electron microscopy (EM) has sufficient resolution but traditional transmission EM (TEM) methods are slow and inefficient. Newer volume scanning EM (volume-SEM) methods now have the potential to speed up 3D analysis by automated sectioning and imaging. However, they produce large arrays of sections and/or images, which require labour-intensive 3D reconstruction for quantitation on limited cell numbers. Here, we show that the information storage, digital waste and workload involved in using volume-SEM can be reduced substantially using sampling-based stereology. Using the Golgi as an example, we describe how Golgi populations can be sensed quantitatively using single random slices and how accurate quantitative structural data on Golgi organelles of individual cells can be obtained using only 5-10 sections/images taken from a volume-SEM series (thereby sensing population parameters and cell-cell variability). The approach will be useful in techniques such as correlative LM and EM (CLEM) where small samples of cells are treated and where there may be variable responses. For Golgi study, we outline a series of stereological estimators that are suited to these analyses and suggest workflows, which have the potential to enhance the speed and relevance of data acquisition in volume-SEM.
Saad, Hisham A; Terry, Mark A; Shamie, Neda; Chen, Edwin S; Friend, Daniel F; Holiman, Jeffrey D; Stoeger, Christopher
2008-08-01
We developed a simple, practical, and inexpensive technique to analyze areas of endothelial cell loss and/or damage over the entire corneal area after vital dye staining by using a readily available, off-the-shelf, consumer software program, Adobe Photoshop. The purpose of this article is to convey a method of quantifying areas of cell loss and/or damage. Descemet-stripping automated endothelial keratoplasty corneal transplant surgery was performed by using 5 precut corneas on a human cadaver eye. Corneas were removed and stained with trypan blue and alizarin red S and subsequently photographed. Quantitative assessment of endothelial damage was performed by using Adobe Photoshop 7.0 software. The average difference for cell area damage for analyses performed by 1 observer twice was 1.41%. For analyses performed by 2 observers, the average difference was 1.71%. Three masked observers were 100% successful in matching the randomized stained corneas to their randomized processed Adobe images. Vital dye staining of corneal endothelial cells can be combined with Adobe Photoshop software to yield a quantitative assessment of areas of acute endothelial cell loss and/or damage. This described technique holds promise for a more consistent and accurate method to evaluate the surgical trauma to the endothelial cell layer in laboratory models. This method of quantitative analysis can probably be generalized to any area of research that involves areas that are differentiated by color or contrast.
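The Photoshop workflow above ultimately reduces to pixel counting: damaged (stained) area as a percentage of total endothelial area. The sketch below uses boolean masks as a stand-in; in the actual method the masks would come from color/contrast selection of the trypan blue / alizarin red image, a step not reproduced here.

```python
# Pixel-counting arithmetic assumed to underlie the Photoshop-based damage
# quantitation. Masks are flat lists of booleans; values are illustrative.

def percent_damage(damaged_mask, endothelium_mask):
    """Damaged area as a percentage of total endothelial area."""
    total = sum(endothelium_mask)
    if total == 0:
        raise ValueError("empty endothelium mask")
    damaged = sum(d and e for d, e in zip(damaged_mask, endothelium_mask))
    return 100.0 * damaged / total

endo    = [True] * 10
damaged = [True, True] + [False] * 8
print(percent_damage(damaged, endo))  # -> 20.0
```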
Morales-Navarrete, Hernán; Segovia-Miranda, Fabián; Klukowski, Piotr; Meyer, Kirstin; Nonaka, Hidenori; Marsico, Giovanni; Chernykh, Mikhail; Kalaidzidis, Alexander; Zerial, Marino; Kalaidzidis, Yannis
2015-01-01
A prerequisite for the systems biology analysis of tissues is an accurate digital three-dimensional reconstruction of tissue structure based on images of markers covering multiple scales. Here, we designed a flexible pipeline for the multi-scale reconstruction and quantitative morphological analysis of tissue architecture from microscopy images. Our pipeline includes newly developed algorithms that address specific challenges of thick dense tissue reconstruction. Our implementation allows for a flexible workflow, scalable to high-throughput analysis and applicable to various mammalian tissues. We applied it to the analysis of liver tissue and extracted quantitative parameters of sinusoids, bile canaliculi and cell shapes, recognizing different liver cell types with high accuracy. Using our platform, we uncovered an unexpected zonation pattern of hepatocytes with different size, nuclei and DNA content, thus revealing new features of liver tissue organization. The pipeline also proved effective to analyse lung and kidney tissue, demonstrating its generality and robustness. DOI: http://dx.doi.org/10.7554/eLife.11214.001 PMID:26673893
Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.
Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F
2017-09-27
Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.
Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas
2018-01-01
The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measuring histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness values due to unintentional, slightly oblique (non-orthogonal) positioning of the re-embedded sample section. Here, an improved ORE method is presented, allowing for determination of the actual section plane angle of the re-embedded section and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured thickness of the calibration foil from its actual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thickness was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm semi-thin Epon (glycid ether) sections and of 1-3 μm plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be measured precisely within a few seconds. Compared to the section thicknesses determined by ORE, SR measurements deviated by less than 1%.
Our results prove the applicability of SR to efficiently provide accurate section thickness measurements as a prerequisite for reliable estimates of dependent quantitative stereological parameters.
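The "basic geometry" invoked above can be made explicit. Assuming an oblique cut inflates apparent thickness by 1/cos(angle), the foil's known thickness reveals the sectioning angle, which then corrects the measured thickness of the histological section. The values below are illustrative, not from the study.

```python
# Calibration-foil geometry: an oblique (non-orthogonal) re-section makes
# both foil and section appear thicker by a factor 1/cos(angle).
# Assumed model; all numbers are illustrative.
import math

def section_angle_deg(foil_true_um, foil_measured_um):
    """Deviation from orthogonal sectioning, from the foil's apparent thickness."""
    return math.degrees(math.acos(foil_true_um / foil_measured_um))

def corrected_thickness(section_measured_um, foil_true_um, foil_measured_um):
    """Correct a measured section thickness for oblique sectioning."""
    return section_measured_um * foil_true_um / foil_measured_um

# A 25.0 um foil that appears 25.8 um thick implies an oblique cut:
print(f"angle = {section_angle_deg(25.0, 25.8):.1f} deg")
print(corrected_thickness(1.04, 25.0, 25.8))  # ~1.008 um, not 1.04 um
```

The correction is simply the measured thickness scaled by the ratio of true to apparent foil thickness, so no explicit trigonometry is needed once the foil is measured.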
Hou, Zhifei; Sun, Guoxiang; Guo, Yong
2016-01-01
The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and linear quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; the linear quantitative similarity (LQTS), however, was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities have been calculated, the LQPM can employ a classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling
2014-03-01
Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. 
PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg.
Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good Are the Data?
Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.
2005-01-01
Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular, offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide the qualitative and quantitative food composition data necessary for high-quality epidemiological and clinical research. Measurements of flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still popular for estimating total polyphenolic content in foods and other biological samples, despite advances made with more sophisticated analyses. More crudely, polyphenol content, as well as antioxidant activity, is also reported as values relating to radical-scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. Qualitative information regarding proanthocyanidin structure is at present determined by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques. The lack of appropriate standards is the single most important factor limiting these analyses. However, with ever-expanding research on flavanols, proanthocyanidins, and health, and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for selected flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597
NASA Astrophysics Data System (ADS)
Ye, Qimiao; Chen, Lin; Qiu, Wenqi; Lin, Liangjie; Sun, Huijun; Cai, Shuhui; Wei, Zhiliang; Chen, Zhong
2017-01-01
Nuclear magnetic resonance (NMR) spectroscopy serves as an important tool for both qualitative and quantitative analyses of various systems in chemistry, biology, and medicine. However, applications of one-dimensional 1H NMR are often limited by severe overlap among different resonances. The advent of two-dimensional (2D) 1H NMR constitutes a promising alternative, extending the crowded resonances into a plane and thereby alleviating spectral congestion. However, this enhanced ability to discriminate resonances comes at the cost of extended experimental duration, owing to the many scans with progressively incremented delays needed to construct the indirect dimension. Therefore, in this study, we propose a selective coherence transfer (SECOT) method to accelerate the acquisition of 2D correlation spectroscopy by converting chemical shifts into spatial positions within the effective sample length and then performing an echo planar spectroscopic imaging module to record the spatial and spectral information, which yields a 2D correlation spectrum after 2D Fourier transformation. The feasibility and effectiveness of SECOT have been verified by a set of experiments under both homogeneous and inhomogeneous magnetic fields. Moreover, evaluations of SECOT for quantitative analyses were carried out on samples with a series of different concentrations. Based on these experimental results, SECOT may open important perspectives for fast, accurate, and stable investigations of various chemical systems, both qualitatively and quantitatively.
Penheiter, Alan R.; Griesmann, Guy E.; Federspiel, Mark J.; Dingli, David; Russell, Stephen J.; Carlson, Stephanie K.
2011-01-01
The purpose of our study was to validate the ability of pinhole micro-single-photon emission computed tomography/computed tomography (SPECT/CT) to 1) accurately resolve the intratumoral dispersion pattern and 2) quantify the infection percentage in solid tumors of an oncolytic measles virus encoding the human sodium iodide symporter (MV-NIS). NIS RNA level and dispersion pattern were determined in control and MV-NIS-infected BxPC-3 pancreatic tumor cells and mouse xenografts using quantitative real-time reverse-transcriptase polymerase chain reaction, autoradiography, and immunohistochemistry (IHC). Mice with BxPC-3 xenografts were imaged with 123I or 99TcO4 micro-SPECT/CT. Tumor dimensions and radionuclide localization were determined with imaging software. Linear regression and correlation analyses were performed to determine the relationship between tumor infection percentage and radionuclide uptake (% injected dose per gram) above background, and a highly significant correlation was observed (r2 = 0.947). A detection threshold of 1.5-fold above the control tumor uptake (background) yielded a sensitivity of 2.7% MV-NIS-infected tumor cells. We reliably resolved multiple distinct intratumoral zones of infection from noninfected regions. Pinhole micro-SPECT/CT imaging using the NIS reporter demonstrated precise localization and quantitation of oncolytic MV-NIS infection and can replace more time-consuming and expensive analyses (e.g., autoradiography and IHC) that require animal sacrifice. PMID:21753796
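The quantitation step described above, a linear regression of uptake against infection percentage plus a 1.5-fold-over-background detection rule, can be sketched as follows. The data points are synthetic and chosen only to be roughly linear; none are taken from the study.

```python
# Sketch: regress radionuclide uptake (%ID/g) on tumor infection percentage
# and apply a 1.5x-background detection threshold. Data are synthetic.
import numpy as np

infection_pct = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 60.0])   # % infected cells
uptake_idg    = np.array([0.20, 0.35, 0.52, 0.83, 1.48, 2.10]) # %ID/g

slope, intercept = np.polyfit(infection_pct, uptake_idg, 1)
r = np.corrcoef(infection_pct, uptake_idg)[0, 1]
print(f"r^2 = {r**2:.3f}")

# Detection rule: uptake more than 1.5-fold above control-tumor background
background = uptake_idg[0]
detectable = uptake_idg > 1.5 * background
print(detectable)
```

With a fitted line in hand, the smallest infection percentage whose predicted uptake crosses the 1.5x-background line gives the detection sensitivity, analogous to the 2.7% figure reported.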
Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.
ERIC Educational Resources Information Center
Moffat, A. J.; And Others
Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…
NASA Astrophysics Data System (ADS)
Burnham, Brian Scott
Outcrop analogue studies of fluvial sedimentary systems are often undertaken to identify spatial and temporal characteristics (e.g. stacking patterns, lateral continuity, lithofacies proportions). However, the lateral extent typically exceeds that of the exposure, and/or the true width and thickness are not apparent. Accurate characterisation of fluvial sand bodies is integral for accurate identification and subsequent modelling of aquifer and hydrocarbon reservoir architecture. The studies presented in this thesis utilise techniques that integrate lidar, high-resolution photography and differential geospatial measurements to create accurate three-dimensional (3D) digital outcrop models (DOMs) of continuous 3D and laterally extensive 2D outcrop exposures. The sedimentary architecture of outcrops in the medial portion of a large Distributive Fluvial System (DFS) (Huesca fluvial fan) in the Ebro Basin, north-east Spain, and in the fluvio-deltaic succession of the Breathitt Group in the eastern Appalachian Basin, USA, is evaluated using traditional sedimentological and digital outcrop analytical techniques. The major sand bodies in the study areas are quantitatively analysed to accurately characterise spatial and temporal changes in sand body architecture, from two different outcrop exposure types and scales. Several stochastic reservoir simulations were created to approximate fluvial sand body lithological composition and connectivity within the medial portion of the Huesca DFS. Results demonstrate a workflow, adapting current digital outcrop techniques to each study, to approximate true geobody widths and thicknesses and to characterise architectural patterns (internal and external) of major fluvial sand bodies interpreted as products of DFSs in the Huesca fluvial fan, and of both palaeovalleys and progradational DFSs in the Pikeville and Hyden Formations of the Breathitt Group.
The results suggest key geostatistical metrics, translatable across any fluvial system, that can be used to analyse 3D digital outcrop data and to identify spatial attributes of sand bodies, revealing their genetic origin and lithological composition within fluvial reservoir systems and the rock record. 3D quantitative analysis of the major sand bodies has yielded more accurate width vs. thickness relationships within the La Serreta area, showing a vertical increase in width and channel-fill facies, and demonstrates a 22% increase of in-channel facies relative to previous interpretations. Additionally, deposits that are products of a nodal avulsion event have been characterised and are interpreted to be the cause of the increase in width and channel-fill facies. Furthermore, analysis of the Pikeville and Hyden Fms shows sand bodies of stacked distributaries and palaeovalleys, as previously interpreted, and demonstrates that a 3D spatial approach to determining basin-wide architectural trends is integral to identifying the genetic origin and preservation potential of sand bodies of both palaeovalleys and distributive fluvial systems. The resultant geostatistics assembled in this thesis demonstrate the efficacy of integrated lidar studies of outcrop analogues and provide empirical relationships that can be applied to subsurface analogues for reservoir model development and for the distribution of both DFS and palaeovalley depositional systems in the rock record.
NASA Astrophysics Data System (ADS)
Flyvbjerg, Henrik; Mortensen, Kim I.
2015-06-01
With each new aspect of nature that becomes accessible to quantitative science, new needs arise for data analysis and mathematical modeling. The classical example is Tycho Brahe's accurate and comprehensive observations of the planets, which made him hire Kepler for his mathematical skills to assist with the data analysis. We all learned what that led to: Kepler's three laws of planetary motion, phenomenology in purely mathematical form. Newton built on this, and the scientific revolution was completed.
Yan, Xu; Bishop, David J.
2018-01-01
Gene expression analysis by quantitative PCR in skeletal muscle is routine in exercise studies. The reproducibility and reliability of the data fundamentally depend on how the experiments are performed and interpreted. Despite the popularity of the assay, there is considerable variation in experimental protocols and data analyses across laboratories, and a lack of consistent quality control steps throughout the assay. In this study, we present a number of experiments on various steps of the quantitative PCR workflow and demonstrate how to perform a quantitative PCR experiment with human skeletal muscle samples in an exercise study. We also tested some common mistakes in performing qPCR. Interestingly, we found that mishandling of muscle for a short time span (10 min) before RNA extraction did not affect RNA quality, and isolated total RNA was preserved for up to one week at room temperature. As demonstrated by our data, the use of unstable reference genes leads to substantial differences in the final results. Alternatively, cDNA content can be used for data normalisation; however, complete removal of RNA from cDNA samples is essential for obtaining accurate cDNA content. PMID:29746477
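The reference-gene pitfall highlighted above is easy to see in the standard 2^-ddCt relative-quantitation arithmetic: if the reference gene's Ct drifts between conditions, the apparent fold change shifts with it. A minimal sketch with illustrative Ct values (not from the study):

```python
# 2^-ddCt relative quantitation, showing how an unstable reference gene
# distorts the apparent fold change. Ct values are illustrative.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression of the target gene by the 2^-ddCt method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Stable reference (same Ct in both conditions): target is up 2-fold
print(fold_change(24.0, 18.0, 25.0, 18.0))  # -> 2.0
# Reference drifts by one cycle: the apparent fold change doubles
print(fold_change(24.0, 19.0, 25.0, 18.0))  # -> 4.0
```

A single cycle of reference-gene instability thus changes the result two-fold, which is why reference-gene validation (or normalisation to cDNA content) matters.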
NASA Technical Reports Server (NTRS)
Kruse, Fred A.; Dwyer, John L.
1993-01-01
The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
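Linear spectral unmixing, one of the two analyses applied above, models each pixel spectrum as a linear mixture of endmember spectra and recovers the abundances by least squares. A minimal noise-free sketch with tiny toy spectra (not AVIRIS data):

```python
# Linear spectral unmixing by least squares. The endmember spectra and
# abundances below are toy values, not AVIRIS measurements.
import numpy as np

# Endmember "reflectance spectra" as columns: 4 bands x 2 minerals
E = np.array([[0.2, 0.8],
              [0.4, 0.6],
              [0.6, 0.4],
              [0.8, 0.2]])

true_abund = np.array([0.7, 0.3])
pixel = E @ true_abund                 # noise-free mixed-pixel spectrum

# Recover abundances from the pixel spectrum
abund, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(np.round(abund, 3))              # recovers [0.7, 0.3]
```

In practice the recovered abundances are only physically meaningful if the data are in reflectance units consistent with the endmember library, which is exactly why the calibration comparison in this study matters.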
Meier, Matthias; Jakub, Zdeněk; Balajka, Jan; Hulva, Jan; Bliem, Roland; Thakur, Pardeep K.; Lee, Tien-Lin; Franchini, Cesare; Schmid, Michael; Diebold, Ulrike; Allegretti, Francesco; Parkinson, Gareth S.
2018-01-01
Accurately modelling the structure of a catalyst is a fundamental prerequisite for correctly predicting reaction pathways, but a lack of clear experimental benchmarks makes it difficult to determine the optimal theoretical approach. Here, we utilize the normal-incidence X-ray standing wave (NIXSW) technique to precisely determine the three-dimensional geometry of Ag1 and Cu1 adatoms on Fe3O4(001). Both adatoms occupy bulk-continuation cation sites, but with a markedly different height above the surface (0.43 ± 0.03 Å for Cu1 and 0.96 ± 0.03 Å for Ag1). HSE-based calculations accurately predict the experimental geometry, but the more common PBE + U and PBEsol + U approaches perform poorly. PMID:29334395
Delmore, Kira E; Liedvogel, Miriam
2016-01-01
The amazing accuracy of migratory orientation performance across the animal kingdom is facilitated by the use of magnetic and celestial compass systems that provide individuals with both directional and positional information. Quantitative genetics analyses in several animal systems suggest that migratory orientation has a strong genetic component. Nevertheless, the exact identity of the genes controlling orientation remains largely unknown, making it difficult to obtain an accurate understanding of this fascinating behavior at the molecular level. Here, we provide an overview of molecular genetic techniques employed thus far, highlight the pros and cons of various approaches, generalize results from species-specific studies whenever possible, and evaluate how far the field has come since early quantitative genetics studies. We emphasize the importance of examining different levels of molecular control, and outline how future studies can take advantage of high-resolution tracking and sequencing techniques to characterize the genomic architecture of migratory orientation.
Booth, Margaret Zoller; Gerard, Jean M.
2012-01-01
Utilizing mixed methodology, this paper investigates the relationship between self-esteem and academic achievement for young adolescents within two Western cultural contexts: the United States and England. Quantitative and qualitative data from 86 North American and 86 British adolescents were used to examine the links between self-esteem and academic achievement from the beginning to the end of their academic year, during their 11th–12th year of age. For both samples, quantitative results demonstrated that fall self-esteem was related to multiple indicators of later-year academic achievement. While country differences emerged by the end of the year, math achievement showed a consistent relationship with self-esteem in both country contexts. Qualitative analyses found some support for British students' self-perceptions more accurately reflecting their academic experience than those of the students from the United States. PMID:24068853
An importance-performance analysis of hospital information system attributes: A nurses' perspective.
Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J
2016-02-01
Health workers have numerous concerns about hospital information system (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes, as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach, followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation, and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and which therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Towards an understanding of British public attitudes concerning human cloning.
Shepherd, Richard; Barnett, Julie; Cooper, Helen; Coyle, Adrian; Moran-Ellis, Jo; Senior, Victoria; Walton, Chris
2007-07-01
The ability of scientists to apply cloning technology to humans has provoked public discussion and media coverage. The present paper reports on a series of studies examining public attitudes to human cloning in the UK, bringing together a range of quantitative and qualitative methods to address this question. These included a nationally representative survey, an experimental vignette study, focus groups and analyses of media coverage. Overall, the research presents a complex picture of attitudes to and constructions of human cloning. In all of the analyses, therapeutic cloning was viewed more favourably than reproductive cloning. However, while participants in the focus groups were generally negative about both forms of cloning, and this was also reflected in the media analyses, quantitative results showed more positive responses. In the quantitative research, therapeutic cloning was generally accepted when the benefits of such procedures were clear, and although reproductive cloning was less accepted there was still substantial support. Participants in the focus groups only differentiated between therapeutic and reproductive cloning after the issue of therapeutic cloning was explicitly raised; initially they saw cloning as being reproductive cloning and saw no real benefits. Attitudes were shown to be associated with underlying values concerning scientific progress rather than with age, gender or education, and although there were a few differences in the quantitative data based on religious affiliation, these tended to be small effects. Likewise, in the focus groups there was little direct appeal to religion, but the main themes were 'interfering with nature' and the 'status of the embryo', with the latter being used more effectively to try to close down further discussion.
In general there was a close correspondence between the media analysis and focus group responses, possibly demonstrating the importance of media as a resource, or that the media reflect public discourse accurately. However, focus group responses did not simply reflect media coverage.
Ruttanajit, Tida; Chanchamroen, Sujin; Cram, David S; Sawakwongpra, Kritchakorn; Suksalak, Wanwisa; Leng, Xue; Fan, Junmei; Wang, Li; Yao, Yuanqing; Quangkananurug, Wiwat
2016-02-01
Currently, our understanding of the nature and reproductive potential of blastocysts associated with trophectoderm (TE) lineage chromosomal mosaicism is limited. The objective of this study was first to validate copy number variation sequencing (CNV-Seq) for measuring the level of mosaicism and second, to examine the nature and level of mosaicism in TE biopsies of patients' blastocysts. TE biopsy samples were analysed by array comparative genomic hybridization (CGH) and CNV-Seq to discriminate between euploid, aneuploid and mosaic blastocysts. Using artificial models of TE mosaicism for five different chromosomes, CNV-Seq accurately and reproducibly quantified mosaicism at levels of 50% and 20%. In a comparative 24-chromosome study of 49 blastocysts by array CGH and CNV-Seq, 43 blastocysts (87.8%) had a concordant diagnosis and 6 blastocysts (12.2%) were discordant. The discordance was attributed to low to medium levels of chromosomal mosaicism (30-70%) not detected by array CGH. In an expanded study of 399 blastocysts using CNV-Seq as the sole diagnostic method, the proportion of diploid-aneuploid mosaics (34, 8.5%) was significantly higher than that of aneuploid mosaics (18, 4.5%) (p < 0.02). Mosaicism is a significant chromosomal abnormality associated with the TE lineage of human blastocysts that can be reliably and accurately detected by CNV-Seq. © 2015 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Benthem, Klaus; Tan, Guolong; French, Roger H
2006-01-01
Attractive van der Waals-London dispersion interactions between two half crystals arise from local physical property gradients within the interface layer separating the crystals. Hamaker coefficients and London dispersion energies were quantitatively determined for Σ5 and near-Σ13 grain boundaries in SrTiO3 by analysis of spatially resolved valence electron energy-loss spectroscopy (VEELS) data. From the experimental data, local complex dielectric functions were determined, from which optical properties can be locally analysed. Both local electronic structures and optical properties revealed gradients within the grain boundary cores of both investigated interfaces. The obtained results show that even in the presence of atomically structured grain boundary cores with widths of less than 1 nm, optical properties have to be represented with gradual changes across the grain boundary structures to quantitatively reproduce accurate van der Waals-London dispersion interactions. London dispersion energies of the order of 10% of the apparent interface energies of SrTiO3 were observed, demonstrating their significance in the grain boundary formation process. The application of different models to represent optical property gradients shows that long-range van der Waals-London dispersion interactions scale significantly with local, i.e. atomic length scale, property variations.
Castells-Nobau, Anna; Nijhof, Bonnie; Eidhof, Ilse; Wolf, Louis; Scheffer-de Gooyert, Jolanda M; Monedero, Ignacio; Torroja, Laura; van der Laak, Jeroen A W M; Schenck, Annette
2017-05-03
Synaptic morphology is tightly related to synaptic efficacy, and in many cases morphological synapse defects ultimately lead to synaptic malfunction. The Drosophila larval neuromuscular junction (NMJ), a well-established model for glutamatergic synapses, has been extensively studied for decades. Identification of mutations causing NMJ morphological defects revealed a repertoire of genes that regulate synapse development and function. Many of these were identified in large-scale studies that focused on qualitative approaches to detect morphological abnormalities of the Drosophila NMJ. A drawback of qualitative analyses is that many subtle players contributing to NMJ morphology likely remain unnoticed. Whereas quantitative analyses are required to detect the subtler morphological differences, such analyses are not yet commonly performed because they are laborious. This protocol describes in detail two image analysis algorithms "Drosophila NMJ Morphometrics" and "Drosophila NMJ Bouton Morphometrics", available as Fiji-compatible macros, for quantitative, accurate and objective morphometric analysis of the Drosophila NMJ. This methodology is developed to analyze NMJ terminals immunolabeled with the commonly used markers Dlg-1 and Brp. Additionally, its wider application to other markers such as Hrp, Csp and Syt is presented in this protocol. The macros are able to assess nine morphological NMJ features: NMJ area, NMJ perimeter, number of boutons, NMJ length, NMJ longest branch length, number of islands, number of branches, number of branching points and number of active zones in the NMJ terminal.
Impact of reconstruction parameters on quantitative I-131 SPECT
NASA Astrophysics Data System (ADS)
van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.
2016-07-01
Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods for these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs and (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent.
Monte Carlo based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
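The TEW correction evaluated above is conventionally computed with the standard triple-energy-window formula; a sketch with invented window counts (not the study's data):

```python
# Standard triple-energy-window (TEW) scatter estimate: scatter counts in
# the photopeak window are approximated by a trapezoid spanned by the count
# densities of two narrow side windows. All counts here are invented.

def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Scatter ~= (C_low/W_low + C_up/W_up) * W_peak / 2."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

# Hypothetical pixel: 120 counts in a 60 keV photopeak window,
# 6 and 3 counts in 6 keV-wide lower and upper side windows.
scatter = tew_scatter_estimate(6.0, 3.0, 6.0, 6.0, 60.0)
primary = 120.0 - scatter
print(scatter, primary)  # 45.0 75.0
```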
Wang, Yan; Zhu, Wenhui; Duan, Xingxing; Zhao, Yongfeng; Liu, Wengang; Li, Ruizhen
2011-04-01
To evaluate intraventricular systolic dyssynchrony in rats with post-infarction heart failure by quantitative tissue velocity imaging combined with synchronous electrocardiography. A total of 60 male SD rats were randomly assigned to 3 groups: a 4-week post-operative group and an 8-week post-operative group (each n=25, with the anterior descending branch of the left coronary artery ligated), and a sham operation group (n=10, with thoracotomy and opened pericardium, but no ligation of the artery). The time to peak systolic velocity of regional myocardium was measured and the index of left intraventricular dyssynchrony was calculated. All indexes of heart function became lower as the heart failure worsened, except the left ventricle index in the post-operative groups. All indexes of dyssynchrony lengthened in the post-operative groups (P<0.05), while the changes in the sham operation group were not significant (P>0.05). Quantitative tissue velocity imaging combined with synchronous electrocardiography can analyse intraventricular systolic dyssynchrony accurately.
Quantitative analysis of tympanic membrane perforation: a simple and reliable method.
Ibekwe, T S; Adeosun, A A; Nwaorgu, O G
2009-01-01
Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100% = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area, obtained independently from assessments by two trained otologists of comparable years of experience using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for the two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
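The stated equation is trivially executable; the pixel areas below are invented for illustration:

```python
# Direct transcription of the article's equation: percentage perforation
# = P / T * 100, with P and T the perforation area and the total membrane
# area (both in pixels^2) as measured in Image J. Example areas invented.

def perforation_percentage(p_pixels, t_pixels):
    return p_pixels / t_pixels * 100.0

pct = perforation_percentage(12500, 50000)
print(pct)  # 25.0, i.e. a quarter of the membrane is perforated
```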
Frequent frames as a cue for grammatical categories in child directed speech.
Mintz, Toben H
2003-11-01
This paper introduces the notion of frequent frames, distributional patterns based on the co-occurrence of words in sentences, then investigates the usefulness of this information in grammatical categorization. A frame is defined as two jointly occurring words with one word intervening. Qualitative and quantitative results from distributional analyses of six different corpora of child directed speech are presented in two experiments. In the analyses, words that were surrounded by the same frequent frame were categorized together. The results show that frequent frames yield very accurate categories. Furthermore, evidence from behavioral studies suggests that infants and adults are sensitive to frame-like units, and that adults use them to categorize words. This evidence, along with the success of frames in categorizing words, provides support for frames as a basis for the acquisition of grammatical categories.
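The frame definition above (two jointly occurring words with one word intervening) lends itself to a short sketch; the toy corpus is invented, not from the child-directed-speech corpora analysed in the paper:

```python
from collections import Counter, defaultdict

# Frequent-frame categorization sketch: collect every (A, _, B) trigram,
# count the frames (A, B), and group the intervening words of the most
# frequent frames into one candidate grammatical category.

def frame_categories(sentences, top_n=1):
    frames = Counter()
    members = defaultdict(set)
    for sent in sentences:
        words = sent.lower().split()
        for a, x, b in zip(words, words[1:], words[2:]):
            frames[(a, b)] += 1
            members[(a, b)].add(x)
    return [(f, sorted(members[f])) for f, _ in frames.most_common(top_n)]

corpus = ["you want it", "you see it", "you like it", "we saw them"]
cats = frame_categories(corpus)
print(cats)  # the you__it frame groups the verbs together
```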
Shahbazian, M. D.; Valsamakis, A.; Boonyaratanakornkit, J.; Cook, L.; Pang, X. L.; Preiksaitis, J. K.; Schönbrunner, E. R.; Caliendo, A. M.
2013-01-01
Commutability of quantitative reference materials has proven important for reliable and accurate results in clinical chemistry. As international reference standards and commercially produced calibration material have become available to address the variability of viral load assays, the degree to which such materials are commutable and the effect of commutability on assay concordance have been questioned. To investigate this, 60 archived clinical plasma samples, which previously tested positive for cytomegalovirus (CMV), were retested by five different laboratories, each using a different quantitative CMV PCR assay. Results from each laboratory were calibrated both with lab-specific quantitative CMV standards (“lab standards”) and with common, commercially available standards (“CMV panel”). Pairwise analyses among laboratories were performed using mean results from each clinical sample, calibrated first with lab standards and then with the CMV panel. Commutability of the CMV panel was determined based on difference plots for each laboratory pair showing plotted values of standards that were within the 95% prediction intervals for the clinical specimens. Commutability was demonstrated for 6 of 10 laboratory pairs using the CMV panel. In half of these pairs, use of the CMV panel improved quantitative agreement compared to use of lab standards. Two of four laboratory pairs for which the CMV panel was noncommutable showed reduced quantitative agreement when that panel was used as a common calibrator. Commutability of calibration material varies across different quantitative PCR methods. Use of a common, commutable quantitative standard can improve agreement across different assays; use of a noncommutable calibrator can reduce agreement among laboratories. PMID:24025907
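The commutability criterion described (a standard's difference falling within the 95% prediction interval of the clinical-sample differences) can be illustrated with a simplified mean ± 1.96 SD interval; the real analysis uses regression-based prediction intervals, and all values below are invented:

```python
import statistics

# Simplified commutability check for one laboratory pair: compute lab A -
# lab B differences (log10 copies/mL) for clinical samples, build a 95%
# interval around the mean difference, and test whether the candidate
# standard's difference falls inside it. Values are invented.

def commutable(clinical_diffs, standard_diff, k=1.96):
    mu = statistics.mean(clinical_diffs)
    sd = statistics.stdev(clinical_diffs)
    return mu - k * sd <= standard_diff <= mu + k * sd

diffs = [0.10, 0.20, 0.15, 0.05, 0.25, 0.12, 0.18]
inside = commutable(diffs, 0.17)   # within the interval
outside = commutable(diffs, 0.80)  # far outside the interval
print(inside, outside)  # True False
```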
Burns, M S; File, D M
1986-11-01
Secondary ion mass spectrometry (SIMS) is a surface analytical technique with high sensitivity for elemental detection and microlocalization capabilities within the micrometre range. Quantitative analysis of epoxy resins and gelatin has been reported (Burns-Bellhorn & File, 1979). We report here the first application of this technique to quantitative microlocalization in the context of a physiological problem: analyses of sodium, potassium and calcium in normal and galactose-induced cataract in rat lens. It is known that during the development of galactose-induced cataract the whole-lens content of potassium is decreased, sodium is increased and, in late stages, calcium concentration increases. Whether these alterations in diffusible ions occur homogeneously or heterogeneously is not known. Standard curves were generated from epoxy resins containing known concentrations of sodium, potassium or calcium organometallic compounds using the Cameca IMS 300 secondary ion mass spectrometer. Normal and cataractous lenses were prepared by freezing in isopentane in a liquid nitrogen bath followed by freeze-drying at -30 degrees C. After dry embedding in epoxy resin, 10 micron-thick sections of lens were pressure mounted on silicon wafers, overcoated with gold, and ion emission was measured under the same instrumental conditions used to obtain the standard curves. Quantitative analysis of an area 27 microns in diameter, or a total analysed volume of 1.1 cubic microns, was performed by using a mechanical aperture in the ion optical system. Ion images provided qualitative microanalysis with a lateral resolution of 1 micron. Control rat lenses gave values for sodium and potassium content with a precision of +/- 17% or less. These values were compared to flame photometry and atomic absorption measurements of normal lenses and were accurate within 25%. Analysis of serum and blood also gave accurate and precise measurements of these elements.
Normal rat lenses had a gradient of sodium, and, to a lesser degree, of potassium from the cortex to the nucleus. Development of galactose-induced cataract was heterogeneous by morphological criteria, beginning at the lens equator and spreading from the cortex into the nucleus. However, the loss of potassium and increase in sodium concentration occurred at early stages in both the cortex and nucleus cells, possibly because these cells are interconnected by gap junctions. There is a local alteration in elemental content prior to morphologically demonstrable cataract formation.(ABSTRACT TRUNCATED AT 400 WORDS)
Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment
Bell, Michelle L.; Walker, Katy; Hubbell, Bryan
2011-01-01
Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702
A quantitative reconstruction software suite for SPECT imaging
NASA Astrophysics Data System (ADS)
Namías, Mauro; Jeraj, Robert
2017-11-01
Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction in hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
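With a single subset, the OSEM reconstruction mentioned above reduces to the classic MLEM update x ← x · Aᵀ(y / Ax) / Aᵀ1. A toy sketch on an invented 2 × 2 system model, not the authors' implementation (real reconstructions fold attenuation, scatter and collimator response into A):

```python
import numpy as np

# Minimal MLEM update (the single-subset case of OSEM) for a toy linear
# emission model y = A x. The system matrix and activities are invented.

def mlem(A, y, n_iter=50):
    x = np.ones(A.shape[1])                   # flat initial estimate
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)  # measured / modelled counts
        x *= (A.T @ ratio) / sens             # multiplicative EM update
    return x

A = np.array([[1.0, 0.5],
              [0.2, 1.0]])
x_true = np.array([3.0, 2.0])
y = A @ x_true                # noiseless projections
x_hat = mlem(A, y)
print(x_hat)                  # converges toward x_true
```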
NASA Astrophysics Data System (ADS)
Berger, D.; Nissen, J.
2018-01-01
The studies in this paper are part of systematic investigations of the lateral analytical resolution of the field emission electron microprobe JEOL JXA-8530F, with a focus on the quantitative lateral resolution that is achieved in practice. The approach is to determine the minimum thickness of a metallic layer for which an accurate quantitative element analysis in cross-section is still possible. Previous measurements were accomplished on sputtered gold (Z = 79) layers, where a lateral resolution in the range of 140 to 170 nm was achieved at suitable parameters of the microprobe. To study the Z-dependence of the lateral resolution, aluminium (Z = 13) and silver (Z = 47) layers with different thicknesses were now generated by evaporation and subsequently prepared in cross-section by use of a focussed Ga-ion beam (FIB). Each layer was analysed quantitatively with different electron energies. The thinnest layer which can be resolved specifies the best lateral resolution. These measured values were compared with Monte Carlo simulations on the one hand and with predictions from formulas in the literature on the other. The measurements agree well with the simulated and calculated values, except those at the lowest primary electron energies with an overvoltage below ~2. The reason for this discrepancy is not yet clear and has to be clarified by further investigations. The results apply to any microanalyser, even with energy-dispersive X-ray spectrometry (EDS) detection, if the probe diameters at suitable analysing parameters, which might deviate from those of the JEOL JXA-8530F, are considered.
A novel mesh processing based technique for 3D plant analysis
2012-01-01
Background: In recent years, imaging based, automated, non-invasive, and non-destructive high-throughput plant phenotyping platforms have become popular tools for plant biology, underpinning the field of plant phenomics. Such platforms acquire and record large amounts of raw data that must be accurately and robustly calibrated, reconstructed, and analysed, requiring the development of sophisticated image understanding and quantification algorithms. The raw data can be processed in different ways, and the past few years have seen the emergence of two main approaches: 2D image processing and 3D mesh processing algorithms. Direct image quantification methods (usually 2D) dominate the current literature due to their comparative simplicity. However, 3D mesh analysis provides tremendous potential to accurately estimate specific morphological features cross-sectionally and monitor them over time. Results: In this paper, we present a novel 3D mesh based technique developed for temporal high-throughput plant phenomics and perform initial tests for the analysis of Gossypium hirsutum vegetative growth. Based on plant meshes previously reconstructed from multi-view images, the methodology involves several stages, including morphological mesh segmentation, phenotypic parameter estimation, and tracking of plant organs over time. The initial study focuses on presenting and validating the accuracy of the methodology on dicotyledons such as cotton, but we believe the approach will be more broadly applicable. This study involved applying our technique to a set of six Gossypium hirsutum (cotton) plants studied over four time-points. Manual measurements, performed for each plant at every time-point, were used to assess the accuracy of our pipeline and quantify the error on the morphological parameters estimated.
Conclusion: By directly comparing our automated mesh based quantitative data with manual measurements of individual stem height, leaf width and leaf length, we obtained mean absolute errors of 9.34%, 5.75% and 8.78%, and correlation coefficients of 0.88, 0.96 and 0.95, respectively. The temporal matching of leaves was accurate in 95% of the cases, and the average execution time required to analyse a plant over four time-points was 4.9 minutes. The mesh processing based methodology is thus considered suitable for quantitative 4D monitoring of plant phenotypic features. PMID:22553969
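The validation statistics reported above (mean absolute percentage error and Pearson correlation against manual measurements) amount to the following arithmetic; the measurement values here are invented, not the study's data:

```python
import math
import statistics

# Validation arithmetic sketch: mean absolute percentage error and Pearson
# correlation between automated and manual measurements (values invented).

def mape(auto, manual):
    return 100.0 * statistics.mean(
        abs(a - m) / m for a, m in zip(auto, manual))

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

manual = [10.0, 12.0, 15.0, 11.0]   # e.g. manual stem heights (cm)
auto = [10.5, 11.5, 16.0, 10.5]     # corresponding automated estimates
err = mape(auto, manual)
r = pearson_r(auto, manual)
print(err, r)
```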
Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T
2016-05-30
Quality control (QC) in the pharmaceutical industry is a key activity in ensuring medicines have the required quality, safety and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products and also of all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (APIs) [bisoprolol, hydrochlorothiazide] in different commercially available dosages were analysed using Raman and NIR spectroscopy. The goal was to define multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and predict their dosage (semi-quantitative). Furthermore, the combination of spectroscopic techniques was investigated. To this end, two different multiblock techniques based on PLS were applied: multiblock PLS (MB-PLS) and sequentially orthogonalised PLS (SO-PLS). NIRS showed better results than Raman spectroscopy for both identification and quantitation. The multiblock techniques investigated showed that each spectroscopy contains information not present or captured in the other spectroscopic technique, thus demonstrating that there is a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.
Dynamics of spiral waves rotating around an obstacle and the existence of a minimal obstacle
NASA Astrophysics Data System (ADS)
Gao, Xiang; Feng, Xia; Li, Teng-Chao; Qu, Shixian; Wang, Xingang; Zhang, Hong
2017-05-01
Pinning of vortices by obstacles plays an important role in various systems. In the heart, anatomical reentry is created when a vortex, also known as the spiral wave, is pinned to an anatomical obstacle, leading to a class of physiologically very important arrhythmias. Previous analyses of its dynamics and instability provide fine estimates in some special circumstances, such as large obstacles or weak excitabilities. Here, to expand theoretical analyses to all circumstances, we propose a general theory whose results quantitatively agree with direct numerical simulations. In particular, when obstacles are small and pinned spiral waves are destabilized, an accurate explanation of the instability in two-dimensional media is provided by the usage of a mapping rule and dimension reduction. The implications of our results are to better understand the mechanism of arrhythmia and thus improve its early prevention.
Stochasticity in staged models of epidemics: quantifying the dynamics of whooping cough
Black, Andrew J.; McKane, Alan J.
2010-01-01
Although many stochastic models can accurately capture the qualitative epidemic patterns of many childhood diseases, there is still considerable discussion concerning the basic mechanisms generating these patterns; much of this stems from the use of deterministic models to try to understand stochastic simulations. We argue that a systematic method of analysing models of the spread of childhood diseases is required in order to consistently separate out the effects of demographic stochasticity, external forcing and modelling choices. Such a technique is provided by formulating the models as master equations and using the van Kampen system-size expansion to provide analytical expressions for quantities of interest. We apply this method to the susceptible–exposed–infected–recovered (SEIR) model with distributed exposed and infectious periods and calculate the form that stochastic oscillations take on in terms of the model parameters. With the use of a suitable approximation, we apply the formalism to analyse a model of whooping cough which includes seasonal forcing. This allows us to more accurately interpret the results of simulations and to make a more quantitative assessment of the predictions of the model. We show that the observed dynamics are a result of a macroscopic limit cycle induced by the external forcing and resonant stochastic oscillations about this cycle. PMID:20164086
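A master-equation model of this kind can be simulated exactly with the Gillespie algorithm; a minimal stochastic SEIR sketch with demographic turnover (parameter values are illustrative, and the paired birth/death bookkeeping that keeps N constant is a simplifying assumption, not the paper's exact formulation):

```python
import random

def gillespie_seir(beta, sigma, gamma, mu, N, S, E, I, t_end, seed=1):
    """Event-driven (Gillespie) simulation of a stochastic SEIR model."""
    rng = random.Random(seed)
    t, R = 0.0, N - S - E - I
    traj = [(t, I)]
    while t < t_end:
        rates = [beta * S * I / N,  # infection: S -> E
                 sigma * E,         # end of latency: E -> I
                 gamma * I,         # recovery: I -> R
                 mu * N]            # birth of a susceptible (paired with a death below)
        total = sum(rates)
        if total == 0:
            break
        t += rng.expovariate(total)
        r = rng.uniform(0, total)
        if r < rates[0]:
            S, E = S - 1, E + 1
        elif r < rates[0] + rates[1]:
            E, I = E - 1, I + 1
        elif r < rates[0] + rates[1] + rates[2]:
            I, R = I - 1, R + 1
        else:  # birth balanced by a random death keeps N constant
            S += 1
            dead = rng.choices(["S", "E", "I", "R"], weights=[S, E, I, R])[0]
            if dead == "S": S -= 1
            elif dead == "E": E -= 1
            elif dead == "I": I -= 1
            else: R -= 1
        traj.append((t, I))
    return traj

traj = gillespie_seir(beta=1.2, sigma=0.2, gamma=0.3, mu=0.01,
                      N=1000, S=900, E=10, I=10, t_end=20)
```

The demographic noise that drives the resonant oscillations discussed in the abstract is exactly what the event-by-event randomness of this simulation supplies.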
Vesicular stomatitis forecasting based on Google Trends
Lu, Yi; Zhou, GuangYa; Chen, Qin
2018-01-01
Background Vesicular stomatitis (VS) is an important viral disease of livestock. Its main feature is irregular blisters on the lips, tongue, oral mucosa, hoof crown and nipple. Humans can also be infected with vesicular stomatitis and develop meningitis. This study analyses the 2014 VS outbreaks in the United States in order to accurately predict VS outbreak trends. Methods American VS outbreak data were collected from the OIE. Data for VS keywords were obtained by entering 24 disease-related keywords into Google Trends. Pearson and Spearman correlation coefficients showed a relationship between the outbreaks and the keywords derived from Google Trends. Finally, predictive models were constructed based on qualitative classification and quantitative regression. Results For the regression model, the Pearson correlation coefficients between predicted and actual outbreaks were 0.953 and 0.948, respectively. For the qualitative classification model, we constructed five classification predictive models and selected the best as the result; its SN (sensitivity), SP (specificity) and ACC (prediction accuracy) values were 78.52%, 72.5% and 77.14%, respectively. Conclusion This study applied Google search data to construct a qualitative classification model and a quantitative regression model. The results show that the method is effective and that the two models yield more accurate forecasts. PMID:29385198
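The keyword-screening step described in Methods reduces to computing Pearson and Spearman correlations between the outbreak series and each keyword's search-volume series; a pure-Python sketch (both series below are invented for illustration):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sqrt(sum((a - mx) ** 2 for a in x)) *
                  sqrt(sum((b - my) ** 2 for b in y)))

def ranks(v):
    """Average ranks (1-based), ties sharing the mean of their positions."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# Hypothetical weekly outbreak counts vs Google Trends volume for one keyword
outbreaks = [0, 2, 5, 9, 14, 7, 3]
trends = [11, 15, 30, 48, 70, 40, 22]
print(pearson(outbreaks, trends), spearman(outbreaks, trends))
```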
Li, Qiang; Qiu, Tian; Hao, Hongxia; Zhou, Hong; Wang, Tongzhou; Zhang, Ye; Li, Xin; Huang, Guoliang; Cheng, Jing
2012-04-07
A deep ultraviolet-visible (DUV-Vis) reflected optical fiber sensor was developed for use in a simple spectrophotometric detection system to detect the absorption of various illegal drugs at wavelengths between 180 and 800 nm. Quantitative analyses performed using the sensor revealed a high specificity and sensitivity for drug detection at a wavelength of approximately 200 nm. Using a double-absorption optical path length, extremely small sample volumes were used (32 to 160 nL), which allowed the use of minimal amounts of samples. A portable spectrophotometric system was established based on our optical fiber sensor for the on-site determination and quantitative analysis of common illegal drugs, such as 3,4-methylenedioxymethamphetamine (MDMA), ketamine hydrochloride, cocaine hydrochloride, diazepam, phenobarbital, and barbital. By analyzing the absorbance spectra, six different drugs were quantified at concentrations that ranged from 0.1 to 1000 μg mL⁻¹ (16 pg-0.16 μg). A novel Matching Algorithm of Spectra Space (MASS) was used to accurately distinguish between each drug in a mixture. As an important supplement to traditional methods, such as mass spectrometry or chromatography, our optical fiber sensor offers rapid and low-cost on-site detection using trace amounts of sample. This rapid and accurate analytical method has wide-ranging applications in forensic science, law enforcement, and medicine.
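The MASS algorithm itself is not specified in the abstract; as a generic stand-in, spectral identification is often done by nearest cosine similarity against a reference library. A minimal sketch (the 5-point spectra below are invented, not real drug data):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two spectra sampled on the same wavelength grid."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def identify(spectrum, library):
    """Return the library entry whose reference spectrum best matches the measurement."""
    return max(library, key=lambda name: cosine_similarity(spectrum, library[name]))

# Hypothetical absorbance spectra (arbitrary units)
library = {
    "MDMA": [0.9, 0.7, 0.2, 0.1, 0.05],
    "ketamine": [0.3, 0.8, 0.9, 0.4, 0.1],
}
measured = [0.85, 0.72, 0.25, 0.12, 0.06]
print(identify(measured, library))  # MDMA
```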
75 FR 29537 - Draft Transportation Conformity Guidance for Quantitative Hot-spot Analyses in PM2.5
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-26
... Quantitative Hot-spot Analyses in PM 2.5 and PM 10 Nonattainment and Maintenance Areas AGENCY: Environmental... finalized, this guidance would help state and local agencies complete quantitative PM 2.5 and PM 10 hot-spot...), EPA stated that quantitative PM 2.5 and PM 10 hot-spot analyses would not be required until EPA...
Reliable enumeration of malaria parasites in thick blood films using digital image analysis.
Frean, John A
2009-09-23
Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve the performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle-counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration, since its requirements of a digital microscope camera, a personal computer and good-quality staining of slides are relatively easy to meet.
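The particle-counting core of such an analysis reduces to intensity thresholding followed by connected-component labelling; a minimal sketch (4-connectivity, toy image, not the paper's actual software):

```python
def count_particles(image, threshold):
    """Count connected bright regions (4-connectivity) at or above a threshold."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                count += 1
                stack = [(y, x)]          # flood-fill the whole region
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w and
                                image[ny][nx] >= threshold and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

img = [[0, 9, 0, 0],
       [0, 9, 0, 8],
       [0, 0, 0, 8],
       [7, 0, 0, 0]]
print(count_particles(img, 5))  # 3 separate regions
```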
New atom probe approaches to studying segregation in nanocrystalline materials.
Samudrala, S K; Felfer, P J; Araullo-Peters, V J; Cao, Y; Liao, X Z; Cairney, J M
2013-09-01
Atom probe is a technique that is highly suited to the study of nanocrystalline materials. It can provide accurate atomic-scale information about the composition of grain boundaries in three dimensions. In this paper we have analysed the microstructure of a nanocrystalline super-duplex stainless steel prepared by high pressure torsion (HPT). Not all of the grain boundaries in this alloy display obvious segregation, making visualisation of the microstructure challenging. In addition, the grain boundaries present in the atom probe data acquired from this alloy have complex shapes that are curved at the scale of the dataset and the interfacial excess varies considerably over the boundaries, making the accurate characterisation of the distribution of solute challenging using existing analysis techniques. In this paper we present two new data treatment methods that allow the visualisation of boundaries with little or no segregation, the delineation of boundaries for further analysis and the quantitative analysis of Gibbsian interfacial excess at boundaries, including the capability of excess mapping. Copyright © 2013 Elsevier B.V. All rights reserved.
Shi, Xu; Gao, Weimin; Chao, Shih-hui
2013-01-01
Directly monitoring the stress response of microbes to their environments could be one way to inspect the health of microorganisms themselves, as well as the environments in which the microorganisms live. The ultimate resolution for such an endeavor could be down to a single-cell level. In this study, using the diatom Thalassiosira pseudonana as a model species, we aimed to measure gene expression responses of this organism to various stresses at a single-cell level. We developed a single-cell quantitative real-time reverse transcription-PCR (RT-qPCR) protocol and applied it to determine the expression levels of multiple selected genes under nitrogen, phosphate, and iron depletion stress conditions. The results, for the first time, provided a quantitative measurement of gene expression at single-cell levels in T. pseudonana and demonstrated that significant gene expression heterogeneity was present within the cell population. In addition, different expression patterns between single-cell- and bulk-cell-based analyses were also observed for all genes assayed in this study, suggesting that cell response heterogeneity needs to be taken into consideration in order to obtain accurate information that indicates the environmental stress condition. PMID:23315741
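The abstract does not give the quantification formula; the widely used Livak 2^-ΔΔCt method for RT-qPCR relative expression is sketched here under that assumption (Ct values below are invented):

```python
def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Livak 2^-ddCt relative quantification: the target gene's Ct is normalised
    to a reference gene, then compared against a calibrator sample."""
    ddct = (ct_target - ct_ref) - (ct_target_cal - ct_ref_cal)
    return 2 ** (-ddct)

# Hypothetical Ct values: stressed cell vs unstressed calibrator cell
fold_change = relative_expression(ct_target=25.0, ct_ref=20.0,
                                  ct_target_cal=24.0, ct_ref_cal=20.0)
print(fold_change)  # 0.5: expression halved under stress
```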
Cuvelier, Daphne; de Busserolles, Fanny; Lavaud, Romain; Floc'h, Estelle; Fabri, Marie-Claire; Sarradin, Pierre-Marie; Sarrazin, Jozée
2012-12-01
In the past few decades, hydrothermal vent research has progressed immensely, resulting in higher-quality samples and long-term studies. With time, scientists are becoming more aware of the impacts of sampling on the faunal communities and are looking for less invasive ways to investigate the vent ecosystems. In this perspective, imagery analysis plays a very important role. With this study, we test which factors can be quantitatively and accurately assessed based on imagery, through comparison with faunal sampling. Twelve instrumented chains were deployed on the Atlantic Eiffel Tower hydrothermal edifice and the corresponding study sites were subsequently sampled. Discrete, quantitative samples were compared to the imagery recorded during the experiment. An observer effect was tested by comparing imagery data gathered by different scientists. Most factors based on image analyses concerning Bathymodiolus azoricus mussels were shown to be valid representations of the corresponding samples. Additional ecological descriptors, based exclusively on imagery, were included. Copyright © 2012 Elsevier Ltd. All rights reserved.
Hong, Jungeui; Gresham, David
2017-11-01
Quantitative analysis of next-generation sequencing (NGS) data requires discriminating duplicate reads generated by PCR from identical molecules that are of unique origin. Typically, PCR duplicates are identified as sequence reads that align to the same genomic coordinates using reference-based alignment. However, identical molecules can be independently generated during library preparation. Misidentification of these molecules as PCR duplicates can introduce unforeseen biases during analyses. Here, we developed a cost-effective sequencing adapter design by modifying Illumina TruSeq adapters to incorporate a unique molecular identifier (UMI) while maintaining the capacity to undertake multiplexed, single-index sequencing. Incorporation of UMIs into TruSeq adapters (TrUMIseq adapters) enables identification of bona fide PCR duplicates as identically mapped reads with identical UMIs. Using TrUMIseq adapters, we show that accurate removal of PCR duplicates results in improved accuracy of both allele frequency (AF) estimation in heterogeneous populations using DNA sequencing and gene expression quantification using RNA-Seq.
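With UMIs in the adapter, deduplication reduces to keying reads on mapping coordinates plus UMI rather than coordinates alone; a minimal sketch (the tuple layout is an assumption for illustration, not the TrUMIseq pipeline):

```python
def remove_pcr_duplicates(reads):
    """Keep one read per (chrom, pos, strand, UMI) key: identical coordinates with
    identical UMIs are PCR duplicates; identical coordinates with *different*
    UMIs are independent molecules and are retained."""
    seen = set()
    unique = []
    for read in reads:  # read = (chrom, pos, strand, umi, sequence)
        key = read[:4]
        if key not in seen:
            seen.add(key)
            unique.append(read)
    return unique

reads = [
    ("chr1", 100, "+", "ACGT", "AAAA"),
    ("chr1", 100, "+", "ACGT", "AAAA"),  # PCR duplicate: same position and UMI
    ("chr1", 100, "+", "TTGA", "AAAA"),  # same position, new UMI: kept
]
print(len(remove_pcr_duplicates(reads)))  # 2
```

Note how coordinate-only deduplication would collapse all three reads to one, which is exactly the allele-frequency bias the adapters are designed to avoid.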
A novel 3D imaging system for strawberry phenotyping.
He, Joe Q; Harrison, Richard J; Li, Bo
2017-01-01
Accurate and quantitative phenotypic data are vital in plant breeding programmes to assess the performance of genotypes and to make selections. Traditional strawberry phenotyping relies on the human eye to assess most external fruit quality attributes, which is time-consuming and subjective. 3D imaging is a promising high-throughput technique that allows multiple external fruit quality attributes to be measured simultaneously. A low-cost multi-view stereo (MVS) imaging system was developed, which captured data from 360° around a target strawberry fruit. A 3D point cloud of the sample was derived and analysed with custom-developed software to estimate berry height, length, width, volume, calyx size, colour and achene number. Analysis of these traits in 100 fruits showed good concordance with manual assessment methods. This study demonstrates the feasibility of an MVS-based 3D imaging system for the rapid and quantitative phenotyping of seven agronomically important external strawberry traits. With further improvement, this method could be applied in strawberry breeding programmes as a cost-effective phenotyping technique.
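Once a point cloud is in hand, the size traits reduce to simple geometry; a sketch using axis-aligned extents and an ellipsoid volume approximation (the custom software's actual estimators are not specified in the abstract):

```python
from math import pi

def berry_dimensions(points):
    """Height, length and width from the axis-aligned extent of a fruit point
    cloud, plus an ellipsoid volume approximation V = (pi/6) * h * l * w."""
    xs, ys, zs = zip(*points)
    length = max(xs) - min(xs)
    width = max(ys) - min(ys)
    height = max(zs) - min(zs)
    volume = pi / 6 * height * length * width
    return height, length, width, volume

# Hypothetical point cloud spanning a 2 x 3 x 4 (arbitrary units) region
cloud = [(0, 0, 0), (2, 0, 0), (0, 3, 0), (0, 0, 4), (2, 3, 4)]
print(berry_dimensions(cloud))
```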
NASA Technical Reports Server (NTRS)
Smedes, H. W. (Principal Investigator); Root, R. R.; Roller, N. E. G.; Despain, D.
1978-01-01
The author has identified the following significant results. A terrain map of Yellowstone National Park showed plant community types and other classes of ground cover in what is basically a wild land. The map comprised 12 classes, six of which were mapped with accuracies of 70 to 95%. The remaining six classes had spectral reflectances that overlapped appreciably and hence were mapped less accurately. Techniques were devised for quantitatively comparing the recognition map of the park with control data acquired from ground inspection and from analysis of side-looking radar images, a thermal IR mosaic, and IR aerial photos at several scales. Quantitative analyses were made in ten 40 sq km test areas. The comparisons were performed by computer, with the final results displayed on line-printer output. Forested areas were mapped by computer using ERTS data for less than one-quarter the cost of the conventional forest mapping technique for topographic base maps.
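The map-versus-control comparison amounts to per-class accuracy over the test areas; a minimal sketch with invented labels:

```python
from collections import defaultdict

def per_class_accuracy(mapped, ground_truth):
    """Fraction of control pixels of each ground class that the recognition
    map labelled correctly."""
    total = defaultdict(int)
    correct = defaultdict(int)
    for m, g in zip(mapped, ground_truth):
        total[g] += 1
        if m == g:
            correct[g] += 1
    return {c: correct[c] / total[c] for c in total}

# Hypothetical per-pixel labels from control data and from the recognition map
ground = ["forest", "forest", "meadow", "forest", "meadow"]
mapped = ["forest", "meadow", "meadow", "forest", "meadow"]
print(per_class_accuracy(mapped, ground))
```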
Li, Xiuying; Yang, Qiwei; Bai, Jinping; Xuan, Yali; Wang, Yimin
2015-01-01
Normalization to a reference gene is the method of choice for quantitative reverse transcription-PCR (RT-qPCR) analysis. The stability of reference genes is critical for accurate experimental results and conclusions. We evaluated the expression stability of eight commonly used reference genes in four different human mesenchymal stem cell (MSC) types. Using the geNorm, NormFinder and BestKeeper algorithms, we show that beta-2-microglobulin and peptidyl-prolyl isomerase A were the optimal reference genes for normalizing RT-qPCR data obtained from MSC, whereas the TATA box binding protein was not suitable owing to its extensive variability in expression. Our findings emphasize the importance of validating reference genes for qPCR analyses. We offer a short list of reference genes to use for normalization and recommend some commercially available software programs as a rapid approach to validate reference genes. We also demonstrate that two frequently used reference genes, β-actin and glyceraldehyde-3-phosphate dehydrogenase, are not always suitable.
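A geNorm-style stability measure M (the mean standard deviation of pairwise log2 expression ratios; lower M means a more stable reference gene) can be sketched as follows (a simplified reading of the algorithm, not the published implementation):

```python
from math import log2
from statistics import stdev

def gene_stability(expr):
    """geNorm-style stability M for each candidate reference gene.
    expr maps gene name -> expression values across the same samples."""
    genes = list(expr)
    M = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            # A stable pair has a near-constant expression ratio across samples
            ratios = [log2(a / b) for a, b in zip(expr[g], expr[h])]
            sds.append(stdev(ratios))
        M[g] = sum(sds) / len(sds)
    return M

# Hypothetical expression values for three candidate genes over three samples
expr = {"A": [1.0, 2.0, 4.0], "B": [2.0, 4.0, 8.0], "C": [1.0, 3.0, 9.0]}
print(gene_stability(expr))  # C varies relative to A and B, so M["C"] is largest
```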
A field instrument for quantitative determination of beryllium by activation analysis
Vaughn, William W.; Wilson, E.E.; Ohm, J.M.
1960-01-01
A low-cost instrument has been developed for quantitative determination of beryllium in the field by activation analysis. The instrument makes use of the gamma-neutron reaction between gammas emitted by an artificially radioactive source (Sb-124) and beryllium as it occurs in nature. The instrument and power source are mounted in a panel-type vehicle. Samples are prepared by hand-crushing the rock to approximately ¼-inch mesh size and smaller. Sample volumes are kept constant by means of a standard measuring cup. Instrument calibration, made using standards of known BeO content, indicates the analyses are reproducible and accurate to within ±0.25 percent BeO in the range from 1 to 20 percent BeO with a sample counting time of 5 minutes. The sensitivity of the instrument may be increased somewhat by increasing the source size or the sample size, or by enlarging the cross-sectional area of the neutron-sensitive phosphor normal to the neutron flux.
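Calibration against standards of known BeO content is a linear fit of counts versus concentration, inverted for unknowns; a sketch with invented count data (not the instrument's actual calibration):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y)) /
             sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical calibration: net neutron counts per 5 min vs percent BeO in standards
pct_beo = [1.0, 5.0, 10.0, 20.0]
counts = [220.0, 1050.0, 2100.0, 4180.0]
slope, intercept = fit_line(pct_beo, counts)

def estimate_beo(sample_counts):
    """Invert the calibration line to read percent BeO from a field count."""
    return (sample_counts - intercept) / slope

print(estimate_beo(1500.0))
```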
Scoring systems for the Clock Drawing Test: A historical review
Spenciere, Bárbara; Alves, Heloisa; Charchat-Fichman, Helenice
2017-01-01
The Clock Drawing Test (CDT) is a simple neuropsychological screening instrument that is well accepted by patients and has solid psychometric properties. Several different CDT scoring methods have been developed, but no consensus has been reached regarding which scoring method is the most accurate. This article reviews the literature on these scoring systems and the changes they have undergone over the years. Historically, different types of scoring systems emerged. Initially, the focus was on screening for dementia, and the methods were both quantitative and semi-quantitative. Later, the need for an early diagnosis called for a scoring system that can detect subtle errors, especially those related to executive function. Therefore, qualitative analyses began to be used for both differential and early diagnoses of dementia. A widely used qualitative method was proposed by Rouleau et al. (1992). Tracing the historical path of these scoring methods is important for developing additional scoring systems and furthering dementia prevention research. PMID:29213488
Miao, Yu; Wang, Cheng-long; Yin, Hui-jun; Shi, Da-zhuo; Chen, Ke-ji
2005-04-18
To establish a method for the quantitative determination of adenosine phosphates in rat myocardium by optimized high-performance liquid chromatography (HPLC), an ODS HYPERSIL C(18) column and a mobile phase of 50 mmol/L tribasic potassium phosphate buffer solution (pH 6.5) were used, with UV detection at 254 nm. The average recovery rates of myocardial adenosine triphosphate (ATP), adenosine diphosphate (ADP) and adenosine monophosphate (AMP) were 99%-107%, 96%-104% and 95%-119%, respectively; the within-day and between-day relative standard deviations (RSDs) were less than 1.5% and 5.1%, respectively. The method is simple, rapid and accurate, and can be used to analyse adenosine phosphates in myocardium.
2017-06-29
Accurate Virus Quantitation Using a Scanning Transmission Electron Microscopy (STEM) Detector in a Scanning Electron Microscope
Blancett, Candace D; ... Norris, L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G
Pathology Division, United States Army Medical Research Institute of Infectious Diseases (USAMRIID), 1425 Porter Street, Fort Detrick, Maryland, 21702
Quantifying the chemical composition of soil organic carbon with solid-state 13C NMR
NASA Astrophysics Data System (ADS)
Baldock, J. A.; Sanderman, J.
2011-12-01
The vulnerability of soil organic carbon (SOC) to biological decomposition and mineralisation to CO2 is defined at least partially by its chemical composition. Highly aromatic charcoal-like SOC components are more stable to biological decomposition than other forms of carbon, including cellulose. Solid-state 13C NMR has gained wide acceptance as a method capable of defining SOC chemical composition, and mathematical fitting processes have been developed to estimate biochemical composition. Obtaining accurate estimates depends on an ability to quantitatively detect all carbon present in a sample. Often little attention has been paid to defining the proportion of organic carbon present in a soil that is observable in solid-state 13C NMR analyses of soil samples. However, if such data are to be used to inform carbon cycling studies, it is critical that quantitative assessments of SOC observability be undertaken. For example, it is now well established that a significant discrimination exists against the detection of the low proton content polyaromatic structures typical of charcoal using cross polarisation 13C NMR analyses. Such discrimination does not exist where direct polarisation analyses are completed. In this study, the chemical composition of SOC as defined by cross polarisation and direct polarisation 13C NMR analyses will be compared for Australian soils collected from under a diverse range of agricultural managements and climatic conditions. Results indicate that where significant charcoal C contents exist, this carbon is highly under-represented in the acquired CP spectra. For some soils, a discrimination against alkyl carbon was also evident. The ability to derive correction factors to compensate for such discriminations will be assessed and presented.
Tichauer, Kenneth M.; Wang, Yu; Pogue, Brian W.; Liu, Jonathan T. C.
2015-01-01
The development of methods to accurately quantify cell-surface receptors in living tissues would have a seminal impact in oncology. For example, accurate measures of receptor density in vivo could enhance early detection or surgical resection of tumors via protein-based contrast, allowing removal of cancer with high phenotype specificity. Alternatively, accurate receptor expression estimation could be used as a biomarker to guide patient-specific clinical oncology targeting of the same molecular pathway. Unfortunately, conventional molecular contrast-based imaging approaches are not well adapted to accurately estimating the nanomolar-level cell-surface receptor concentrations in tumors, as most images are dominated by nonspecific sources of contrast such as high vascular permeability and lymphatic inhibition. This article reviews approaches for overcoming these limitations based upon tracer kinetic modeling and the use of emerging protocols to estimate binding potential and the related receptor concentration. Methods such as using single time point imaging or a reference-tissue approach tend to have low accuracy in tumors, whereas paired-agent methods or advanced kinetic analyses are more promising to eliminate the dominance of interstitial space in the signals. Nuclear medicine and optical molecular imaging are the primary modalities used, as they have the nanomolar level sensitivity needed to quantify cell-surface receptor concentrations present in tissue, although each likely has a different clinical niche. PMID:26134619
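In the simplest paired-agent reading described above, the untargeted agent accounts for nonspecific uptake and the excess targeted retention estimates binding potential; a deliberately minimal sketch (real protocols fit full kinetic models rather than a single ratio):

```python
def paired_agent_bp(targeted, control):
    """Paired-agent binding potential estimate: with matched delivery, the
    untargeted-agent signal captures nonspecific uptake, and the excess
    targeted-agent retention approximates BP (bound/free ratio)."""
    return targeted / control - 1.0

# Hypothetical late-time-point tissue concentrations (arbitrary units)
bp = paired_agent_bp(targeted=3.0, control=1.2)
print(bp)
```

When the two agents distribute identically except for specific binding, nonspecific contrast (vascular permeability, interstitial space) cancels in the ratio, which is the point the review makes about why paired-agent methods outperform single-tracer imaging in tumors.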
Halogen Bonding versus Hydrogen Bonding: A Molecular Orbital Perspective
Wolters, Lando P; Bickelhaupt, F Matthias
2012-01-01
We have carried out extensive computational analyses of the structure and bonding mechanism in trihalides DX⋅⋅⋅A− and the analogous hydrogen-bonded complexes DH⋅⋅⋅A− (D, X, A=F, Cl, Br, I) using relativistic density functional theory (DFT) at zeroth-order regular approximation ZORA-BP86/TZ2P. One purpose was to obtain a set of consistent data from which reliable trends in structure and stability can be inferred over a large range of systems. The main objective was to achieve a detailed understanding of the nature of halogen bonds, how they resemble, and also how they differ from, the better understood hydrogen bonds. Thus, we present an accurate physical model of the halogen bond based on quantitative Kohn–Sham molecular orbital (MO) theory, energy decomposition analyses (EDA) and Voronoi deformation density (VDD) analyses of the charge distribution. It appears that the halogen bond in DX⋅⋅⋅A− arises not only from classical electrostatic attraction but also receives substantial stabilization from HOMO–LUMO interactions between the lone pair of A− and the σ* orbital of D–X. PMID:24551497
Uitto, J; Paul, J L; Brockley, K; Pearce, R H; Clark, J G
1983-10-01
The elastic fibers in the skin and other organs can be affected in several disease processes. In this study, we have developed morphometric techniques that allow accurate quantitation of the elastic fibers in punch biopsy specimens of skin. In this procedure, the elastic fibers, visualized by elastin-specific stains, are examined through a camera unit attached to the microscope. The black and white images sensing various gray levels are then converted to binary images after selecting a threshold with an analog threshold selection device. The binary images are digitized and the data analyzed by a computer program designed to express the properties of the image, thus allowing determination of the volume fraction occupied by the elastic fibers. As an independent measure of the elastic fibers, alternate tissue sections were used for assay of desmosine, an elastin-specific cross-link compound, by a radioimmunoassay. The clinical applicability of the computerized morphometric analyses was tested by examining the elastic fibers in the skin of five patients with pseudoxanthoma elasticum or Buschke-Ollendorff syndrome. In the skin of 10 healthy control subjects, the elastic fibers occupied 2.1 +/- 1.1% (mean +/- SD) of the dermis. The volume fractions occupied by the elastic fibers in the lesions of pseudoxanthoma elasticum or Buschke-Ollendorff syndrome were increased as much as 6-fold, whereas the values in the unaffected areas of the skin in the same patients were within normal limits. A significant correlation between the volume fraction of elastic fibers, determined by computerized morphometric analyses, and the concentration of desmosine, quantitated by radioimmunoassay, was noted in the total material. These results demonstrate that computerized morphometric techniques are helpful in characterizing disease processes affecting skin. This methodology should also be applicable to other tissues that contain elastic fibers and that are affected in various heritable and acquired diseases.
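The volume-fraction step reduces to the area fraction of suprathreshold pixels in the binarised section; a minimal sketch with a toy image (threshold choice in the study was made with an analog device, not in software):

```python
def volume_fraction(gray, threshold):
    """Stereological volume-fraction estimate: the area fraction of pixels at
    or above the elastin-stain threshold in a 2D section."""
    pixels = [p for row in gray for p in row]
    return sum(1 for p in pixels if p >= threshold) / len(pixels)

# Hypothetical 3x4 grayscale section: bright pixels are stained elastic fibers
section = [
    [10, 200, 15, 12],
    [11, 210, 190, 14],
    [13, 12, 11, 10],
]
print(volume_fraction(section, 128))  # 3 of 12 pixels -> 0.25
```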
A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS
While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...
[A new method of processing quantitative PCR data].
Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun
2003-05-01
Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. Through extensive kinetic studies, Perkin-Elmer (PE) found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700 instruments. The error of that technique, however, is too great for many applications, and a better quantitative PCR method is needed. The mathematical model presented here draws on related work and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this relation: the accumulated product quantity can be obtained from the initial template number. With this model, the error of the result depends only on the accuracy of the measured fluorescence intensity, that is, on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the template number is between 100 and 1,000,000, the accuracy of the quantitative result exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with this model yields results about 80 times more accurate than the CT method.
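For comparison, the conventional standard-curve (Ct) quantification that the authors benchmark against can be sketched as follows (the dilution-series values are illustrative; a slope near -3.32 in Ct versus log10 copies corresponds to ~100% amplification efficiency):

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares line Ct = slope * log10(N0) + intercept from a dilution series."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts)) /
             sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate the initial template number."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold dilution series with near-perfect efficiency
log_n0 = [2, 3, 4, 5, 6]
cts = [30.0, 26.68, 23.36, 20.04, 16.72]
slope, intercept = fit_standard_curve(log_n0, cts)
print(copies_from_ct(25.0, slope, intercept))
```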
Microlensing for extrasolar planets : improving the photometry
NASA Astrophysics Data System (ADS)
Bajek, David J.
2013-08-01
Gravitational Microlensing, as a technique for detecting Extrasolar Planets, is recognised for its potential in discovering small-mass planets similar to Earth, at a distance of a few Astronomical Units from their host stars. However, analysing the data from microlensing events (which statistically rarely reveal planets) is complex and requires continued and intensive use of various networks of telescopes working together in order to observe the phenomenon. As such the techniques are constantly being developed and refined; this project outlines some steps of the careful analysis required to model an event and ensure the best quality data is used in the fitting. A quantitative investigation into increasing the quality of the original photometric data available from any microlensing event demonstrates that 'lucky imaging' can lead to a marked improvement in the signal to noise ratio of images over standard imaging techniques, which could result in more accurate models and thus the calculation of more accurate planetary parameters. In addition, a simulation illustrating the effects of atmospheric turbulence on exposures was created, and expanded upon to give an approximation of the lucky imaging technique. This further demonstrated the advantages of lucky images which are shown to potentially approach the quality of those expected from diffraction limited photometry. The simulation may be further developed for potential future use as a 'theoretical lucky imager' in our research group, capable of producing and analysing synthetic exposures through customisable conditions.
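The lucky-imaging selection itself can be sketched as ranking short exposures by a sharpness metric and stacking the best fraction (variance-based sharpness is an assumption for illustration; practical pipelines use Strehl-ratio-like metrics around a reference star):

```python
def sharpness(frame):
    """Variance of pixel intensities; sharper (less blurred) frames score higher."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def lucky_stack(frames, keep_fraction=0.1):
    """Keep the sharpest fraction of short exposures and average them."""
    keep = max(1, int(len(frames) * keep_fraction))
    best = sorted(frames, key=sharpness, reverse=True)[:keep]
    h, w = len(best[0]), len(best[0][0])
    return [[sum(f[y][x] for f in best) / keep for x in range(w)]
            for y in range(h)]

# Toy 2x2 frames: one sharp (concentrated flux), one blurred (spread flux)
sharp = [[0, 10], [10, 0]]
blurry = [[5, 5], [5, 5]]
print(lucky_stack([blurry, sharp], keep_fraction=0.5))
```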
Optical eigenmodes for illumination & imaging
NASA Astrophysics Data System (ADS)
Kosmeier, Sebastian
NASA Astrophysics Data System (ADS)
Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng
2018-01-01
As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide quantitative sample amplitude and phase distributions free of aberration. However, it requires field-of-view (FoV) scanning, which often relies on mechanical translation; this not only slows down measurement but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. To achieve highly accurate quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning controlled by on/off state coding of the DMD. Measurements on biological samples as well as a USAF resolution target demonstrate the high resolution of quantitative imaging with the proposed system. Given its fast and accurate imaging capability, the DMD-based PIE technique offers a potential solution for medical observation and measurement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abeykoon, A. M. Milinda; Hu, Hefei; Wu, Lijun
2015-01-30
Different protocols for calibrating electron pair distribution function (ePDF) measurements are explored and described for quantitative studies on nanomaterials. It is found that the most accurate approach to determine the camera length is to use a standard calibration sample of Au nanoparticles from the National Institute of Standards and Technology. Different protocols for data collection are also explored, as are possible operational errors, to find the best approaches for accurate data collection for quantitative ePDF studies.
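The camera-length determination rests on the small-angle electron diffraction relation R·d ≈ λ·L between ring radius, lattice spacing, wavelength, and camera length. A minimal sketch; the numerical values below are illustrative, not taken from the study:

```python
# Small-angle electron diffraction geometry: a ring of radius R on the
# detector from lattice planes of spacing d satisfies R * d ≈ λ * L,
# where λ is the electron wavelength and L the camera length.

def camera_length(ring_radius_mm, d_spacing_angstrom, wavelength_angstrom):
    """Camera length in mm from one calibration ring (R*d = λ*L)."""
    return ring_radius_mm * d_spacing_angstrom / wavelength_angstrom

# Illustrative numbers: Au(111) d = 2.355 Å, 200 kV electrons
# (λ ≈ 0.0251 Å), and a measured ring radius of 8.0 mm.
L = camera_length(8.0, 2.355, 0.0251)
```

In practice one would average over several indexed rings of the Au standard rather than rely on a single ring.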
Tang, Xiaoying; Luo, Yuan; Chen, Zhibin; Huang, Nianwei; Johnson, Hans J.; Paulsen, Jane S.; Miller, Michael I.
2018-01-01
In this paper, we present a fully-automated subcortical and ventricular shape generation pipeline that acts on structural magnetic resonance images (MRIs) of the human brain. Principally, the proposed pipeline consists of three steps: (1) automated structure segmentation using the diffeomorphic multi-atlas likelihood-fusion algorithm; (2) study-specific shape template creation based on the Delaunay triangulation; (3) deformation-based shape filtering using the large deformation diffeomorphic metric mapping for surfaces. The proposed pipeline is shown to provide high accuracy, sufficient smoothness, and accurate anatomical topology. Two datasets focused upon Huntington's disease (HD) were used for evaluating the performance of the proposed pipeline. The first of these contains a total of 16 MRI scans, each with a gold standard available, on which the proposed pipeline's outputs were observed to be highly accurate and smooth when compared with the gold standard. Visual examinations and outlier analyses on the second dataset, which contains a total of 1,445 MRI scans, revealed 100% success rates for the putamen, the thalamus, the globus pallidus, the amygdala, and the lateral ventricle in both hemispheres and rates no smaller than 97% for the bilateral hippocampus and caudate. Another independent dataset, consisting of 15 atlas images and 20 testing images, was also used to quantitatively evaluate the proposed pipeline, with high accuracy having been obtained. In short, the proposed pipeline is herein demonstrated to be effective, both quantitatively and qualitatively, using a large collection of MRI scans. PMID:29867332
Marine, Rachel; McCarren, Coleen; Vorrasane, Vansay; Nasko, Dan; Crowgey, Erin; Polson, Shawn W; Wommack, K Eric
2014-01-30
Shotgun metagenomics has become an important tool for investigating the ecology of microorganisms. Underlying these investigations is the assumption that metagenome sequence data accurately estimates the census of microbial populations. Multiple displacement amplification (MDA) of microbial community DNA is often used in cases where it is difficult to obtain enough DNA for sequencing; however, MDA can result in amplification biases that may impact subsequent estimates of population census from metagenome data. Some have posited that pooling replicate MDA reactions negates these biases and restores the accuracy of population analyses. This assumption has not been empirically tested. Using mock viral communities, we examined the influence of pooling on population-scale analyses. In pooled and single reaction MDA treatments, sequence coverage of viral populations was highly variable and coverage patterns across viral genomes were nearly identical, indicating that initial priming biases were reproducible and that pooling did not alleviate biases. In contrast, control unamplified sequence libraries showed relatively even coverage across phage genomes. MDA should be avoided for metagenomic investigations that require quantitative estimates of microbial taxa and gene functional groups. While MDA is an indispensable technique in applications such as single-cell genomics, amplification biases cannot be overcome by combining replicate MDA reactions. Alternative library preparation techniques should be utilized for quantitative microbial ecology studies utilizing metagenomic sequencing approaches.
Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L
2013-04-16
Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
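One simple form such a fit can take is sketched below, assuming a single cross-talk constant folded into a saturating ratio model. This is a simplified illustration of the idea, not the authors' exact one- or two-constant scheme, and the constants are synthetic:

```python
def fit_crosstalk_calibration(concs, ratios):
    """
    Fit R = x / (c + beta*x) by least squares on the linearised form
    1/R = c * (1/x) + beta.  c lumps the internal-standard response;
    beta is the analyte-to-internal-standard cross-talk constant.
    """
    xs = [1.0 / x for x in concs]
    ys = [1.0 / r for r in ratios]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(v * v for v in xs)
    sxy = sum(a * b for a, b in zip(xs, ys))
    c = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope -> c
    beta = (sy - c * sx) / n                        # intercept -> beta
    return c, beta

# Synthetic, noise-free standards generated from the model itself:
true_c, true_beta = 50.0, 0.02
concs = [10.0, 50.0, 100.0, 500.0, 1000.0]
ratios = [x / (true_c + true_beta * x) for x in concs]
c, beta = fit_crosstalk_calibration(concs, ratios)
```

The curvature that beta introduces at high analyte/internal-standard ratios is exactly the nonlinearity that a straight-line calibration would mistake for bias.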
Drivers for inappropriate fever management in children: a systematic review.
Kelly, M; McCarthy, S; O'Sullivan, R; Shiely, F; Larkin, P; Brenner, M; Sahm, L J
2016-08-01
Background Fever is one of the most common childhood symptoms and accounts for numerous consultations with healthcare practitioners. It causes much anxiety amongst parents, many of whom struggle to manage a feverish child and find it difficult to assess fever severity. Over- and under-dosing of antipyretics has been reported. Aim of the review The aim of this review was to synthesise qualitative and quantitative evidence on the knowledge, attitudes and beliefs of parents regarding fever and febrile illness in children. Method A systematic search was conducted in ten bibliographic databases from database inception to June 2014. Citation lists of studies and consultation with experts were used as secondary sources to identify further relevant studies. Titles and abstracts were screened for inclusion according to pre-defined inclusion and exclusion criteria. Quantitative studies using a questionnaire were analysed using narrative synthesis. Qualitative studies with a semi-structured interview or focus group methodology were analysed thematically. Results Of the 1565 studies screened for inclusion, the final review comprised 14 studies (three qualitative and 11 quantitative). Three categories emerged from the narrative synthesis of quantitative studies: (i) parental practices; (ii) knowledge; (iii) expectations and information seeking. A further three analytical themes emerged from the qualitative studies: (i) control; (ii) impact on family; (iii) experiences. Conclusion Our review identifies the multifaceted nature of the factors that influence how parents manage fever and febrile illness in children. A coherent approach to the management of fever and febrile illness needs to be implemented so that a consistent message is communicated to parents. Healthcare professionals, including pharmacists, regularly advise parents on fever management. Information given to parents needs to be timely, consistent and accurate so that inappropriate fever management is reduced or eliminated. This review is a necessary foundation for further research in this area.
NASA Astrophysics Data System (ADS)
Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D. D.
2005-11-01
Hole-doped perovskites such as La1-xCaxMnO3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around Mn ions plays a crucial role in determining these peculiar features. Therefore experimental techniques directly probing the local atomic structure, like x-ray absorption spectroscopy (XAS), have been widely exploited to deeply understand the physics of these compounds. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of the XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K -edge XANES of LaMnO3 and CaMnO3 compounds; they are the end compounds of the doped manganite series LaxCa1-xMnO3 . The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly achievable from EXAFS data analysis. This work demonstrates that combining the analysis of extended and near-edge regions of Mn K -edge XAS spectra could provide a complete and accurate description of Mn local atomic environment in these compounds.
Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G
2017-10-01
A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter.
Tan, Jean-Marie; Payne, Elizabeth J.; Lin, Lynlee L.; Sinnya, Sudipta; Raphael, Anthony P.; Lambie, Duncan; Frazer, Ian H.; Dinger, Marcel E.; Soyer, H. Peter
2017-01-01
Identification of appropriate reference genes (RGs) is critical to accurate data interpretation in quantitative real-time PCR (qPCR) experiments. In this study, we have utilised next generation RNA sequencing (RNA-seq) to analyse the transcriptome of a panel of non-melanoma skin cancer lesions, identifying genes that are consistently expressed across all samples. Genes encoding ribosomal proteins were amongst the most stable in this dataset. Validation of this RNA-seq data was examined using qPCR to confirm the suitability of a set of highly stable genes for use as qPCR RGs. These genes will provide a valuable resource for the normalisation of qPCR data for the analysis of non-melanoma skin cancer. PMID:28852586
Fang, Xin-sheng; Tan, Xiao-mei
2005-09-01
To purify salvianolic acids with macroreticular resin, determine their content, and analyse the product by HPLC. Salvianolic acids were prepared on macroreticular resin, and their content was determined by UV spectrophotometry at 281 nm with protocatechuic aldehyde as the reference compound. HPLC chromatograms obtained under different process conditions were compared on a Zorbax ODS column (4.6 mm x 250 mm, 5 microm) with a mobile phase of 1% acetic acid-water and methanol in varying proportions, with detection at 281 nm. The content of salvianolic acids was 53.8%, and the HPLC chromatogram indicated that the preparation method is sound. Content determination combined with HPLC chromatography allows more accurate quality control of salvianolic acids.
Accurate sparse-projection image reconstruction via nonlocal TV regularization.
Zhang, Yi; Zhang, Weihua; Zhou, Jiliu
2014-01-01
Sparse-projection image reconstruction is a useful approach to lower the radiation dose; however, the incompleteness of projection data will cause degeneration of imaging quality. As a typical compressive sensing method, total variation has obtained great attention on this problem. Suffering from the theoretical imperfection, total variation will produce blocky effect on smooth regions and blur edges. To overcome this problem, in this paper, we introduce the nonlocal total variation into sparse-projection image reconstruction and formulate the minimization problem with new nonlocal total variation norm. The qualitative and quantitative analyses of numerical as well as clinical results demonstrate the validity of the proposed method. Comparing to other existing methods, our method more efficiently suppresses artifacts caused by low-rank reconstruction and reserves structure information better.
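The total variation norm, and the patch-similarity weights that distinguish its nonlocal variant, can be sketched as follows. This is a minimal illustration of the two ingredients, not the paper's reconstruction algorithm:

```python
import math

def tv_norm(img):
    """Isotropic total variation: sum of forward-difference gradient magnitudes."""
    h, w = len(img), len(img[0])
    total = 0.0
    for i in range(h):
        for j in range(w):
            dx = img[i][j + 1] - img[i][j] if j + 1 < w else 0.0
            dy = img[i + 1][j] - img[i][j] if i + 1 < h else 0.0
            total += math.hypot(dx, dy)
    return total

def nonlocal_weight(patch_a, patch_b, h=0.1):
    """Patch-similarity weight used by nonlocal TV: w = exp(-||Pa - Pb||^2 / h^2)."""
    d2 = sum((a - b) ** 2 for a, b in zip(patch_a, patch_b))
    return math.exp(-d2 / (h * h))

flat = [[1.0] * 4 for _ in range(4)]              # smooth region: TV = 0
edge = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]   # one vertical edge
```

Plain TV penalises the local gradient, which flattens smooth regions into blocks; nonlocal TV instead compares patches across the image, so similar structures reinforce each other and edges are better preserved.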
Valdes, Pablo A; Bekelis, Kimon; Harris, Brent T; Wilson, Brian C; Leblond, Frederic; Kim, Anthony; Simmons, Nathan E; Erkmen, Kadir; Paulsen, Keith D; Roberts, David W
2014-03-01
The use of 5-aminolevulinic acid (ALA)-induced protoporphyrin IX (PpIX) fluorescence has shown promise as a surgical adjunct for maximizing the extent of surgical resection in gliomas. To date, the clinical utility of 5-ALA in meningiomas is not fully understood, with most descriptive studies using qualitative approaches to 5-ALA-PpIX. To assess the diagnostic performance of 5-ALA-PpIX fluorescence during surgical resection of meningioma. ALA was administered to 15 patients with meningioma undergoing PpIX fluorescence-guided surgery at our institution. At various points during the procedure, the surgeon performed qualitative, visual assessments of fluorescence by using the surgical microscope, followed by a quantitative fluorescence measurement by using an intraoperative probe. Specimens were collected at each point for subsequent neuropathological analysis. Clustered data analysis of variance was used to ascertain a difference between groups, and receiver operating characteristic analyses were performed to assess diagnostic capabilities. Red-pink fluorescence was observed in 80% (12/15) of patients, with visible fluorescence generally demonstrating a strong, homogenous character. Quantitative fluorescence measured diagnostically significant PpIX concentrations (cPpIx) in both visibly and nonvisibly fluorescent tissues, with significantly higher cPpIx in both visibly fluorescent (P < .001) and tumor tissue (P = .002). Receiver operating characteristic analyses also showed diagnostic accuracies up to 90% for differentiating tumor from normal dura. ALA-induced PpIX fluorescence guidance is a potential and promising adjunct in accurately detecting neoplastic tissue during meningioma resective surgery. These results suggest a broader reach for PpIX as a biomarker for meningiomas than was previously noted in the literature.
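The diagnostic accuracy figures come from receiver operating characteristic analysis. The AUC underlying such an analysis can be computed with a simple rank statistic; the cPpIx readings below are hypothetical, not data from the study:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: P(pos score > neg score), ties count half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical probe readings (arbitrary units) for tumor vs normal dura:
tumor = [1.2, 0.8, 2.5, 0.9, 1.7]
dura = [0.1, 0.3, 0.9, 0.2]
auc = roc_auc(tumor, dura)
```

An AUC near 1 means a single concentration threshold separates tumor from dura well, which is what makes the quantitative probe useful even in tissue with no visible fluorescence.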
Rigour in quantitative research.
Claydon, Leica Sarah
2015-07-22
This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.
NASA Astrophysics Data System (ADS)
Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.
2011-11-01
High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
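The core of the correction, finding the shift that maximizes the normalized cross-correlation between a simulated basis signal and the signal under analysis, can be sketched for integer lags as follows. This is a toy illustration, not the QM-QUEST implementation:

```python
def best_shift(reference, signal, max_lag):
    """Integer lag maximizing normalized cross-correlation between two signals."""
    def ncc(lag):
        pairs = [(reference[i], signal[i + lag])
                 for i in range(len(reference))
                 if 0 <= i + lag < len(signal)]
        xs, ys = zip(*pairs)
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs)
               * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den if den else 0.0
    return max(range(-max_lag, max_lag + 1), key=ncc)

# A basis peak and the same peak displaced by 3 points, mimicking a
# pH-induced chemical shift mismatch:
peak    = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 0]
shifted = [0, 0, 0, 0, 0, 1, 4, 9, 4, 1, 0, 0, 0]
lag = best_shift(peak, shifted, 5)
```

Applying the recovered lag to the basis signal restores the correct metabolite fingerprint before the quantitation algorithm fits concentrations.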
Cardiff, Robert D; Hubbard, Neil E; Engelberg, Jesse A; Munn, Robert J; Miller, Claramae H; Walls, Judith E; Chen, Jane Q; Velásquez-García, Héctor A; Galvez, Jose J; Bell, Katie J; Beckett, Laurel A; Li, Yue-Ju; Borowsky, Alexander D
2013-01-01
Quantitative Image Analysis (QIA) of digitized whole slide images for morphometric parameters and immunohistochemistry of breast cancer antigens was used to evaluate the technical reproducibility, biological variability, and intratumoral heterogeneity in three transplantable mouse mammary tumor models of human breast cancer. The relative preservation of structure and immunogenicity of the three mouse models and three human breast cancers was also compared when fixed with representatives of four distinct classes of fixatives. The three mouse mammary tumor cell models were an ER+/PR+ model (SSM2), a Her2+ model (NDL), and a triple negative model (MET1). The four breast cancer antigens were ER, PR, Her2, and Ki67. The fixatives included examples of (1) strong cross-linkers, (2) weak cross-linkers, (3) coagulants, and (4) combination fixatives. Each parameter was quantitatively analyzed using modified Aperio Technologies ImageScope algorithms. Careful pre-analytical adjustments to the algorithms were required to provide accurate results. The QIA permitted rigorous statistical analysis of results and grading by rank order. The analyses suggested excellent technical reproducibility and confirmed biological heterogeneity within each tumor. The strong cross-linker fixatives, such as formalin, consistently ranked higher than weak cross-linker, coagulant and combination fixatives in both the morphometric and immunohistochemical parameters. PMID:23399853
Kim, Jaai; Lim, Juntaek; Lee, Changsoo
2013-12-01
Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, which aids in better understandings of the complexity of wastewater microbial communities. Although qPCR can be used to provide more specific and accurate quantification than other molecular techniques, it does have limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is and will continue to be increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results.
Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.
2015-01-01
Glioblastoma multiforme (GBM) is the most aggressive malignant primary brain tumor, with a dismal mean survival even with the current standard of care. Although in vitro cell systems can provide mechanistic insight into the regulatory networks governing GBM cell proliferation and migration, clinical samples provide a more physiologically relevant view of oncogenic signaling networks. However, clinical samples are not widely available and may be embedded for histopathologic analysis. With the goal of accurately identifying activated signaling networks in GBM tumor samples, we investigated the impact of embedding in optimal cutting temperature (OCT) compound followed by flash freezing in LN2 vs immediate flash freezing (iFF) in LN2 on protein expression and phosphorylation-mediated signaling networks. Quantitative proteomic and phosphoproteomic analysis of 8 pairs of tumor specimens revealed minimal impact of the different sample processing strategies and highlighted the large interpatient heterogeneity present in these tumors. Correlation analyses of the differentially processed tumor sections identified activated signaling networks present in selected tumors and revealed the differential expression of transcription, translation, and degradation associated proteins. This study demonstrates the capability of quantitative mass spectrometry for identification of in vivo oncogenic signaling networks from human tumor specimens that were either OCT-embedded or immediately flash-frozen. PMID:24927040
Linkage disequilibrium interval mapping of quantitative trait loci.
Boitard, Simon; Abdallah, Jihad; de Rochambeau, Hubert; Cierco-Ayrolles, Christine; Mangin, Brigitte
2006-03-16
For many years gene mapping studies have been performed through linkage analyses based on pedigree data. Recently, linkage disequilibrium methods based on unrelated individuals have been advocated as powerful tools to refine estimates of gene location. Many strategies have been proposed to deal with simply inherited disease traits. However, locating quantitative trait loci is statistically more challenging and considerable research is needed to provide robust and computationally efficient methods. Under a three-locus Wright-Fisher model, we derived approximate expressions for the expected haplotype frequencies in a population. We considered haplotypes comprising one trait locus and two flanking markers. Using these theoretical expressions, we built a likelihood-maximization method, called HAPim, for estimating the location of a quantitative trait locus. For each postulated position, the method only requires information from the two flanking markers. Over a wide range of simulation scenarios it was found to be more accurate than a two-marker composite likelihood method. It also performed as well as identity by descent methods, whilst being valuable in a wider range of populations. Our method makes efficient use of marker information, and can be valuable for fine mapping purposes. Its performance is increased if multiallelic markers are available. Several improvements can be developed to account for more complex evolution scenarios or provide robust confidence intervals for the location estimates.
Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui
2016-12-09
Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and in the early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. To date, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can lead to relatively large uncertainty in the results. Owing to efficient FRET and a fluorescence-background-free readout, highly sensitive and accurate sensing was achieved, featuring a ratio change of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
NASA Astrophysics Data System (ADS)
Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui
2016-12-01
Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and in the early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. To date, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can lead to relatively large uncertainty in the results. Owing to efficient FRET and a fluorescence-background-free readout, highly sensitive and accurate sensing was achieved, featuring a ratio change of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
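The self-ratiometric readout described above amounts to inverting an intensity-ratio-vs-pH calibration. A minimal sketch, assuming a linear calibration: the slope of 3.56 per pH unit is quoted in the abstract, but the linear form, the anchor ratio `ratio_at_ph3`, and the function name are hypothetical, not values from the paper.

```python
def ph_from_ratio(ratio, slope=3.56, ratio_at_ph3=1.0):
    """Estimate intracellular pH from the 475 nm (pH-sensitive) over
    645 nm (reference) upconversion intensity ratio.

    Assumes a linear calibration anchored at pH 3.0; the slope is the
    per-pH-unit ratio change quoted in the abstract, while the anchor
    ratio is a hypothetical placeholder."""
    return 3.0 + (ratio - ratio_at_ph3) / slope
```

Because the 645 nm band is insensitive to pH, dividing by it cancels probe-concentration and excitation-power variations, which is what makes the readout self-ratiometric.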
Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn
2012-07-01
The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three replicate experiments were performed for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three levels of spiking concentration of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying an MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively.
The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.
2014-01-01
We present primary results from the Sequencing Quality Control (SEQC) project, coordinated by the United States Food and Drug Administration. Examining Illumina HiSeq, Life Technologies SOLiD and Roche 454 platforms at multiple laboratory sites using reference RNA samples with built-in controls, we assess RNA sequencing (RNA-seq) performance for junction discovery and differential expression profiling and compare it to microarray and quantitative PCR (qPCR) data using complementary metrics. At all sequencing depths, we discover unannotated exon-exon junctions, with >80% validated by qPCR. We find that measurements of relative expression are accurate and reproducible across sites and platforms if specific filters are used. In contrast, RNA-seq and microarrays do not provide accurate absolute measurements, and gene-specific biases are observed for these platforms as well as for qPCR. Measurement performance depends on the platform and data analysis pipeline, and variation is large for transcript-level profiling. The complete SEQC data sets, comprising >100 billion reads (10 Tb), provide unique resources for evaluating RNA-seq analyses for clinical and regulatory settings. PMID:25150838
Mass Defect Labeling of Cysteine for Improving Peptide Assignment in Shotgun Proteomic Analyses
Hernandez, Hilda; Niehauser, Sarah; Boltz, Stacey A.; Gawandi, Vijay; Phillips, Robert S.; Amster, I. Jonathan
2006-01-01
A method for improving the identification of peptides in a shotgun proteome analysis using accurate mass measurement has been developed. The improvement is based upon the derivatization of cysteine residues with a novel reagent, 2,4-dibromo-(2′-iodo)acetanilide. The derivatization changes the mass defect of cysteine-containing proteolytic peptides in a manner that increases their identification specificity. Peptide masses were measured using matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry. Reactions with protein standards show that the derivatization of cysteine is rapid and quantitative, and the data suggest that the derivatized peptides are more easily ionized or detected than unlabeled cysteine-containing peptides. The reagent was tested on a 15N-metabolically labeled proteome from M. maripaludis. Proteins were identified by their accurate mass values and from their nitrogen stoichiometry. A total of 47% of the labeled peptides are identified versus 27% for the unlabeled peptides. This procedure permits the identification of proteins from the M. maripaludis proteome that are not usually observed by the standard protocol and shows that better protein coverage is obtained with this methodology. PMID:16689545
Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge
2014-04-01
The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performances of nontelecentric DHM and telecentric DHM are evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the Drosophila melanogaster fly and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that the use of telecentric DHM provides a larger field of view for a given magnification and permits more accurate QPI measurements with fewer computational operations.
McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A
2016-08-01
Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times alone and in combination to provide estimates of stroke onset time in a rat model of permanent focal cerebral ischemia and map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4 T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate, with an uncertainty of ±25 min. At all time points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined from quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
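The onset-time estimation above can be caricatured as inverting a calibration line fitted to a relaxation-time differential that grows after occlusion. This is purely illustrative: the linear form and the coefficients below are assumptions, not the study's measured ΔT2 time course.

```python
import numpy as np

def fit_calibration(times_min, delta_t2):
    """Least-squares line delta_T2 = a*t + b over calibration time points."""
    a, b = np.polyfit(times_min, delta_t2, 1)
    return a, b

def estimate_time_since_onset(delta_t2_now, a, b):
    """Invert the calibration line for a newly measured differential."""
    return (delta_t2_now - b) / a

# Synthetic calibration data: a hypothetical linear growth after occlusion
t = np.array([60.0, 120.0, 180.0, 240.0])   # minutes post-occlusion
d = 0.05 * t + 1.0                           # invented ΔT2 values (ms)
a, b = fit_calibration(t, d)
```

In the study itself, the combined T1/T2 overlap volume, not a single differential, gave the tightest (±25 min) estimate; the sketch only shows the inversion idea.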
Wang, Yi-Shan; Potts, Jonathan R
2017-03-07
Recent advances in animal tracking have allowed us to uncover the drivers of movement in unprecedented detail. This has enabled modellers to construct ever more realistic models of animal movement, which aid in uncovering detailed patterns of space use in animal populations. Partial differential equations (PDEs) provide a popular tool for mathematically analysing such models. However, their construction often relies on simplifying assumptions which may greatly affect the model outcomes. Here, we analyse the effect of various PDE approximations on the analysis of some simple movement models, including a biased random walk, central-place foraging processes and movement in heterogeneous landscapes. Perhaps the most commonly-used PDE method dates back to a seminal paper of Patlak from 1953. However, our results show that this can be a very poor approximation in even quite simple models. On the other hand, more recent methods, based on transport equation formalisms, can provide more accurate results, as long as the kernel describing the animal's movement is sufficiently smooth. When the movement kernel is not smooth, we show that both the older and newer methods can lead to quantitatively misleading results. Our detailed analysis will aid future researchers in the appropriate choice of PDE approximation for analysing models of animal movement. Copyright © 2017 Elsevier Ltd. All rights reserved.
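The gap between an individual-based movement model and its PDE approximation can be seen in even the simplest case. The sketch below simulates a 1-D biased random walk and checks the drift and diffusion coefficients of the corresponding advection-diffusion (Patlak-style) limit; the parameter values are illustrative, not from the paper.

```python
import numpy as np

def biased_random_walk(n_walkers, n_steps, p_right, step, seed=0):
    """Final positions of independent 1-D walkers taking +/-step moves,
    stepping right with probability p_right."""
    rng = np.random.default_rng(seed)
    moves = np.where(rng.random((n_walkers, n_steps)) < p_right, step, -step)
    return moves.sum(axis=1)

# Advection-diffusion approximation of the same walk (per unit time step):
p, step, T = 0.6, 1.0, 400
v = (2 * p - 1) * step                       # drift: mean displacement/step
D = (1 - (2 * p - 1) ** 2) * step ** 2 / 2   # diffusion: half variance/step
x = biased_random_walk(5000, T, p, step)
```

For a smooth kernel like this one the PDE moments match the simulation well; the paper's point is that for non-smooth kernels such agreement can fail badly.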
NASA Astrophysics Data System (ADS)
Auricchio, F.; Conti, M.; Lefieux, A.; Morganti, S.; Reali, A.; Sardanelli, F.; Secchi, F.; Trimarchi, S.; Veneziani, A.
2014-10-01
The purpose of this study is to quantitatively evaluate the impact of endovascular repair on aortic hemodynamics. The study addresses the assessment of post-operative hemodynamic conditions of a real clinical case through patient-specific analysis, combining accurate medical image analysis and advanced computational fluid-dynamics (CFD). Although the main clinical concern was initially directed to the endoluminal protrusion of the prosthesis, the CFD simulations have demonstrated that there are two other important areas where the local hemodynamics is impaired and a disturbed blood flow is present: the first one is the ostium of the subclavian artery, which is partially closed by the graft; the second one is the stenosis of the distal thoracic aorta. Besides the clinical relevance of these specific findings, this study highlights how CFD analyses allow us to observe important flow effects resulting from the specific features of patient vessel geometries. Consequently, our results demonstrate the potential impact of computational biomechanics not only on the basic knowledge of physiopathology, but also on clinical practice, thanks to a quantitative extraction of knowledge made possible by merging medical data and mathematical models.
NASA Astrophysics Data System (ADS)
Noack, C.; Jain, J.; Hakala, A.; Schroeder, K.; Dzombak, D. A.; Karamalidis, A.
2013-12-01
Rare earth elements (REE) - encompassing the naturally occurring lanthanides, yttrium, and scandium - are potential tracers for subsurface groundwater-brine flows and geochemical processes. Application of these elements as naturally occurring tracers during shale gas development is reliant on accurate quantitation of trace metals in hypersaline brines. We have modified and validated a liquid-liquid technique for extraction and pre-concentration of REE from saline produced waters from shale gas extraction wells with quantitative analysis by ICP-MS. This method was used to analyze time-series samples of Marcellus shale flowback and produced waters. Additionally, the total REE content of core samples of various strata throughout the Appalachian Basin were determined using HF/HNO3 digestion and ICP-MS analysis. A primary goal of the study is to elucidate systematic geochemical variations as a function of location or shale characteristics. Statistical testing will be performed to study temporal variability of inter-element relationships and explore associations between REE abundance and major solution chemistry. The results of these analyses and discussion of their significance will be presented.
Busman, Mark; Liu, Jihong; Zhong, Hongjian; Bobell, John R; Maragos, Chris M
2014-01-01
Direct analysis in real time (DART) ionisation coupled to a high-resolution mass spectrometer (MS) was used for screening of aflatoxins from a variety of surfaces and the rapid quantitative analysis of a common form of aflatoxin, AFB1, extracted from corn. The sample preparation procedure and instrument parameter settings were optimised to obtain sensitive and accurate determination of aflatoxin AFB1. Corn extracts in 84:16 acetonitrile-water were analysed by DART-MS. The lowest calibration level (LCL) for aflatoxin AFB1 was 4 μg kg⁻¹. Quantitative analysis was performed with the use of matrix-matched standards employing the ¹³C-labelled internal standard for AFB1. DART-MS of spiked corn extracts gave a linear response in the range 4-1000 μg kg⁻¹. Good recoveries (94-110%) and repeatabilities (RSD = 0.7-6.9%) were obtained at spiking levels of 20 and 100 μg kg⁻¹ with the use of an isotope dilution technique. Trueness of data obtained for AFB1 in maize by DART-MS was demonstrated by analysis of corn certified reference materials.
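The isotope dilution quantitation used above reduces, per sample, to scaling the AFB1/¹³C-AFB1 signal ratio by the spiked internal-standard concentration. A minimal sketch with hypothetical names; in practice a response factor is additionally calibrated against matrix-matched standards:

```python
def isotope_dilution_conc(signal_analyte, signal_istd, istd_conc_ug_kg,
                          response_factor=1.0):
    """Analyte concentration from the signal ratio against a co-ionizing
    isotopically labelled internal standard spiked at a known concentration.

    The unit response factor assumes identical ionization of analyte and
    labelled standard, which is the core premise of isotope dilution."""
    return response_factor * (signal_analyte / signal_istd) * istd_conc_ug_kg
```

Because the ¹³C standard experiences the same matrix suppression as the analyte, the ratio largely cancels the matrix effect that otherwise plagues ambient-ionization quantitation.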
Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.
Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D
2015-01-01
Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
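As a sketch of the idea, local blur metrics can be computed per image tile and thresholded to label regions. The variance-of-the-Laplacian metric below is a common generic choice standing in for the paper's unspecified local metrics; the tile size and threshold are illustrative.

```python
import numpy as np

def local_blur_metric(region):
    """Variance of a 5-point discrete Laplacian: low values indicate blur."""
    r = region.astype(float)
    lap = (r[:-2, 1:-1] + r[2:, 1:-1] + r[1:-1, :-2] + r[1:-1, 2:]
           - 4.0 * r[1:-1, 1:-1])
    return float(np.var(lap))

def classify_regions(image, tile=64, threshold=0.1):
    """Label each non-overlapping tile of a 2-D grayscale image."""
    labels = {}
    h, w = image.shape
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            m = local_blur_metric(image[y:y + tile, x:x + tile])
            labels[(y, x)] = "sharp" if m >= threshold else "blurry"
    return labels

def box_blur(image, passes=5):
    """Crude smoothing, used here only to simulate an out-of-focus region."""
    out = image.astype(float)
    for _ in range(passes):
        p = np.pad(out, 1, mode="edge")
        out = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
               + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0
    return out

rng = np.random.default_rng(0)
sharp = rng.random((128, 128))   # high-frequency texture stands in for tissue
blurry = box_blur(sharp)
```

Collecting the per-tile metric values into a distribution, as the paper does, then lets a classifier separate sharp from blurry WSI regions rather than relying on a single global threshold.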
2010-01-01
High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haudebourg, Raphael; Fichet, Pascal; Goutelard, Florence
The detection (location and quantification) of possible contamination of nuclear facilities to be dismantled with short-range particle emitters (³H, other low-energy β emitters, α emitters) remains a tedious and expensive task. Indeed, usual remote counters show too low a sensitivity to these non-penetrating radiations, while conventional wipe tests are irrelevant for fixed radioactivity evaluation. The only method to accurately measure activity levels consists in sampling and running advanced laboratory analyses (spectroscopy, liquid scintillation counting, pyrolysis...). Such measurements generally induce sample preparation, waste production (destructive analyses, solvents), nuclear material transportation, long durations, and significant labor mobilization. Therefore, the need to limit their number and cost easily conflicts with the necessity to perform a dense screening for sampling (to maximize the representativeness of the samples) in installations of thousands of square meters (floors, walls, ceilings), plus furniture, pipes, and other wastes. To overcome this contradiction, Digital Autoradiography (DA) was re-routed from biomolecular research to radiological mapping of nuclear installations under dismantling and to waste and sample analysis. After in-situ exposure to the possibly-contaminated areas under investigation, commercial reusable radiosensitive phosphor screens (of a few hundred cm²) were scanned in the proper laboratory device, and sharp quantitative images of the radioactivity could be obtained. The implementation of geostatistical tools in the data processing software enabled the exhaustive characterization of concrete floors at a rate of 2 weeks per 100 m², at the lowest cost. Various samples such as drilled cores, or tank and wood pieces, were also successfully evaluated with this method, for decisive results. 
Thanks to the accurate location of potential contamination spots, this approach ensures relevant and representative sampling for further laboratory analyses and should be included among the common tools used in dismantling. (authors)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-05
.... This draft document describes the quantitative analyses that are being conducted as part of the review... primary (health-based) CO NAAQS, the Agency is conducting qualitative and quantitative assessments... results, observations, and related uncertainties associated with the quantitative analyses performed. An...
Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim
2009-01-01
Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...
Using GPS To Teach More Than Accurate Positions.
ERIC Educational Resources Information Center
Johnson, Marie C.; Guth, Peter L.
2002-01-01
Undergraduate science majors need practice in critical thinking, quantitative analysis, and judging whether their calculated answers are physically reasonable. Develops exercises using handheld Global Positioning System (GPS) receivers. Reinforces students' abilities to think quantitatively, make realistic "back of the envelope"…
Wolf, Louis; Scheffer-de Gooyert, Jolanda M.; Monedero, Ignacio; Torroja, Laura; Coromina, Lluis; van der Laak, Jeroen A. W. M.; Schenck, Annette
2016-01-01
The morphology of synapses is of central interest in neuroscience because of the intimate relation with synaptic efficacy. Two decades of gene manipulation studies in different animal models have revealed a repertoire of molecules that contribute to synapse development. However, since such studies often assessed only one, or at best a few, morphological features at a given synapse, it remained unaddressed how different structural aspects relate to one another. Furthermore, such focused and sometimes only qualitative approaches likely left many of the more subtle players unnoticed. Here, we present the image analysis algorithm ‘Drosophila_NMJ_Morphometrics’, available as a Fiji-compatible macro, for quantitative, accurate and objective synapse morphometry of the Drosophila larval neuromuscular junction (NMJ), a well-established glutamatergic model synapse. We developed this methodology for semi-automated multiparametric analyses of NMJ terminals immunolabeled for the commonly used markers Dlg1 and Brp and showed that it also works for Hrp, Csp and Syt. We demonstrate that gender, genetic background and identity of abdominal body segment consistently and significantly contribute to variability in our data, suggesting that controlling for these parameters is important to minimize variability in quantitative analyses. Correlation and principal component analyses (PCA) were performed to investigate which morphometric parameters are inter-dependent and which ones are regulated rather independently. Based on nine acquired parameters, we identified five morphometric groups: NMJ size, geometry, muscle size, number of NMJ islands and number of active zones. Based on our finding that the parameters of the first two principal components hardly correlated with each other, we suggest that different molecular processes underlie these two morphometric groups. Our study sets the stage for systems morphometry approaches at the well-studied Drosophila NMJ. PMID:26998933
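The correlation/PCA step described above can be sketched with synthetic morphometric data. The groupings below (two correlated "size-like" parameters plus one independent parameter) are invented for illustration; they merely mimic the finding that some parameter groups vary independently of others.

```python
import numpy as np

def explained_variance_fractions(X):
    """PCA via SVD of the column-centred data matrix; returns the fraction
    of total variance captured by each principal component."""
    Xc = X - X.mean(axis=0)
    s = np.linalg.svd(Xc, compute_uv=False)
    return s ** 2 / np.sum(s ** 2)

rng = np.random.default_rng(1)
n = 200
size = rng.normal(size=n)                    # latent "NMJ size" driver
params = np.column_stack([
    size,                                    # e.g. terminal area
    0.9 * size + 0.1 * rng.normal(size=n),   # strongly correlated length
    rng.normal(size=n),                      # independent parameter
])
ev = explained_variance_fractions(params)
```

When two parameters share a driver, the first component absorbs most of their joint variance, while an uncorrelated parameter loads onto its own component, which is the logic behind grouping the nine NMJ parameters into five morphometric groups.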
The great importance of normalization of LC-MS data for highly-accurate non-targeted metabolomics.
Mizuno, Hajime; Ueda, Kazuki; Kobayashi, Yuta; Tsuyama, Naohiro; Todoroki, Kenichiro; Min, Jun Zhe; Toyo'oka, Toshimasa
2017-01-01
The non-targeted metabolomics analysis of biological samples is very important for understanding biological functions and diseases. LC combined with electrospray ionization-based MS has been a powerful tool and is widely used for metabolomic analyses. However, the ionization efficiency of electrospray ionization fluctuates for various unexpected reasons, such as matrix effects and intraday variations in instrument performance. To remove these fluctuations, normalization methods have been developed. Such techniques include increasing the sensitivity, separating co-eluting components and normalizing the ionization efficiencies. Normalization techniques allow the ionization efficiencies of the detected metabolite peaks to be corrected simultaneously, making quantitative non-targeted metabolomics achievable. In this review paper, we focus on these normalization methods for non-targeted metabolomics by LC-MS. Copyright © 2016 John Wiley & Sons, Ltd.
Contact thermal shock test of ceramics
NASA Technical Reports Server (NTRS)
Rogers, W. P.; Emery, A. F.
1992-01-01
A novel quantitative thermal shock test of ceramics is described. The technique employs contact between a metal cooling rod and a hot disk-shaped specimen. In contrast with traditional techniques, the well-defined thermal boundary condition allows for accurate analyses of heat transfer, stress, and fracture. Uniform equibiaxial tensile stresses are induced in the center of the test specimen. Transient specimen temperature and acoustic emission are monitored continuously during the thermal stress cycle. The technique is demonstrated with soda-lime glass specimens. Experimental results are compared with theoretical predictions based on a finite-element method thermal stress analysis combined with a statistical model of fracture. Material strength parameters are determined using concentric ring flexure tests. Good agreement is found between experimental results and theoretical predictions of failure probability as a function of time and initial specimen temperature.
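A standard statistical fracture model for glass and ceramics of the kind used above is the two-parameter Weibull distribution. The sketch below is generic, with illustrative parameter values rather than those fitted from the concentric-ring flexure tests.

```python
import math

def weibull_failure_probability(sigma, sigma_0, m):
    """Two-parameter Weibull failure probability under uniform equibiaxial
    tensile stress sigma:

        P_f = 1 - exp(-(sigma / sigma_0) ** m)

    sigma_0 is the characteristic strength (P_f = 1 - 1/e at sigma_0)
    and m is the Weibull modulus governing strength scatter."""
    return 1.0 - math.exp(-((sigma / sigma_0) ** m))
```

Feeding the time-dependent thermal stress from the finite-element analysis into such a model yields failure probability as a function of time and initial temperature, as compared against experiment in the abstract.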
Recent advances in targeted endoscopic imaging: Early detection of gastrointestinal neoplasms
Kwon, Yong-Soo; Cho, Young-Seok; Yoon, Tae-Jong; Kim, Ho-Shik; Choi, Myung-Gyu
2012-01-01
Molecular imaging has emerged as a new discipline in gastrointestinal endoscopy. This technology encompasses modalities that can visualize disease-specific morphological or functional tissue changes based on the molecular signature of individual cells. Molecular imaging has several advantages including minimal damage to tissues, repetitive visualization, and utility for conducting quantitative analyses. Advancements in basic science coupled with endoscopy have made early detection of gastrointestinal cancer possible. Molecular imaging during gastrointestinal endoscopy requires the development of safe biomarkers and exogenous probes to detect molecular changes in cells with high specificity and a high signal-to-background ratio. Additionally, a high-resolution endoscope with an accurate wide-field viewing capability must be developed. Targeted endoscopic imaging is expected to improve early diagnosis and individual therapy of gastrointestinal cancer. PMID:22442742
Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo
2016-01-01
Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function by the thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent of thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were the highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to the thyroid, such as salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas %thyroid uptake, SUVmean, and SUVmax from the SPECT/CT were associated with the functional status of the thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
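The SPECT/CT quantities above derive from simple normalizations of the reconstructed activity: %uptake is organ activity over injected activity, and body-weight SUV is tissue concentration over injected dose per gram. A hedged sketch; the unit conventions, function names, example numbers, and the usual 1 g/mL tissue-density assumption are ours, not the paper's.

```python
def percent_uptake(organ_mbq, injected_mbq):
    """Percent of the injected activity accumulated in the organ VOI."""
    return 100.0 * organ_mbq / injected_mbq

def suv(tissue_kbq_per_ml, injected_mbq, body_weight_kg):
    """Body-weight-normalized SUV: tissue activity concentration divided by
    injected activity per gram of body weight (1 mL of tissue ~ 1 g)."""
    conc_bq_per_g = tissue_kbq_per_ml * 1000.0                    # kBq/mL -> Bq/g
    dose_bq_per_g = injected_mbq * 1e6 / (body_weight_kg * 1000.0)  # Bq/g
    return conc_bq_per_g / dose_bq_per_g
```

A VOI drawn around the thyroid alone excludes salivary-gland and saliva activity, which is why the SPECT/CT figure avoids the overestimation seen with the planar TUS probe.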
Quantitative interpretation of the magnetic susceptibility frequency dependence
NASA Astrophysics Data System (ADS)
Ustra, Andrea; Mendonça, Carlos A.; Leite, Aruã; Jovane, Luigi; Trindade, Ricardo I. F.
2018-05-01
Low-field mass-specific magnetic susceptibility (MS) measurements using multifrequency alternating fields are commonly used to evaluate the concentration of ferrimagnetic particles in the transition from superparamagnetic (SP) to stable single domain (SSD). In classical palaeomagnetic analyses, this measurement serves as a preliminary assessment of rock samples, providing rapid, non-destructive, economical and easy information on magnetic properties. The SP-SSD transition is relevant in environmental studies because it has been associated with several geological and biogeochemical processes affecting magnetic mineralogy. MS is a complex function of mineral type and grain-size distribution, as well as measuring parameters such as external field magnitude and frequency. In this work, we propose a new technique to obtain quantitative information on grain-size variations of magnetic particles in the SP-SSD transition by inverting frequency-dependent susceptibility. We introduce a descriptive parameter named the 'limiting frequency effect' that provides an accurate estimation of MS loss with frequency. Numerical simulations show the method's capability to fit data and recover model parameters in many practical situations. Real-data applications with magnetite nanoparticles and core samples from sediments of the Poggio le Guaine section of the Umbria-Marche Basin (Italy) provide additional information not clearly recognized when interpreting the cruder MS data. Caution is needed when interpreting frequency dependence in terms of single relaxation processes, which are not universally applicable and depend upon the nature of the magnetic mineral in the material. Nevertheless, the proposed technique is a promising tool for SP-SSD content analyses.
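The conventional starting point for such analyses is the percentage frequency dependence of susceptibility between a low- and a high-frequency measurement. The paper's 'limiting frequency effect' is a different, newly introduced parameter, so the sketch below shows only the classical quantity that such inversions refine.

```python
def chi_fd_percent(chi_lf, chi_hf):
    """Percentage frequency dependence of mass-specific magnetic
    susceptibility:

        chi_fd% = 100 * (chi_lf - chi_hf) / chi_lf

    chi_lf and chi_hf are susceptibilities measured at the low and high
    AC field frequencies, in the same (mass-specific) units."""
    return 100.0 * (chi_lf - chi_hf) / chi_lf
```

High chi_fd% values indicate a significant population of grains near the SP-SSD boundary, whose moments cease to track the field as the frequency rises.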
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardisty, M.; Gordon, L.; Agarwal, P.
2007-08-15
Quantitative assessment of metastatic disease in bone is often considered immeasurable and, as such, patients with skeletal metastases are often excluded from clinical trials. In order to effectively quantify the impact of metastatic tumor involvement in the spine, accurate segmentation of the vertebra is required. Manual segmentation can be accurate but involves extensive and time-consuming user interaction. Potential solutions to automating segmentation of metastatically involved vertebrae are demons deformable image registration and level set methods. The purpose of this study was to develop a semiautomated method to accurately segment tumor-bearing vertebrae using the aforementioned techniques. By maintaining the morphology of an atlas, the demons-level set composite algorithm was able to accurately differentiate between trans-cortical tumors and surrounding soft tissue of identical intensity. The algorithm successfully segmented both the vertebral body and trabecular centrum of tumor-involved and healthy vertebrae. This work validates our approach as equivalent in accuracy to an experienced user.
NASA Astrophysics Data System (ADS)
Armstrong, J. T.; McSwiggen, P.; Nielsen, C.
2013-12-01
Quantitative electron microprobe analysis has revolutionized two-dimensional elemental analysis of Earth materials at the micrometer-scale. Newly available commercial field emission (FE-) source instruments represent significant technological advances in quantitative measurement with high spatial resolution at sub-micrometer scale - helping to bridge the gap between conventional microprobe and AEM analyses. Their performance specifications suggest the ability to extend routine quantitative analyses from ~3-5 micrometer diameter areas down to 1-2 micrometer diameter at beam energies of 15 keV; and, with care, down to 200-500 nm diameter at reduced beam energies. In order to determine whether the level of performance suggested by the specifications is realistic, we spent a week performing analyses at the newly installed JEOL JXA-8530F field emission microprobe at Arizona State University, using a series of samples that are currently being studied in various projects at CIW. These samples included: 1) a high-pressure experiment run product containing intergrowths of sub-micrometer grains of metal, sulfide, Fe-Mg-perovskite, and ferropericlase; 2) a thin section of the Ivankinsky basalt, part of the Siberian flood basalt sequence, containing complex sub-micrometer intergrowths of magnetite, titanomagnetite, ilmenite, titanite and rutile; 3) a polished section of the Giroux pallasite, being studied for element partitioning, that we used as an analogue to test the capabilities for zonation and diffusion determination; and 4) a polished section of the Semarkona ordinary chondrite containing chondrules composed of highly zoned and rimmed olivines and pyroxenes in a complex mesostasis of sub-micrometer pyroxenes and glass. The results of these analyses, presented here, confirmed our optimism regarding the new analytical capabilities of a field emission microprobe. 
We were able, at reduced voltages, to accurately analyze the major and minor element composition of intergrowth and rimming phases as small as 200 nm without artifact contribution from the surrounding phases. We were able to determine the compositional gradients at kamacite-taenite boundaries in the pallasite specimen with a resolution of ~180 nm, enabling much higher precision and accuracy determination of the meteorite's cooling rate than previously possible with microprobe measurements. We were able to determine the composition and zonation of phases in the experimental run product, none of which were large enough to be analyzable in a conventional electron microprobe.
Application of Mixed-Methods Approaches to Higher Education and Intersectional Analyses
ERIC Educational Resources Information Center
Griffin, Kimberly A.; Museus, Samuel D.
2011-01-01
In this article, the authors discuss the utility of combining quantitative and qualitative methods in conducting intersectional analyses. First, they discuss some of the paradigmatic underpinnings of qualitative and quantitative research, and how these methods can be used in intersectional analyses. They then consider how paradigmatic pragmatism…
Role Of Social Networks In Resilience Of Naval Recruits: A Quantitative Analysis
2016-06-01
Comprises 1,297 total surveys from a total of eight divisions of recruits at two different time periods. Quantitative analyses using surveys and network data examine the effects... (Thesis by Andrea M. Watling, June 2016; Thesis Advisor: Edward H. Powley.)
Spibey, C A; Jackson, P; Herick, K
2001-03-01
In recent years the use of fluorescent dyes in biological applications has dramatically increased. The continual improvement in the capabilities of these fluorescent dyes demands increasingly sensitive detection systems that provide accurate quantitation over a wide linear dynamic range. In the field of proteomics, the detection, quantitation and identification of very low abundance proteins are of extreme importance in understanding cellular processes. Therefore, the instrumentation used to acquire an image of such samples, for spot picking and identification by mass spectrometry, must be sensitive enough not only to maximise the sensitivity and dynamic range of the staining dyes but also, as importantly, to adapt to the ever-changing portfolio of fluorescent dyes as they become available. Just as the available fluorescent probes are improving and evolving, so are users' application requirements. Therefore, the instrumentation chosen must be flexible enough to address and adapt to those changing needs. As a result, a highly competitive market for the supply and production of such dyes, and for the instrumentation for their detection and quantitation, has emerged. The instrumentation currently available is based on either laser/photomultiplier tube (PMT) scanning or lamp/charge-coupled device (CCD) based mechanisms. This review briefly discusses the advantages and disadvantages of both system types for fluorescence imaging, gives a technical overview of CCD technology and describes in detail a unique xenon arc lamp CCD based instrument, from PerkinElmer Life Sciences. The Wallac-1442 ARTHUR is unique in its ability both to scan large areas at high resolution and to give accurate selectable excitation over the whole of the UV/visible range. It operates by filtering both the excitation and emission wavelengths, providing optimal and accurate measurement and quantitation of virtually any available dye and allowing excellent spectral resolution between different fluorophores.
This flexibility and excitation accuracy is key to multicolour applications and future adaptation of the instrument to address the application requirements and newly emerging dyes.
Quantitative Analysis of Radar Returns from Insects
NASA Technical Reports Server (NTRS)
Riley, J. R.
1979-01-01
When a number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution using a rotating polarization radar to enhance the wingbeat frequency method of identification are presented.
Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D
2016-06-01
A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both, while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15 %). This technique demonstrated the importance of the location of the histological section, showing that an offset of up to 30 % can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.
Adhikari, Puspa L; Wong, Roberto L; Overton, Edward B
2017-10-01
Accurate characterization of petroleum hydrocarbons in complex and weathered oil residues is analytically challenging. This is primarily due to the chemical compositional complexity of both the oil residues and environmental matrices, and the lack of instrumental selectivity due to co-elution of interferences with the target analytes. To overcome these analytical selectivity issues, we used enhanced resolution gas chromatography coupled with triple quadrupole mass spectrometry in Multiple Reaction Monitoring (MRM) mode (GC/MS/MS-MRM) to eliminate interferences within the ion chromatograms of target analytes found in environmental samples. This new GC/MS/MS-MRM method was developed and used for forensic fingerprinting of deep-water and marsh sediment samples containing oily residues from the Deepwater Horizon oil spill. The results showed that the GC/MS/MS-MRM method increases selectivity, eliminates interferences, and provides more accurate quantitation and characterization of trace levels of alkyl-PAHs and biomarker compounds from weathered oil residues in complex sample matrices. The higher selectivity of the new method, even at low detection limits, provides greater insight into isomer and homolog compositional patterns and the extent of oil weathering under various environmental conditions. The method also provides flat chromatographic baselines for accurate and unambiguous calculation of petroleum forensic biomarker compound ratios. Thus, this GC/MS/MS-MRM method can be a reliable analytical strategy for more accurate and selective trace level analyses in petroleum forensic studies, and for tracking the continuous weathering of oil residues. Copyright © 2017 Elsevier Ltd. All rights reserved.
Changes in body composition of neonatal piglets during growth
USDA-ARS?s Scientific Manuscript database
During studies of neonatal piglet growth it is important to be able to accurately assess changes in body composition. Previous studies have demonstrated that quantitative magnetic resonance (QMR) provides precise and accurate measurements of total body fat mass, lean mass and total body water in non...
Danish, Shabbar F; Baltuch, Gordon H; Jaggi, Jurg L; Wong, Stephen
2008-04-01
Microelectrode recording during deep brain stimulation surgery is a useful adjunct for subthalamic nucleus (STN) localization. We hypothesize that information in the nonspike background activity can help identify STN boundaries. We present results from a novel quantitative analysis that accomplishes this goal. Thirteen consecutive microelectrode recordings were retrospectively analyzed. Spikes were removed from the recordings with an automated algorithm. The remaining "despiked" signals were converted via root mean square amplitude and curve length calculations into "feature profile" time series. Subthalamic nucleus boundaries determined by inspection, based on sustained deviations from baseline for each feature profile, were compared against those determined intraoperatively by the clinical neurophysiologist. Feature profile activity within STN exhibited a sustained rise in 10 of 13 tracks (77%). The sensitivity of STN entry was 60% and 90% for curve length and root mean square amplitude, respectively, when agreement within 0.5 mm of the neurophysiologist's prediction was used. Sensitivities were 70% and 100% for 1 mm accuracy. Exit point sensitivities were 80% and 90% for both features within 0.5 mm and 1.0 mm, respectively. Reproducible activity patterns in deep brain stimulation microelectrode recordings can allow accurate identification of STN boundaries. Quantitative analyses of this type may provide useful adjunctive information for electrode placement in deep brain stimulation surgery.
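The feature extraction described above (root mean square amplitude and curve length computed from the despiked signal) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function name, sampling rate, and window length are assumptions.

```python
import numpy as np

def feature_profiles(signal, fs, win_s=0.5):
    """Convert a despiked microelectrode recording into feature-profile
    time series over non-overlapping windows (window length assumed)."""
    n = int(win_s * fs)                          # samples per window
    n_win = len(signal) // n
    rms, clen = [], []
    for i in range(n_win):
        w = signal[i * n:(i + 1) * n]
        rms.append(np.sqrt(np.mean(w ** 2)))     # root mean square amplitude
        clen.append(np.sum(np.abs(np.diff(w))))  # curve length (total variation)
    return np.array(rms), np.array(clen)
```

A sustained rise of either profile above its baseline would then mark a candidate STN entry point, in the spirit of the boundary criterion described above.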
Quantitative Graphics in Newspapers.
ERIC Educational Resources Information Center
Tankard, James W., Jr.
The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…
Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution
Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...
Quantitation of spatially-localized proteins in tissue samples using MALDI-MRM imaging.
Clemis, Elizabeth J; Smith, Derek S; Camenzind, Alexander G; Danell, Ryan M; Parker, Carol E; Borchers, Christoph H
2012-04-17
MALDI imaging allows the creation of a "molecular image" of a tissue slice. This image is reconstructed from the ion abundances in spectra obtained while rastering the laser over the tissue. These images can then be correlated with tissue histology to detect potential biomarkers of, for example, aberrant cell types. MALDI, however, is known to have problems with ion suppression, making it difficult to correlate measured ion abundance with concentration. It would be advantageous to have a method which could provide more accurate protein concentration measurements, particularly for screening applications or for precise comparisons between samples. In this paper, we report the development of a novel MALDI imaging method for the localization and accurate quantitation of proteins in tissues. This method involves optimization of in situ tryptic digestion, followed by reproducible and uniform deposition of an isotopically labeled standard peptide from a target protein onto the tissue, using an aerosol-generating device. Data is acquired by MALDI multiple reaction monitoring (MRM) mass spectrometry (MS), and accurate peptide quantitation is determined from the ratio of MRM transitions for the endogenous unlabeled proteolytic peptides to the corresponding transitions from the applied isotopically labeled standard peptides. In a parallel experiment, the quantity of the labeled peptide applied to the tissue was determined using a standard curve generated from MALDI time-of-flight (TOF) MS data. This external calibration curve was then used to determine the quantity of endogenous peptide in a given area. All standard curves generated by this method had coefficients of determination greater than 0.97. These proof-of-concept experiments using MALDI MRM-based imaging show the feasibility of precise and accurate quantitation of tissue protein concentrations over 2 orders of magnitude, while maintaining the spatial localization information for the proteins.
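The ratio-based quantitation step can be illustrated with a minimal sketch: an external calibration curve converts the labeled standard's response into an absolute amount, and the light/heavy MRM ratio then scales that amount to the endogenous peptide. All numbers and variable names below are hypothetical, for illustration only.

```python
import numpy as np

# External calibration: labeled-standard amount vs. MALDI-TOF response
# (all values hypothetical).
cal_fmol = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # labeled peptide standards
cal_resp = np.array([0.9, 2.1, 5.2, 9.8, 20.3])    # TOF-MS responses (arb. units)
slope, intercept = np.polyfit(cal_resp, cal_fmol, 1)

# Amount of labeled standard deposited on one tissue area, from its response
heavy_fmol = slope * 7.5 + intercept

# Endogenous peptide from the light/heavy MRM transition ratio
light_to_heavy = 1.6
endogenous_fmol = light_to_heavy * heavy_fmol
```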
A feasible high spatiotemporal resolution breast DCE-MRI protocol for clinical settings.
Tudorica, Luminita A; Oh, Karen Y; Roy, Nicole; Kettler, Mark D; Chen, Yiyi; Hemmingson, Stephanie L; Afzal, Aneela; Grinstead, John W; Laub, Gerhard; Li, Xin; Huang, Wei
2012-11-01
Three dimensional bilateral imaging is the standard for most clinical breast dynamic contrast-enhanced (DCE) MRI protocols. Because of the high spatial resolution (sRes) requirement, the typical 1-2 min temporal resolution (tRes) afforded by a conventional full-k-space-sampling gradient echo (GRE) sequence precludes meaningful and accurate pharmacokinetic analysis of DCE time-course data. The commercially available, GRE-based, k-space undersampling and data sharing TWIST (time-resolved angiography with stochastic trajectories) sequence was used in this study to perform DCE-MRI exams on thirty-one patients (with 36 suspicious breast lesions) before their biopsies. The TWIST DCE-MRI was immediately followed by a single-frame conventional GRE acquisition. Blinded from each other, three radiologist readers assessed agreements in multiple lesion morphology categories between the last set of TWIST DCE images and the conventional GRE images. Fleiss' κ test was used to evaluate inter-reader agreement. The TWIST DCE time-course data were subjected to quantitative pharmacokinetic analyses. With a four-channel phased-array breast coil, the TWIST sequence produced DCE images with 20 s or less tRes and ~1.0 × 1.0 × 1.4 mm³ sRes. There were no significant differences in signal-to-noise (P=.45) and contrast-to-noise (P=.51) ratios between the TWIST and conventional GRE images. The agreements in morphology evaluations between the two image sets were excellent, with the intra-reader agreement ranging from 79% for mass margin to 100% for mammographic density and the inter-reader κ value ranging from 0.54 (P<.0001) for lesion size to 1.00 (P<.0001) for background parenchymal enhancement. Quantitative analyses of the DCE time-course data provided higher breast cancer diagnostic accuracy (91% specificity at 100% sensitivity) than the current clinical practice of morphology and qualitative kinetics assessments.
The TWIST sequence may be used in clinical settings to acquire high spatiotemporal resolution breast DCE-MRI images for both precise lesion morphology characterization and accurate pharmacokinetic analysis. Copyright © 2012 Elsevier Inc. All rights reserved.
41 CFR 60-2.10 - General purpose and contents of affirmative action programs.
Code of Federal Regulations, 2012 CFR
2012-07-01
... number of quantitative analyses designed to evaluate the composition of the workforce of the contractor... affirmative action program must include the following quantitative analyses: (i) Organizational profile—§ 60-2...
41 CFR 60-2.10 - General purpose and contents of affirmative action programs.
Code of Federal Regulations, 2011 CFR
2011-07-01
... number of quantitative analyses designed to evaluate the composition of the workforce of the contractor... affirmative action program must include the following quantitative analyses: (i) Organizational profile—§ 60-2...
41 CFR 60-2.10 - General purpose and contents of affirmative action programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... number of quantitative analyses designed to evaluate the composition of the workforce of the contractor... affirmative action program must include the following quantitative analyses: (i) Organizational profile—§ 60-2...
41 CFR 60-2.10 - General purpose and contents of affirmative action programs.
Code of Federal Regulations, 2013 CFR
2013-07-01
... number of quantitative analyses designed to evaluate the composition of the workforce of the contractor... affirmative action program must include the following quantitative analyses: (i) Organizational profile—§ 60-2...
41 CFR 60-2.10 - General purpose and contents of affirmative action programs.
Code of Federal Regulations, 2014 CFR
2014-07-01
... number of quantitative analyses designed to evaluate the composition of the workforce of the contractor... affirmative action program must include the following quantitative analyses: (i) Organizational profile—§ 60-2...
Quantitative Investigation of the Role of Intra-/Intercellular Dynamics in Bacterial Quorum Sensing.
Leaman, Eric J; Geuther, Brian Q; Behkam, Bahareh
2018-04-20
Bacteria utilize diffusible signals to regulate population density-dependent coordinated gene expression in a process called quorum sensing (QS). While the intracellular regulatory mechanisms of QS are well-understood, the effect of spatiotemporal changes in the population configuration on the sensitivity and robustness of the QS response remains largely unexplored. Using a microfluidic device, we quantitatively characterized the emergent behavior of a population of swimming E. coli bacteria engineered with the lux QS system and a GFP reporter. We show that the QS activation time follows a power law with respect to bacterial population density, but this trend is disrupted significantly by microscale variations in population configuration and genetic circuit noise. We then developed a computational model that integrates population dynamics with genetic circuit dynamics to enable accurate (less than 7% error) quantitation of the bacterial QS activation time. Through modeling and experimental analyses, we show that changes in spatial configuration of swimming bacteria can drastically alter the QS activation time, by up to 22%. The integrative model developed herein also enables examination of the performance robustness of synthetic circuits with respect to growth rate, circuit sensitivity, and the population's initial size and spatial structure. Our framework facilitates quantitative tuning of microbial systems performance through rational engineering of synthetic ribosomal binding sites. We have demonstrated this through modulation of QS activation time over an order of magnitude. Altogether, we conclude that predictive engineering of QS-based bacterial systems requires not only the precise temporal modulation of gene expression (intracellular dynamics) but also accounting for the spatiotemporal changes in population configuration (intercellular dynamics).
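The reported power-law dependence of QS activation time on population density can be recovered from data by a linear fit in log-log space. The sketch below uses synthetic data with an invented exponent and prefactor, purely for illustration; it is not the authors' model.

```python
import numpy as np

# Synthetic activation-time data assumed to follow t = a * rho**b,
# mirroring the reported power-law trend (exponent and prefactor invented).
rho = np.array([1e6, 3e6, 1e7, 3e7, 1e8])   # population density, cells/mL
t_act = 500.0 * rho ** -0.4                  # activation time, arbitrary units

# A power law is linear in log-log space: log t = b*log(rho) + log(a)
b, log_a = np.polyfit(np.log(rho), np.log(t_act), 1)
a = np.exp(log_a)
```

Deviations of measured points from such a fitted line are one way to quantify the disruption attributed to population configuration and genetic circuit noise.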
Bremner, P D; Blacklock, C J; Paganga, G; Mullen, W; Rice-Evans, C A; Crozier, A
2000-06-01
After minimal sample preparation, two different HPLC methodologies, one based on a single gradient reversed-phase HPLC step, the other on multiple HPLC runs each optimised for specific components, were used to investigate the composition of flavonoids and phenolic acids in apple and tomato juices. The principal components in apple juice were identified as chlorogenic acid, phloridzin, caffeic acid and p-coumaric acid. Tomato juice was found to contain chlorogenic acid, caffeic acid, p-coumaric acid, naringenin and rutin. The quantitative estimates of the levels of these compounds, obtained with the two HPLC procedures, were very similar, demonstrating that either method can be used to analyse accurately the phenolic components of apple and tomato juices. Chlorogenic acid in tomato juice was the only component not fully resolved in the single run study and the multiple run analysis prior to enzyme treatment. The single run system of analysis is recommended for the initial investigation of plant phenolics and the multiple run approach for analyses where chromatographic resolution requires improvement.
NASA Astrophysics Data System (ADS)
Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.
2016-01-01
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were performed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering are ratio-manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL⁻¹. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all of its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Experimentally designed 25 synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method.
Applied spectrophotometry: analysis of a biochemical mixture.
Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene
2013-01-01
Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience, and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. Copyright © 2013 Wiley Periodicals, Inc.
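As a worked example of the kind of mixture analysis this exercise targets, absorbances measured at two wavelengths can be inverted for the concentrations of a two-component mixture via the additive Beer-Lambert-Bouguer law, A = Σ ε·l·c. The molar absorptivities and absorbances below are hypothetical values chosen for illustration.

```python
import numpy as np

# Beer-Lambert-Bouguer law for a mixture: at each wavelength the total
# absorbance is the sum of eps * l * c over species. With two species and
# two wavelengths this is a 2x2 linear system. Values are hypothetical.
eps = np.array([[6600.0,  300.0],    # eps of species 1 and 2 at wavelength 1
                [ 900.0, 5400.0]])   # eps at wavelength 2 (M^-1 cm^-1)
l = 1.0                              # path length, cm
A = np.array([0.72, 0.63])           # measured absorbances

c = np.linalg.solve(eps * l, A)      # concentrations, M
```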
ERIC Educational Resources Information Center
Anderson, Lowell Bruce; Bracken, Jerome; Bracken, Marilyn C.
This volume compiles, and presents in integrated form, the Institute for Defense Analyses' (IDA) quantitative analysis of educational quality provided by the Department of Defense's dependent schools. It covers the quantitative aspects of volume 1 in greater detail and presents some analyses deemed too technical for that volume. The first task in…
Manpower Systems Integration Factors for Frigate Design in the Turkish Navy
2016-12-01
factors for frigate design in the Turkish Navy. The qualitative and quantitative analyses of the correlation between ship design specifications and...frigates. The correlation between the ship design characteristics and the manpower requirements is supported by the quantitative analysis. This... design in the Turkish Navy. The qualitative and quantitative analyses of the correlation between ship design specifications and manpower requirements
Investigating the Validity of Two Widely Used Quantitative Text Tools
ERIC Educational Resources Information Center
Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne
2018-01-01
In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…
Study of correlations from Ab-Initio Simulations of Liquid Water
NASA Astrophysics Data System (ADS)
Soto, Adrian; Fernandez-Serra, Marivi; Lu, Deyu; Yoo, Shinjae
An accurate understanding of the dynamics and the structure of H2O molecules in the liquid phase is of extreme importance both from a fundamental and from a practical standpoint. Despite the successes of Molecular Dynamics (MD) with Density Functional Theory (DFT), liquid water remains an extremely difficult material to simulate accurately and efficiently because of the fine balance between the covalent O-H bond, the hydrogen bond and the attractive van der Waals forces. Small errors in these produce dramatic changes in the macroscopic properties of the liquid or in its structural properties. Different density functionals produce answers that differ by as much as 35% in ambient conditions, with none producing quantitative results in agreement with experiment at different mass densities [J. Chem. Phys. 139, 194502 (2013)]. In order to understand these differences we perform an exhaustive scanning of the geometrical coordinates of MD simulations and study their statistical correlations with the simulation output quantities using advanced correlation analyses and machine learning techniques. This work was partially supported by DOE Award No. DE-FG02-09ER16052, by DOE Early Career Award No. DE-SC0003871, by BNL LDRD 16-039 project and BNL Contract No. DE-SC0012704.
NASA Astrophysics Data System (ADS)
Mohamed, Gehad G.; Hamed, Maher M.; Zaki, Nadia G.; Abdou, Mohamed M.; Mohamed, Marwa El-Badry; Abdallah, Abanoub Mosaad
2017-07-01
A simple, accurate and fast spectrophotometric method for the quantitative determination of melatonin (ML) drug in its pure and pharmaceutical forms was developed based on the formation of its charge transfer complex with 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ) as an electron acceptor. The different conditions for this method were optimized accurately. The Lambert-Beer's law was found to be valid over the concentration range of 4-100 μg mL⁻¹ ML. The solid form of the CT complex was structurally characterized by means of different spectral methods. Density functional theory (DFT) and time-dependent density functional theory (TD-DFT) calculations were carried out. The different quantum chemical parameters of the CT complex were calculated. Thermal properties of the CT complex and its kinetic and thermodynamic parameters were studied, and its antimicrobial and antifungal activities were investigated. Molecular docking studies were performed to predict the binding modes of the CT complex components towards E. coli bacterial RNA and the receptor of breast cancer mutant oxidoreductase.
Hawley, James M; Keevil, Brian G
2016-09-01
Liquid chromatography-tandem mass spectrometry (LC-MS/MS) is a powerful analytical technique that offers exceptional selectivity and sensitivity. Used optimally, LC-MS/MS provides accurate and precise results for a wide range of analytes at concentrations that are difficult to quantitate with other methodologies. Its implementation into routine clinical biochemistry laboratories has revolutionised our ability to analyse small molecules such as glucocorticoids. Whereas immunoassays can suffer from matrix effects and cross-reactivity due to interactions with structural analogues, the selectivity offered by LC-MS/MS has largely overcome these limitations. As many clinical guidelines are now beginning to acknowledge the importance of the methodology used to provide results, the advantages associated with LC-MS/MS are gaining wider recognition. With their integral role in both the diagnosis and management of hypo- and hyperadrenal disorders, coupled with their widespread pharmacological use, the accurate measurement of glucocorticoids is fundamental to effective patient care. Here, we provide an up-to-date review of the LC-MS/MS techniques used to successfully measure endogenous glucocorticoids, particular reference is made to serum, urine and salivary cortisol. Copyright © 2016 Elsevier Ltd. All rights reserved.
Determination of phosphate in natural waters by activation analysis of tungstophosphoric acid
Allen, Herbert E.; Hahn, Richard B.
1969-01-01
Activation analysis may be used to determine quantitatively traces of phosphate in natural waters. Methods based on the reaction ³¹P(n,γ)³²P are subject to interference by sulfur and chlorine which give rise to ³²P through n,p and n,α reactions. If the ratio of phosphorus to sulfur or chlorine is small, as it is in most natural waters, accurate analyses by these methods are difficult to achieve. In the activation analysis method, molybdate and tungstate ions are added to samples containing phosphate ion to form tungstomolybdophosphoric acid. The complex is extracted with 2,6-dimethyl-4-heptanone. After activation of an aliquot of the organic phase for 1 hour at a flux of 10¹³ neutrons per cm² per second, the gamma spectrum is essentially that of tungsten-187. The induced activity is proportional to the concentration of phosphate in the sample. A test of the method showed it to give accurate results at concentrations of 4 to at least 200 p.p.b. of phosphorus when an aliquot of 100 μl. was activated. By suitable reagent purification, counting for longer times, and activation of larger aliquots, the detection limit could be lowered several hundredfold.
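The stated proportionality between induced activity and phosphate concentration amounts to a linear calibration: fit a line through standards, then invert it for an unknown. A minimal sketch with hypothetical count data (not from the paper):

```python
import numpy as np

# Linear calibration implied by "induced activity is proportional to the
# concentration of phosphate". All counts are invented for illustration.
std_ppb = np.array([4.0, 20.0, 50.0, 100.0, 200.0])          # P standards, p.p.b.
counts = np.array([210.0, 1010.0, 2540.0, 5020.0, 10050.0])  # 187W gamma counts

slope, intercept = np.polyfit(std_ppb, counts, 1)
unknown_ppb = (3260.0 - intercept) / slope   # sample giving 3260 counts
```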
Mohamed, Gehad G; Hamed, Maher M; Zaki, Nadia G; Abdou, Mohamed M; Mohamed, Marwa El-Badry; Abdallah, Abanoub Mosaad
2017-07-05
A simple, accurate and fast spectrophotometric method for the quantitative determination of melatonin (ML) drug in its pure and pharmaceutical forms was developed based on the formation of its charge transfer complex with 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ) as an electron acceptor. The different conditions for this method were optimized accurately. The Lambert-Beer's law was found to be valid over the concentration range of 4-100μgmL -1 ML. The solid form of the CT complex was structurally characterized by means of different spectral methods. Density functional theory (DFT) and time-dependent density functional theory (TD-DFT) calculations were carried out. The different quantum chemical parameters of the CT complex were calculated. Thermal properties of the CT complex and its kinetic thermodynamic parameters were studied, as well as its antimicrobial and antifungal activities were investigated. Molecular docking studies were performed to predict the binding modes of the CT complex components towards E. coli bacterial RNA and the receptor of breast cancer mutant oxidoreductase. Copyright © 2017 Elsevier B.V. All rights reserved.
Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.
2018-01-01
Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed 18O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718
Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts
Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.
2017-01-18
Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. This work demonstrates that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
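The low-loading regime this abstract starts from is the classic Randles-Sevcik proportionality between voltammetric peak current and concentration. A minimal sketch of that baseline relation is below; the electrode area, diffusivity, scan rate, and melt temperature are invented, and the resistance and cylindrical-diffusion corrections the authors develop are deliberately omitted.

```python
# Randles-Sevcik estimate of concentration from a CV peak current.
# i_p = 0.4463 * n * F * A * c * sqrt(n * F * v * D / (R * T)),
# solved here for c. All parameter values are illustrative only.

import math

F = 96485.0  # Faraday constant, C/mol
R = 8.314    # gas constant, J/(mol K)

def randles_sevcik_conc(i_peak, n, area_cm2, D_cm2_s, scan_V_s, T):
    """Concentration (mol/cm^3) from peak current (A)."""
    k = 0.4463 * n * F * area_cm2 * math.sqrt(
        n * F * scan_V_s * D_cm2_s / (R * T))
    return i_peak / k

c = randles_sevcik_conc(i_peak=0.01, n=3, area_cm2=0.5,
                        D_cm2_s=1e-5, scan_V_s=0.1, T=773.0)
```

At higher weight loadings this simple inversion underpredicts concentration, which is the discrepancy the abstract attributes to uncompensated resistance and cylindrical diffusion.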
Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill
2017-01-01
Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed a theoretical reasoning on the effectiveness of these markers in a trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding. PMID:28729875
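The marker-based prediction idea the review examines can be reduced to a toy additive model: a trait value predicted as a weighted sum of SNP allele doses. The marker effects and genotypes below are invented; real genomic selection estimates the effects by regression over a training population.

```python
# Toy additive marker-based trait prediction: genotype coded as allele
# dose (0/1/2), trait = intercept + sum of dose * per-allele effect.
# Effects and genotypes are hypothetical illustrations.

def predict_trait(genotypes, effects, intercept=0.0):
    """Predicted quantitative trait value for one individual."""
    return intercept + sum(g * e for g, e in zip(genotypes, effects))

effects = [0.8, -0.3, 1.1]   # per-allele effects of three markers
y = predict_trait([2, 1, 0], effects, intercept=10.0)
```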
NASA Astrophysics Data System (ADS)
Clancy, Michael; Belli, Antonio; Davies, David; Lucas, Samuel J. E.; Su, Zhangjie; Dehghani, Hamid
2015-07-01
The subject of superficial contamination and signal origins remains a widely debated topic in the field of Near Infrared Spectroscopy (NIRS), yet the concept of using the technology to monitor an injured brain, in a clinical setting, poses additional challenges concerning the quantitative accuracy of recovered parameters. Using high density diffuse optical tomography probes, quantitatively accurate parameters from different layers (skin, bone and brain) can be recovered from subject specific reconstruction models. This study assesses the use of registered atlas models for situations where subject specific models are not available. Data simulated from subject specific models were reconstructed using the 8 registered atlas models implementing a regional (layered) parameter recovery in NIRFAST. A 3-region recovery based on the atlas model yielded recovered brain saturation values which were accurate to within 4.6% (percentage error) of the simulated values, validating the technique. The recovered saturations in the superficial regions were not quantitatively accurate. These findings highlight differences in superficial (skin and bone) layer thickness between the subject and atlas models. This layer thickness mismatch was propagated through the reconstruction process decreasing the parameter accuracy.
NASA Astrophysics Data System (ADS)
Yuan, Wu; Kut, Carmen; Liang, Wenxuan; Li, Xingde
2017-03-01
Cancer is known to alter the local optical properties of tissues. The detection of OCT-based optical attenuation provides a quantitative method to efficiently differentiate cancer from non-cancer tissues. In particular, the intraoperative use of quantitative OCT is able to provide a direct visual guidance in real time for accurate identification of cancer tissues, especially those without any obvious structural layers, such as brain cancer. However, current methods are suboptimal in providing high-speed and accurate OCT attenuation mapping for intraoperative brain cancer detection. In this paper, we report a novel frequency-domain (FD) algorithm to enable robust and fast characterization of optical attenuation as derived from OCT intensity images. The performance of this FD algorithm was compared with traditional fitting methods by analyzing datasets containing images from freshly resected human brain cancer and from a silica phantom acquired by a 1310 nm swept-source OCT (SS-OCT) system. With graphics processing unit (GPU)-based CUDA C/C++ implementation, this new attenuation mapping algorithm can offer robust and accurate quantitative interpretation of OCT images in real time during brain surgery.
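The "traditional fitting" baseline the FD algorithm is compared against is typically a depth-domain fit of the single-scattering decay I(z) = I0·exp(-2μz). A minimal log-linear version of that fit is sketched below on synthetic, noise-free data; the attenuation value and depth grid are invented.

```python
# Depth-resolved OCT attenuation via a log-domain linear fit of
# I(z) = I0 * exp(-2 * mu * z): the slope of ln(I) vs z equals -2*mu.
# Synthetic noise-free A-scan data for illustration only.

import math

def fit_attenuation(depths_mm, intensities):
    """Least-squares slope of ln(I) vs z; returns mu in mm^-1."""
    logs = [math.log(i) for i in intensities]
    n = len(depths_mm)
    mz = sum(depths_mm) / n
    ml = sum(logs) / n
    slope = sum((z - mz) * (l - ml) for z, l in zip(depths_mm, logs)) / \
            sum((z - mz) ** 2 for z in depths_mm)
    return -slope / 2.0

mu_true = 1.5                                   # mm^-1, invented
zs = [0.1 * k for k in range(10)]               # depth samples, mm
I = [100.0 * math.exp(-2 * mu_true * z) for z in zs]
mu_est = fit_attenuation(zs, I)                 # recovers mu_true
```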
Wills, Jimi; Edwards-Hicks, Joy; Finch, Andrew J
2017-09-19
Metabolic analyses generally fall into two classes: unbiased metabolomic analyses and analyses that are targeted toward specific metabolites. Both techniques have been revolutionized by the advent of mass spectrometers with detectors that afford high mass accuracy and resolution, such as time-of-flights (TOFs) and Orbitraps. One particular area where this technology is key is in the field of metabolic flux analysis because the resolution of these spectrometers allows for discrimination between 13C-containing isotopologues and those containing 15N or other isotopes. While XCMS-based software is freely available for untargeted analysis of mass spectrometric data sets, it does not always identify metabolites of interest in a targeted assay. Furthermore, there is a paucity of vendor-independent software that deals with targeted analyses of metabolites and of isotopologues in particular. Here, we present AssayR, an R package that takes high resolution wide-scan liquid chromatography-mass spectrometry (LC-MS) data sets and tailors peak detection for each metabolite through a simple, iterative user interface. It automatically integrates peak areas for all isotopologues and outputs extracted ion chromatograms (EICs), absolute and relative stacked bar charts for all isotopologues, and a .csv data file. We demonstrate several examples where AssayR provides more accurate and robust quantitation than XCMS, and we propose that tailored peak detection should be the preferred approach for targeted assays. In summary, AssayR provides easy and robust targeted metabolite and stable isotope analyses on wide-scan data sets from high resolution mass spectrometers.
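Downstream of peak integration, the isotopologue summarisation described here (the "relative stacked bar charts") reduces to normalising integrated areas. A small sketch, with invented peak areas rather than AssayR output:

```python
# Relative isotopologue abundances from integrated peak areas
# (M+0, M+1, M+2, ...). Areas are hypothetical illustrations of the
# kind of values a tool like AssayR would integrate.

def isotopologue_fractions(areas):
    """Fractional abundance of each isotopologue."""
    total = sum(areas)
    return [a / total for a in areas]

areas = [800.0, 150.0, 50.0]           # M+0, M+1, M+2 peak areas
fracs = isotopologue_fractions(areas)  # relative abundances
```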
Nuclear model calculations and their role in space radiation research
NASA Technical Reports Server (NTRS)
Townsend, L. W.; Cucinotta, F. A.; Heilbronn, L. H.
2002-01-01
Proper assessments of spacecraft shielding requirements and concomitant estimates of risk to spacecraft crews from energetic space radiation require accurate, quantitative methods of characterizing the compositional changes in these radiation fields as they pass through thick absorbers. These quantitative methods are also needed for characterizing accelerator beams used in space radiobiology studies. Because of the impracticality/impossibility of measuring these altered radiation fields inside critical internal body organs of biological test specimens and humans, computational methods rather than direct measurements must be used. Since composition changes in the fields arise from nuclear interaction processes (elastic, inelastic and breakup), knowledge of the appropriate cross sections and spectra must be available. Experiments alone cannot provide the necessary cross section and secondary particle (neutron and charged particle) spectral data because of the large number of nuclear species and wide range of energies involved in space radiation research. Hence, nuclear models are needed. In this paper current methods of predicting total and absorption cross sections and secondary particle (neutrons and ions) yields and spectra for space radiation protection analyses are reviewed. Model shortcomings are discussed and future needs presented. © 2002 COSPAR. Published by Elsevier Science Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav
2016-03-01
The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
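The validation figures quoted here (mean recovery and relative standard error) follow standard definitions and can be sketched directly; the true/predicted amounts below are invented, not the paper's paracetamol or tramadol data.

```python
# Mean recovery (%) and relative standard error, RSE (%), as commonly
# used to validate chemometric calibration models. The true/predicted
# amounts are hypothetical illustrations.

import math

def mean_recovery_pct(true_vals, pred_vals):
    """Average of predicted/true ratios, as a percentage."""
    return 100.0 * sum(p / t for p, t in zip(pred_vals, true_vals)) / len(true_vals)

def rse_pct(true_vals, pred_vals):
    """Root of summed squared errors relative to summed squared truths."""
    num = sum((p - t) ** 2 for p, t in zip(pred_vals, true_vals))
    den = sum(t ** 2 for t in true_vals)
    return 100.0 * math.sqrt(num / den)

true_vals = [100.0, 200.0, 300.0]
pred_vals = [99.0, 199.0, 301.0]
rec = mean_recovery_pct(true_vals, pred_vals)
err = rse_pct(true_vals, pred_vals)
```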
Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas
2018-02-06
We propose an original concept to accurately measure enantiomeric excesses on proton NMR spectra, which combines high-resolution techniques based on a spatial encoding of the sample with the use of optically active weakly orienting solvents. We show that it is possible to simulate accurately dipolar edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations that can be measured on experimental data, in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find that there is a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method of determination of enantiomeric excesses based on the observation of 1H nuclei.
NASA Astrophysics Data System (ADS)
Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA
2018-03-01
The chemical composition of alloys directly determines their mechanical behaviors and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgy quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out combined correction of plasma temperature and spectral intensity by using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in the copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.
Numerical framework for the modeling of electrokinetic flows
NASA Astrophysics Data System (ADS)
Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.
1998-09-01
This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are of importance in providing a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires the knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally, wall-based effects such as chemical binding sites might exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the parameters required. Experiments were conducted to provide the numerical simulations with a mechanism to extract these parameters based on quantitative comparisons with each other. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for micro-fluid analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses
Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...
Tumour cell dispersion by the ultrasonic aspirator during brain tumour resection.
Preston, J K; Masciopinto, J; Salamat, M S; Badie, B
1999-10-01
Ultrasonic aspirators are commonly used to resect brain tumours because they allow safe, rapid and accurate removal of diseased tissue. Since ultrasonic aspirators generate a spray of aerosolized irrigating fluid around the instrument tip, we questioned whether this spray might contain viable tumour cells that could contribute to intraoperative spread of tumour fragments. To test this hypothesis, we collected the spray produced during the resection of nine brain tumours with an ultrasonic aspirator and semi-quantitatively analysed it for tumour presence. The aerosolized irrigation fluid was found to contain intact tumour cells or clumps of tumour cells in all nine instances, and there was a trend of increasing tumour cell dispersion with increasing ultrasonic aspiration times. Further examination is required to determine if this intraoperative dispersion of apparently viable tumour fragments contributes to local neoplasm recurrence.
Speciation of mercury compounds by differential atomization - atomic absorption spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, J.W.; Skelly, E.M.
This paper describes the dual-stage atomization technique, which allows speciation of several mercury-containing compounds in aqueous solution and in biological fluids. The technique holds great promise for further speciation studies. Accurate temperature control, especially at temperatures less than 200 °C, is needed to separate the extremely volatile mercury halides and simple organomercurials from each other. Studies with mercury salts and EDTA, L-cysteine and dithioxamide demonstrate that this technique may be used to study the extent of complex formation. Investigations of biological fluids indicate that there is a single predominant form of mercury in sweat and a single predominant form of mercury in urine. The mercury compound in urine is more volatile than that in sweat. Both quantitative and qualitative analyses are possible with this technique.
Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana
2014-09-01
Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant to allow identification and to quantify the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the C. officinalis extract composition from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was found to be more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.
[Doppler echocardiography of tricuspid insufficiency. Methods of quantification].
Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P
1994-01-01
Evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method is able to provide an accurate evaluation, mainly through use of the Doppler mode. In addition to new criteria under evaluation (mainly the convergence zone of the regurgitant jet), some indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of the flow in the supra-hepatic veins. The evaluation remains only semi-quantitative, since the calculation of the regurgitation fraction from pulsed Doppler does not seem to be reliable; an accurate semi-quantitative evaluation is made possible by careful and consistent use of all the available criteria. The authors set out to discuss the value of the various evaluation criteria mentioned in the literature and try to define a practical approach.
Finding the bottom and using it
Sandoval, Ruben M.; Wang, Exing; Molitoris, Bruce A.
2014-01-01
Maximizing 2-photon parameters used in acquiring images for quantitative intravital microscopy, especially when high sensitivity is required, remains an open area of investigation. Here we present data on correctly setting the black level of the photomultiplier tube amplifier by adjusting the offset to allow for accurate quantitation of low intensity processes. When the black level is set too high, some low intensity pixel values become zero and a nonlinear degradation in sensitivity occurs, rendering otherwise quantifiable low intensity values virtually undetectable. Initial studies using a series of increasing offsets for a sequence of concentrations of fluorescent albumin in vitro revealed a loss of sensitivity for higher offsets at lower albumin concentrations. A similar decrease in sensitivity, and therefore in the ability to correctly determine the glomerular permeability coefficient of albumin, occurred in vivo at higher offset. Finding the offset that yields accurate and linear data is essential for quantitative analysis when high sensitivity is required. PMID:25313346
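The nonlinearity described above has a simple mechanism: pixel values below the offset clip to zero, so dim signals vanish while bright ones merely shift. A toy sketch with invented intensities:

```python
# Black-level clipping sketch: a too-high offset drives low-intensity
# pixels to zero, destroying linearity at the dim end while bright
# pixels survive. Intensity values are illustrative only.

def detected(true_intensity, offset):
    """Recorded pixel value after offset subtraction and clipping."""
    return max(0.0, true_intensity - offset)

signal = [2.0, 5.0, 10.0, 50.0]
good = [detected(s, 0.0) for s in signal]  # offset correct: linear
bad = [detected(s, 6.0) for s in signal]   # offset too high: dim pixels lost
```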
Quantitative fluorescence tomography using a trimodality system: in vivo validation
Lin, Yuting; Barber, William C.; Iwanczyk, Jan S.; Roeck, Werner W.; Nalcioglu, Orhan; Gulsen, Gultekin
2010-01-01
A fully integrated trimodality fluorescence, diffuse optical, and x-ray computed tomography (FT/DOT/XCT) system for small animal imaging is reported in this work. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration images using a multimodality approach. XCT offers anatomical information, while DOT provides the necessary background optical property map to improve FT image accuracy. The quantitative accuracy of this trimodality system is demonstrated in vivo. In particular, we show that a 2-mm-diam fluorescence inclusion located 8 mm deep in a nude mouse can only be localized when functional a priori information from DOT is available. However, the error in the recovered fluorophore concentration is nearly 87%. On the other hand, the fluorophore concentration can be accurately recovered within 2% error when both DOT functional and XCT structural a priori information are utilized together to guide and constrain the FT reconstruction algorithm. PMID:20799770
Feng, Xiang; Deistung, Andreas; Dwyer, Michael G; Hagemeier, Jesper; Polak, Paul; Lebenberg, Jessica; Frouin, Frédérique; Zivadinov, Robert; Reichenbach, Jürgen R; Schweser, Ferdinand
2017-06-01
Accurate and robust segmentation of subcortical gray matter (SGM) nuclei is required in many neuroimaging applications. FMRIB's Integrated Registration and Segmentation Tool (FIRST) is one of the most popular software tools for automated subcortical segmentation based on T1-weighted (T1w) images. In this work, we demonstrate that FIRST tends to produce inaccurate SGM segmentation results in the case of abnormal brain anatomy, such as is present in atrophied brains, due to a poor spatial match of the subcortical structures with the training data in the MNI space as well as due to insufficient contrast of SGM structures on T1w images. Consequently, such deviations from the average brain anatomy may introduce analysis bias in clinical studies, which may not always be obvious and potentially remain unidentified. To improve the segmentation of subcortical nuclei, we propose to use FIRST in combination with a special Hybrid image Contrast (HC) and Non-Linear (nl) registration module (HC-nlFIRST), where the hybrid image contrast is derived from T1w images and magnetic susceptibility maps to create subcortical contrast that is similar to that in the Montreal Neurological Institute (MNI) template. In our approach, a nonlinear registration replaces FIRST's default linear registration, yielding a more accurate alignment of the input data to the MNI template. We evaluated our method on 82 subjects with particularly abnormal brain anatomy, selected from a database of >2000 clinical cases. Qualitative and quantitative analyses revealed that HC-nlFIRST provides improved segmentation compared to the default FIRST method. Copyright © 2017 Elsevier Inc. All rights reserved.
Digital image analysis: improving accuracy and reproducibility of radiographic measurement.
Bould, M; Barnard, S; Learmonth, I D; Cunningham, J L; Hardy, J R
1999-07-01
To assess the accuracy and reproducibility of a digital image analyser and the human eye, in measuring radiographic dimensions. We experimentally compared radiographic measurement using either an image analyser system or the human eye with digital caliper. The assessment of total hip arthroplasty wear from radiographs relies on both the accuracy of radiographic images and the accuracy of radiographic measurement. Radiographs were taken of a slip gauge (30+/-0.00036 mm) and slip gauge with a femoral stem. The projected dimensions of the radiographic images were calculated by trigonometry. The radiographic dimensions were then measured by blinded observers using both techniques. For a single radiograph, the human eye was accurate to 0.26 mm and reproducible to +/-0.1 mm. In comparison the digital image analyser system was accurate to 0.01 mm with a reproducibility of +/-0.08 mm. In an arthroplasty model, where the dimensions of an object were corrected for magnification by the known dimensions of a femoral head, the human eye was accurate to 0.19 mm, whereas the image analyser system was accurate to 0.04 mm. The digital image analysis system is up to 20 times more accurate than the human eye, and in an arthroplasty model the accuracy of measurement increases four-fold. We believe such image analysis may allow more accurate and reproducible measurement of wear from standard follow-up radiographs.
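The magnification correction used in the arthroplasty model, where an object of known size (the femoral head) calibrates the radiograph, can be sketched as a simple scale factor. The head diameter and measured values below are invented for illustration.

```python
# Radiographic magnification correction: the known femoral-head
# diameter gives a scale factor converting measured image dimensions
# to true object dimensions. Values are hypothetical.

KNOWN_HEAD_MM = 28.0  # assumed true femoral-head diameter

def corrected_dimension(measured_mm, measured_head_mm):
    """True dimension of a feature measured on a magnified radiograph."""
    scale = KNOWN_HEAD_MM / measured_head_mm
    return measured_mm * scale

# Head imaged at 30.8 mm implies 10% magnification, so a feature
# measured at 33.0 mm on the film is really 30.0 mm.
true_mm = corrected_dimension(33.0, 30.8)
```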
NASA Astrophysics Data System (ADS)
Nielsen, Roger L.; Ustunisik, Gokce; Weinsteiger, Allison B.; Tepley, Frank J.; Johnston, A. Dana; Kent, Adam J. R.
2017-09-01
Quantitative models of petrologic processes require accurate partition coefficients. Our ability to obtain accurate partition coefficients is constrained by their dependence on pressure, temperature, and composition, and on the experimental and analytical techniques we apply. The source and magnitude of error in experimental studies of trace element partitioning may go unrecognized if one examines only the processed published data. The most important sources of error are relict crystals, and analyses of more than one phase in the analytical volume. Because we have typically published averaged data, identification of compromised data is difficult if not impossible. We addressed this problem by examining unprocessed data from plagioclase/melt partitioning experiments, by comparing models based on that data with existing partitioning models, and by evaluating the degree to which the partitioning models are dependent on the calibration data. We found that partitioning models are dependent on the calibration data in ways that result in erroneous model values, and that the error will be systematic and dependent on the value of the partition coefficient. In effect, use of different calibration datasets will result in partitioning models whose results are systematically biased, and one can arrive at different and conflicting conclusions depending on how a model is calibrated, defeating the purpose of applying the models. Ultimately this is an experimental data problem, which can be solved if we publish individual analyses (not averages) or use a projection method wherein we use an independent compositional constraint to identify and estimate the uncontaminated composition of each phase.
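The "more than one phase in the analytical volume" error source can be illustrated numerically: if part of the nominal crystal analysis is actually melt, the apparent partition coefficient is biased upward. All concentrations below are invented.

```python
# Sketch of how a mixed analytical volume biases a plagioclase/melt
# partition coefficient D = C_crystal / C_melt. Concentrations (ppm)
# are hypothetical illustrations, not experimental data.

def partition_coefficient(c_crystal, c_melt):
    return c_crystal / c_melt

c_crystal_true, c_melt = 0.5, 10.0
d_true = partition_coefficient(c_crystal_true, c_melt)

# If 10% of the "crystal" analysis volume is actually melt (glass),
# the apparent crystal concentration, and hence D, is overestimated:
c_crystal_mixed = 0.9 * c_crystal_true + 0.1 * c_melt
d_apparent = partition_coefficient(c_crystal_mixed, c_melt)
```

For an incompatible element (D « 1), even a small melt contribution inflates D severalfold, which is why averaged published data can hide this contamination.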
USDA-ARS?s Scientific Manuscript database
Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...
Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M
2018-06-05
Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease. © 2018 S. Karger AG, Basel.
Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip
2012-02-01
The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
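The IDMS quantitation at the core of both SPE methods follows from an isotope-ratio mass balance: the measured light/heavy ratio of the sample-plus-spike blend determines the amount of natural analyte. The abundances and spike amount below are invented, not the glyphosate or methylphosphonic acid values from the paper.

```python
# Single-ratio isotope dilution sketch. With n_x mol of natural
# analyte and n_spike mol of enriched spike,
#   r_meas = (n_x*a_nat_light + n_spike*a_sp_light)
#          / (n_x*a_nat_heavy + n_spike*a_sp_heavy),
# solved here for n_x. All isotopic abundances are illustrative.

def idms_amount(n_spike, r_meas, a_nat_light, a_nat_heavy,
                a_sp_light, a_sp_heavy):
    """Moles of natural analyte from the measured light/heavy ratio."""
    return n_spike * (r_meas * a_sp_heavy - a_sp_light) / \
           (a_nat_light - r_meas * a_nat_heavy)

# Natural analyte ~99% light isotopologue, spike ~99% heavy; a blend
# measuring r = 1 implies equal moles of analyte and spike.
n_x = idms_amount(n_spike=1.0, r_meas=1.0,
                  a_nat_light=0.99, a_nat_heavy=0.01,
                  a_sp_light=0.01, a_sp_heavy=0.99)
```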
Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results
NASA Astrophysics Data System (ADS)
Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.
2014-03-01
Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, rib movement is assessed by fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas of the dynamic bone images and assembled into a velocity map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to accurately quantify rib movements and distinguish them from those of other lung structures. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited rib movement appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.
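The local velocity-vector measurement on consecutive bone-image frames can be approximated by simple block matching. This is a hedged sketch (the abstract does not specify the authors' motion-tracking algorithm): it returns the displacement of one block between two frames, and dividing by the frame interval yields a velocity vector for the map.

```python
import numpy as np

def block_velocity(frame_a, frame_b, y, x, block=16, search=8):
    """Displacement (dy, dx) of a local block between two consecutive
    frames, by exhaustive block matching (minimum sum of squared
    differences over a small search window)."""
    ref = frame_a[y:y + block, x:x + block].astype(float)
    best, best_d = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > frame_b.shape[0] \
                    or xx + block > frame_b.shape[1]:
                continue  # candidate block falls outside the image
            cand = frame_b[yy:yy + block, xx:xx + block].astype(float)
            d = float(np.sum((ref - cand) ** 2))
            if d < best_d:
                best_d, best = d, (dy, dx)
    return best
```

Repeating this over a grid of blocks produces the kind of velocity map compared between scoliosis and normal cases.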
Alados, C.L.; Pueyo, Y.; Giner, M.L.; Navarro, T.; Escos, J.; Barroso, F.; Cabezudo, B.; Emlen, J.M.
2003-01-01
We studied the effect of grazing on the degree of regression in successional vegetation dynamics in a semi-arid Mediterranean matorral. We quantified the spatial distribution patterns of the vegetation by fractal analyses, using the fractal information dimension and spatial autocorrelation measured by detrended fluctuation analysis (DFA). This is the first time that fractal analysis of plant spatial patterns has been used to characterize regressive ecological succession. Plant spatial patterns were compared over a long-term grazing gradient (low, medium and heavy grazing pressure) and on ungrazed sites for two different plant communities: a middle dense matorral of Chamaerops and Periploca at Sabinar-Romeral, and a middle dense matorral of Chamaerops, Rhamnus and Ulex at Requena-Montano. The two communities also differed in microclimatic characteristics (sea-oriented at the Sabinar-Romeral site and inland-oriented at the Requena-Montano site). The information fractal dimension increased as we moved from a middle dense matorral to discontinuous and scattered matorral and, finally, to the late regressive succession at the Stipa steppe stage. At this stage a drastic change in the fractal dimension revealed a change in vegetation structure, accurately indicating end-successional vegetation stages. Long-range correlation analysis (DFA) revealed that an increase in grazing pressure leads to unpredictability (randomness) in species distributions, a reduction in diversity, and an increase in cover of the regressive successional species, e.g. Stipa tenacissima L. These comparisons provide a quantitative characterization of the successional dynamics of plant spatial patterns in response to a grazing perturbation gradient. © 2002 Elsevier Science B.V. All rights reserved.
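The DFA measure used here can be sketched in a few lines. This is a generic first-order DFA implementation, not the authors' code: a scaling exponent near 0.5 corresponds to the "unpredictability (randomness)" reported under heavy grazing, while larger exponents indicate long-range correlation in the spatial series.

```python
import numpy as np

def dfa_alpha(series, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis (DFA-1): slope of log F(n) vs
    log n. alpha ~ 0.5 -> uncorrelated (random) sequence; alpha > 0.5 ->
    long-range correlation."""
    x = np.asarray(series, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    fluctuations = []
    for n in scales:
        n_seg = len(y) // n
        segments = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        ms_resid = []
        for seg in segments:
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            ms_resid.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(ms_resid)))   # F(n)
    alpha = np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]
    return float(alpha)
```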
Blockface histology with optical coherence tomography: a comparison with Nissl staining.
Magnain, Caroline; Augustinack, Jean C; Reuter, Martin; Wachinger, Christian; Frosch, Matthew P; Ragan, Timothy; Akkin, Taner; Wedeen, Van J; Boas, David A; Fischl, Bruce
2014-01-01
Spectral domain optical coherence tomography (SD-OCT) is a high resolution imaging technique that generates excellent contrast based on intrinsic optical properties of the tissue, such as neurons and fibers. The SD-OCT data acquisition is performed directly on the tissue block, diminishing the need for cutting, mounting and staining. We utilized SD-OCT to visualize the laminar structure of the isocortex and compared cortical cytoarchitecture with the gold standard Nissl staining, both qualitatively and quantitatively. In histological processing, distortions routinely affect registration to the blockface image and prevent accurate 3D reconstruction of regions of tissue. We compared blockface registration to SD-OCT and Nissl, respectively, and found that SD-OCT-blockface registration was significantly more accurate than Nissl-blockface registration. Two independent observers manually labeled cortical laminae (e.g. III, IV and V) in SD-OCT images and Nissl stained sections. Our results show that OCT images exhibit sufficient contrast in the cortex to reliably differentiate the cortical layers. Furthermore, the modalities were compared with regard to cortical laminar organization and showed good agreement. Taken together, these SD-OCT results suggest that SD-OCT contains information comparable to standard histological stains such as Nissl in terms of distinguishing cortical layers and architectonic areas. Given these data, we propose that SD-OCT can be used to reliably generate 3D reconstructions of multiple cubic centimeters of cortex that can be used to accurately and semi-automatically perform standard histological analyses. © 2013.
Blockface Histology with Optical Coherence Tomography: A Comparison with Nissl Staining
Magnain, Caroline; Augustinack, Jean C.; Reuter, Martin; Wachinger, Christian; Frosch, Matthew P.; Ragan, Timothy; Akkin, Taner; Wedeen, Van J.; Boas, David A.; Fischl, Bruce
2015-01-01
Spectral domain optical coherence tomography (SD-OCT) is a high resolution imaging technique that generates excellent contrast based on intrinsic optical properties of the tissue, such as neurons and fibers. The SD-OCT data acquisition is performed directly on the tissue block, diminishing the need for cutting, mounting and staining. We utilized SD-OCT to visualize the laminar structure of the isocortex and compared cortical cytoarchitecture with the gold standard Nissl staining, both qualitatively and quantitatively. In histological processing, distortions routinely affect registration to the blockface image and prevent accurate 3D reconstruction of regions of tissue. We compared blockface registration to SD-OCT and Nissl, respectively, and found that SD-OCT-blockface registration was significantly more accurate than Nissl-blockface registration. Two independent observers manually labeled cortical laminae (e.g. III, IV and V) in SD-OCT images and Nissl stained sections. Our results show that OCT images exhibit sufficient contrast in the cortex to reliably differentiate the cortical layers. Furthermore, the modalities were compared with regard to cortical laminar organization and showed good agreement. Taken together, these SD-OCT results suggest that SD-OCT contains information comparable to standard histological stains such as Nissl in terms of distinguishing cortical layers and architectonic areas. Given these data, we propose that SD-OCT can be used to reliably generate 3D reconstructions of multiple cubic centimeters of cortex that can be used to accurately and semi-automatically perform standard histological analyses. PMID:24041872
Hemmateenejad, Bahram; Yazdani, Mahdieh
2009-02-16
Steroids are widely distributed in nature and are found in plants, animals, and fungi in abundance. A data set consisting of a diverse set of steroids was used to develop quantitative structure-electrochemistry relationship (QSER) models for their half-wave reduction potentials. Modeling was established by means of multiple linear regression (MLR) and principal component regression (PCR) analyses. In MLR analysis, the QSPR models were constructed either by first grouping descriptors and then selecting variables stepwise from each group (MLR1), or by stepwise selection of predictor variables from the pool of all calculated descriptors (MLR2). A similar procedure was used in PCR analysis, so that the principal components (or features) were extracted either from different groups of descriptors (PCR1) or from the entire set of descriptors (PCR2). The resulting models were evaluated using cross-validation, chance correlation, prediction of the reduction potentials of test samples, and assessment of the applicability domain. Both MLR approaches produced accurate results; however, the QSPR model found by MLR1 was statistically more significant. The PCR1 approach produced a model as accurate as the MLR approaches, whereas the PCR2 approach gave less accurate results. Overall, the correlation coefficients of cross-validation and prediction of the QSPR models resulting from the MLR1, MLR2 and PCR1 approaches were higher than 90%, which shows the high ability of the models to predict the reduction potentials of the studied steroids.
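A minimal principal component regression of the kind used in the PCR1/PCR2 analyses can be sketched with plain linear algebra. Function and variable names are illustrative, and the stepwise descriptor selection of the MLR1/MLR2 variants is omitted here.

```python
import numpy as np

def pcr_fit_predict(X_train, y_train, X_test, n_components=2):
    """Principal component regression: project the descriptor matrix onto
    its top principal axes, then fit ordinary least squares (with
    intercept) in the reduced space."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # principal axes
    W = Vt[:n_components].T
    T = Xc @ W                                          # component scores
    A = np.column_stack([np.ones(len(T)), T])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    T_test = (np.asarray(X_test) - mu) @ W
    return np.column_stack([np.ones(len(T_test)), T_test]) @ coef
```

Extracting components from descriptor subgroups (PCR1) versus the full descriptor pool (PCR2) amounts to calling this routine on different column subsets of the descriptor matrix.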
Quantitative prediction of phase transformations in silicon during nanoindentation
NASA Astrophysics Data System (ADS)
Zhang, Liangchi; Basak, Animesh
2013-08-01
This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of the pop-out size and the depth of the nanoindentation impression. This simple formula enables a fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which has been impossible before.
Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J
2014-05-15
Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. ¹⁵O-H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O-H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O-H₂O PET with a comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
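A head-to-head accuracy/precision comparison of two quantitative modalities is commonly summarized with Bland-Altman statistics. The abstract does not name the exact analysis used, so this is a generic sketch: the mean bias of the paired differences reflects accuracy, and the width of the limits of agreement reflects precision.

```python
import numpy as np

def bland_altman(a, b):
    """Agreement between two paired measurement series (e.g. regional
    CBF by pCASL vs PET): returns the mean bias and the 95% limits of
    agreement, bias +/- 1.96 * SD of the paired differences."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = float(diff.mean())
    sd = float(diff.std(ddof=1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```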
Ma, Shuguang; Li, Zhiling; Lee, Keun-Joong; Chowdhury, Swapan K
2010-12-20
A simple, reliable, and accurate method was developed for quantitative assessment of metabolite coverage in preclinical safety species by mixing equal volumes of human plasma with blank plasma of animal species and vice versa, followed by analysis using high-resolution full-scan accurate mass spectrometry. This approach provided comparable results (within ±15%) to those obtained from regulated bioanalysis and did not require synthetic standards or radiolabeled compounds. In addition, both qualitative and quantitative data were obtained from a single LC-MS analysis on all metabolites and, therefore, the coverage of any metabolite of interest can be obtained.
Multiphase Method for Analysing Online Discussions
ERIC Educational Resources Information Center
Häkkinen, P.
2013-01-01
Several studies have analysed and assessed online performance and discourse using quantitative and qualitative methods. Quantitative measures have typically included the analysis of participation rates and learning outcomes in terms of grades. Qualitative measures of postings, discussions and context features aim to give insights into the nature…
ERIC Educational Resources Information Center
Pobocik, Tamara J.
2013-01-01
The use of technology and electronic medical records in healthcare has exponentially increased. This quantitative research project used a pretest/posttest design, and reviewed how an educational electronic documentation system helped nursing students to identify the accurate "related to" statement of the nursing diagnosis for the patient in the case…
Crotta, M; Limon, G; Blake, D P; Guitian, J
2017-11-16
Toxoplasma gondii is recognized as a widely prevalent zoonotic parasite worldwide. Although several studies clearly identified meat products as an important source of T. gondii infections in humans, quantitative understanding of the risk posed to humans through the food chain is surprisingly scant. While probabilistic risk assessments for pathogens such as Campylobacter jejuni, Listeria monocytogenes or Escherichia coli have been well established, attempts to quantify the probability of human exposure to T. gondii through consumption of food products of animal origin are at early stages. The biological complexity of the life cycle of T. gondii and limited understanding of several fundamental aspects of the host/parasite interaction, require the adoption of numerous critical assumptions and significant simplifications. In this study, we present a hypothetical quantitative model for the assessment of human exposure to T. gondii through meat products. The model has been conceptualized to capture the dynamics leading to the presence of parasite in meat and, for illustrative purposes, used to estimate the probability of at least one viable cyst occurring in 100g of fresh pork meat in England. Available data, including the results of a serological survey in pigs raised in England were used as a starting point to implement a probabilistic model and assess the fate of the parasite along the food chain. Uncertainty distributions were included to describe and account for the lack of knowledge where necessary. To quantify the impact of the key model inputs, sensitivity and scenario analyses were performed. The overall probability of 100g of a hypothetical edible tissue containing at least 1 cyst was 5.54%. Sensitivity analysis indicated that the variables exerting the greater effect on the output mean were the number of cysts and number of bradyzoites per cyst. 
Under the best and worst scenarios, the probability of a single portion of fresh pork meat containing at least 1 viable cyst was 1.14% and 9.97%, respectively, indicating that the uncertainty and lack of data surrounding key input parameters of the model preclude accurate estimation of T. gondii exposure through consumption of meat products. The hypothetical model conceptualized here is coherent with current knowledge of the biology of the parasite. Simulation outputs clearly identify the key gaps in our knowledge of the host-parasite interaction that, when filled, will support quantitative assessments and much needed accurate estimates of the risk of human exposure. Copyright © 2017 Elsevier B.V. All rights reserved.
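The model's key output, the probability that a portion contains at least one viable cyst, can be reproduced in miniature with a Monte Carlo sketch. All parameter values below are illustrative placeholders, not the paper's inputs; the chain mirrors the structure described: seroprevalence, a cyst burden per infected carcass, and random partitioning of cysts into a 100 g portion.

```python
import numpy as np

rng = np.random.default_rng(42)

def p_portion_contaminated(n_sim=100_000, prev=0.10, mean_cysts=50,
                           carcass_g=40_000.0, portion_g=100.0):
    """Probability that one 100 g portion holds at least one cyst:
    sample infection status (seroprevalence), a Poisson cyst burden per
    infected carcass, and binomial partitioning into the portion."""
    infected = rng.random(n_sim) < prev
    n_cysts = rng.poisson(mean_cysts, n_sim) * infected
    in_portion = rng.binomial(n_cysts, portion_g / carcass_g)
    return float((in_portion >= 1).mean())
```

With these placeholder inputs the analytic answer is prev * (1 - exp(-mean_cysts * portion_g / carcass_g)), which the simulation approximates; the paper's model additionally samples bradyzoite counts and uncertainty distributions.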
Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao
2015-04-01
Reference materials are important for accurate analysis of genetically modified organism (GMO) contents in foods/feeds, and the development of novel reference plasmids is a new trend in GMO reference material research. Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven event-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limits of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R(2)) were more than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations were from 1.08% to 9.84% in simulated sample analysis. All the results demonstrated that the developed multi-targeting plasmid, pSOY, is a credible substitute for matrix-based reference materials, and can be used as a reliable reference calibrator in the identification and quantification of multiple GM soybean events.
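The reported PCR efficiencies (>90%) and R² values come from standard-curve fits of Ct against log copy number; the conversion is standard qPCR practice and can be sketched as:

```python
import numpy as np

def qpcr_efficiency(log10_copies, ct_values):
    """Standard-curve metrics: fit Ct vs log10(copy number); the
    amplification efficiency is E = 10**(-1/slope) - 1 (E = 1.0 means
    100%, i.e. a slope of about -3.32 cycles per decade)."""
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return float(slope), float(intercept), float(efficiency)
```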
Yehia, Ali Mohamed; Essam, Hebatallah Mohamed
2016-09-01
A generally applicable high-performance liquid chromatographic method for the qualitative and quantitative determination of pharmaceutical preparations containing phenylephrine hydrochloride, paracetamol, ephedrine hydrochloride, guaifenesin, doxylamine succinate, and dextromethorphan hydrobromide is developed. Optimization of chromatographic conditions was performed for the gradient elution using different buffer pH values, flow rates and two C18 stationary phases. The method was developed using a Kinetex® C18 column as a core-shell stationary phase with a gradient profile using buffer pH 5.0 and acetonitrile at 2.0 mL/min flow rate. Detection was carried out at 220 nm and linear calibrations were obtained for all components within the studied ranges. The method was fully validated in agreement with ICH guidelines. The proposed method is specific, accurate and precise (RSD% < 3%). Limits of detection are lower than 2.0 μg/mL. Qualitative and quantitative responses were evaluated using experimental design to assist the method robustness. The method was proved to be highly robust against 10% change in buffer pH and flow rate (RSD% < 10%), however, the flow rate may significantly influence the quantitative responses of phenylephrine, paracetamol, and doxylamine (RSD% > 10%). Satisfactory results were obtained for commercial combinations analyses. Statistical comparison between the proposed chromatographic and official methods revealed no significant difference. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.
Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao
2015-06-15
ChIP-seq is a powerful technology to measure protein binding or histone modification strength on a whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), a rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets that accounts for data from control experiments, signal-to-noise ratios, biological variation and multiple-factor experimental designs is underdeveloped. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then take their union to form a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis-testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results than existing ones. An R software package, ChIPComp, is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
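The region-level comparison can be illustrated with a simplified conditional Poisson test. ChIPComp's actual model is a linear model on normalized rates with control-experiment correction; the sketch below only covers the elementary two-condition case, using the fact that, conditional on the total count, one condition's count is binomial under the null of equal depth-normalized rates.

```python
import math

def region_diff_test(k1, k2, depth1, depth2):
    """Two-condition differential-binding test for one candidate region:
    conditional on n = k1 + k2, k1 is Binomial(n, p0) under the null of
    equal depth-normalized Poisson rates, where
    p0 = depth1 / (depth1 + depth2). A normal approximation to the
    binomial gives a two-sided p-value."""
    n = k1 + k2
    if n == 0:
        return 1.0  # no reads in either condition: no evidence
    p0 = depth1 / (depth1 + depth2)
    z = (k1 - n * p0) / math.sqrt(n * p0 * (1 - p0))
    return math.erfc(abs(z) / math.sqrt(2.0))
```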
Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun
2015-10-02
Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct method development/optimization for quantitative proteomics, which nonetheless remains challenging, largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performance of quantitative analysis. As a proof of concept, we employed the EN method to assess quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups, using different experimental and data-processing approaches and various cellular and tissue proteomes. It was found that the choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect the quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and a rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the number of technical/biological replicates per group that affords sufficient power for discovery. 
Furthermore, we assessed the ability of EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures false altered proteins discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
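The EN idea of deriving a ratio cutoff and a false altered-protein discovery rate (FADR) from a measured null distribution can be sketched as follows; the function names and the quantile-based cutoff rule are illustrative, not the paper's exact procedure.

```python
import numpy as np

def null_ratio_cutoffs(null_log2_ratios, fpr=0.05):
    """Protein-level log2 ratios measured between replicates of the SAME
    condition form the experimental null; its tail quantiles give
    fold-change cutoffs that would flag at most ~fpr of null proteins."""
    lo, hi = np.quantile(null_log2_ratios, [fpr / 2.0, 1.0 - fpr / 2.0])
    return lo, hi

def fadr_estimate(n_called_case, n_called_null):
    """False altered-protein discovery rate: the number of 'discoveries'
    the null comparison yields at the same cutoffs estimates the false
    positives among the case-vs-control discoveries."""
    return min(1.0, n_called_null / max(n_called_case, 1))
```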
Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A
2015-01-01
Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed (18)O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.
NASA Astrophysics Data System (ADS)
Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung
2018-06-01
SPECT plays an important role in peptide-receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate because reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177, with and without a uniform background added, was comparable to that of Tc-99m in the axial, radial and tangential directions. The system sensitivity measured for Lu-177 was almost three times less than that of Tc-99m.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeraatkar, Navid; Farahani, Mohammad Hossein; Rahmim, Arman
Purpose: Given increasing efforts in biomedical research utilizing molecular imaging methods, development of dedicated high-performance small-animal SPECT systems has been growing rapidly in the last decade. In the present work, we propose and assess an alternative concept for SPECT imaging enabling desktop open-gantry imaging of small animals. Methods: The system, PERSPECT, consists of an imaging desk, with a set of tilted detector and pinhole collimator placed beneath it. The object to be imaged is simply placed on the desk. Monte Carlo (MC) and analytical simulations were utilized to accurately model and evaluate the proposed concept and design. Furthermore, a dedicated image reconstruction algorithm, finite-aperture-based circular projections (FABCP), was developed and validated for the system, enabling more accurate modeling of the system and higher quality reconstructed images. Image quality was quantified as a function of different tilt angles in the acquisition and number of iterations in the reconstruction algorithm. Furthermore, more complex phantoms including Derenzo, Defrise, and mouse whole body were simulated and studied. Results: The sensitivity of the PERSPECT was 207 cps/MBq. It was quantitatively demonstrated that for a tilt angle of 30°, comparable image qualities were obtained in terms of normalized squared error, contrast, uniformity, noise, and spatial resolution measurements, the latter at ∼0.6 mm. Furthermore, quantitative analyses demonstrated that 3 iterations of FABCP image reconstruction (16 subsets/iteration) led to optimally reconstructed images. Conclusions: The PERSPECT, using a novel imaging protocol, can achieve comparable image quality performance in comparison with a conventional pinhole SPECT with the same configuration.
The dedicated FABCP algorithm, which was developed for reconstruction of data from the PERSPECT system, can produce high quality images for small-animal imaging via accurate modeling of the system as incorporated in the forward- and back-projection steps. Meanwhile, the developed MC model and the analytical simulator of the system can be applied for further studies on development and evaluation of the system.
USDA-ARS?s Scientific Manuscript database
Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...
The SIV plasma viral load assay performed by the Quantitative Molecular Diagnostics Core (QMDC) utilizes reagents specifically designed to detect and accurately quantify the full range of SIV/SHIV viral variants and clones in common usage in the rese
QUANTITATION OF MENSTRUAL BLOOD LOSS: A RADIOACTIVE METHOD UTILIZING A COUNTING DOME
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tauxe, W.N.
A description has been given of a simple, accurate technique for the quantitation of menstrual blood loss, involving the determination of a three-dimensional isosensitivity curve and the fashioning of a lucite dome with cover to fit these specifications. Ten normal subjects lost no more than 50 ml each per menstrual period. (auth)
Quantitative Characterization of the Filiform Mechanosensory Hair Array on the Cricket Cercus
Miller, John P.; Krueger, Susan; Heys, Jeffrey J.; Gedeon, Tomas
2011-01-01
Background Crickets and other orthopteran insects sense air currents with a pair of abdominal appendages resembling antennae, called cerci. Each cercus in the common house cricket Acheta domesticus is approximately 1 cm long, and is covered with 500 to 750 filiform mechanosensory hairs. The distribution of the hairs on the cerci, as well as the global patterns of their movement vectors, have been characterized semi-quantitatively in studies over the last 40 years, and have been shown to be very stereotypical across different animals in this species. Although the cercal sensory system has been the focus of many studies in the areas of neuroethology, development, biomechanics, sensory function and neural coding, there has not yet been a quantitative study of the functional morphology of the receptor array of this important model system. Methodology/Principal Findings We present a quantitative characterization of the structural characteristics and functional morphology of the cercal filiform hair array. We demonstrate that the excitatory direction along each hair's movement plane can be identified by features of its socket that are visible at the light-microscopic level, and that the length of the hair associated with each socket can also be estimated accurately from a structural parameter of the socket. We characterize the length and directionality of all hairs on the basal half of a sample of three cerci, and present statistical analyses of the distributions. Conclusions/Significance The inter-animal variation of several global organizational features is low, consistent with constraints imposed by functional effectiveness and/or developmental processes. Contrary to previous reports, however, we show that the filiform hairs are not re-identifiable in the strict sense. PMID:22132155
Vincent, Delphine; Elkins, Aaron; Condina, Mark R; Ezernieks, Vilnis; Rochfort, Simone
2016-01-01
Cow's milk is an important source of proteins in human nutrition. On average, cow's milk contains 3.5% protein. The most abundant proteins in bovine milk are the caseins and some of the whey proteins, namely beta-lactoglobulin, alpha-lactalbumin, and serum albumin. A number of allelic variants and post-translationally modified forms of these proteins have been identified. Their occurrence varies with breed, individuality, stage of lactation, and health and nutritional status of the animal. It is therefore essential to have reliable methods for their detection and quantitation. Traditionally, the major milk proteins are quantified using liquid chromatography (LC) with ultraviolet (UV) detection. However, as these protein variants co-elute to some degree, another dimension of separation is beneficial to accurately measure their amounts. Mass spectrometry (MS) offers such a tool. In this study, we tested several RP-HPLC and MS parameters to optimise the analysis of intact bovine proteins from milk. From our tests, we developed an optimum method that includes a 20-28-40% phase B gradient with 0.02% TFA in both mobile phases, at a 0.2 mL/min flow rate, using 75°C for the C8 column temperature, scanning every 3 sec over a 600-3000 m/z window. The optimisations were performed using commercially purchased external standards, for which ionisation efficiency, linearity of calibration, LOD, LOQ, sensitivity, selectivity, precision, reproducibility, and mass accuracy were demonstrated. From the MS analysis, we can use extracted ion chromatograms (EICs) of specific ion series of known proteins and integrate the peaks within a defined retention time (RT) window for quantitation. This optimum quantitative method was successfully applied to two bulk milk samples from different breeds, Holstein-Friesian and Jersey, to assess differences in protein variant levels.
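Quantitation from an EIC reduces to integrating intensity over the peak's retention-time window. The sketch below is a minimal stand-in for vendor peak integration; the flat-baseline assumption and function name are ours.

```python
import numpy as np

def eic_peak_area(times, intensities, rt_lo, rt_hi, baseline=0.0):
    """Trapezoidal integration of an extracted ion chromatogram (EIC)
    over a retention-time window, after subtracting a flat baseline and
    clipping negative residuals to zero."""
    t = np.asarray(times, float)
    y = np.clip(np.asarray(intensities, float) - baseline, 0.0, None)
    m = (t >= rt_lo) & (t <= rt_hi)
    t, y = t[m], y[m]
    # trapezoid rule: sum of (y_i + y_{i+1}) / 2 * dt
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)
```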
Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Dias, Fernando José
2017-01-01
Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle, providing accurate quantitative data, such as the number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (−0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments. PMID:29075647
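Two of the quantities reported above — the frontal-view area of a cycle and the area/count correlation — can be sketched in a few lines. The trajectory and per-subject values below are invented for illustration, not study data.

```python
import numpy as np

# Minimal sketch: frontal-view area of one chewing cycle via the shoelace
# formula on its closed 2-D trajectory, plus the correlation between mean
# cycle area and number of cycles. All values are illustrative.

def cycle_area(x, y):
    """Unsigned polygon area of a closed trajectory (shoelace formula)."""
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
x, y = 6.0 * np.cos(t), 10.0 * np.sin(t)             # mm, idealised cycle loop
area = cycle_area(x, y)                              # close to pi * 6 * 10

areas = np.array([120.0, 150.0, 180.0, 210.0, 250.0])  # mean cycle area, mm^2
n_cycles = np.array([34, 30, 27, 24, 20])               # cycles per sequence
r = np.corrcoef(areas, n_cycles)[0, 1]               # negative, as in the study
print(round(float(area), 1), round(float(r), 2))
```

The sign of `r` reproduces the qualitative finding (larger cycles, fewer repetitions); its magnitude here is arbitrary.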
Fuentes, Ramón; Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Kuramochi, Gisaku; Dias, Fernando José
2017-01-01
Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle, providing accurate quantitative data, such as the number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (-0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments.
Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R.; Fernandes, Daryl L.; Satsangi, Jack; Spencer, Daniel I. R.
2015-01-01
Introduction Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. Methods 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid (BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. Results There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status.
Conclusions The three different serum sample tubes cause minimal intra-individual variation in the serum whole N-glycan profile when processed using the described automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures. PMID:25831126
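The quantitative check described above — pairwise correlation of peak-area profiles and verifying that samples from the same individual group together — can be sketched as follows. The peak-area matrix is synthetic; the number of individuals, peaks and tube variants are assumptions for illustration.

```python
import numpy as np

# Sketch: pairwise correlation of chromatogram peak-area profiles, then a
# nearest-neighbour check that each sample is most similar to another sample
# from the same individual. All data below are synthetic.

rng = np.random.default_rng(1)
base = rng.uniform(1.0, 10.0, size=(3, 12))          # 3 individuals, 12 peaks
# three tube/processing variants per individual, with small processing noise
profiles = np.vstack([base + rng.normal(0, 0.05, base.shape) for _ in range(3)])
individual = np.tile(np.arange(3), 3)                # sample -> individual

corr = np.corrcoef(profiles)                         # pairwise correlations
np.fill_diagonal(corr, -np.inf)                      # ignore self-correlation
best_match = corr.argmax(axis=1)                     # most-correlated sample
matched = individual[best_match] == individual       # same individual?
print(bool(matched.all()))
```

With intra-individual noise far smaller than inter-individual differences, every sample's best match comes from the same person, mirroring the clustering result in the abstract.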
Ventham, Nicholas T; Gardner, Richard A; Kennedy, Nicholas A; Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R; Fernandes, Daryl L; Satsangi, Jack; Spencer, Daniel I R
2015-01-01
Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid(BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. 
The three different serum sample tubes cause minimal intra-individual variation in the serum whole N-glycan profile when processed using the described automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures.
Han, Chenggui; Yu, Jialin; Li, Dawei; Zhang, Yongliang
2012-01-01
Nicotiana benthamiana is the most widely-used experimental host in plant virology. The recent release of the draft genome sequence for N. benthamiana consolidates its role as a model for plant–pathogen interactions. Quantitative real-time PCR (qPCR) is commonly employed for quantitative gene expression analysis. For valid qPCR analysis, accurate normalisation of gene expression against an appropriate internal control is required. Yet there has been little systematic investigation of reference gene stability in N. benthamiana under conditions of viral infections. In this study, the expression profiles of 16 commonly used housekeeping genes (GAPDH, 18S, EF1α, SAMD, L23, UK, PP2A, APR, UBI3, SAND, ACT, TUB, GBP, F-BOX, PPR and TIP41) were determined in N. benthamiana and those with acceptable expression levels were further selected for transcript stability analysis by qPCR of complementary DNA prepared from N. benthamiana leaf tissue infected with one of five RNA plant viruses (Tobacco necrosis virus A, Beet black scorch virus, Beet necrotic yellow vein virus, Barley stripe mosaic virus and Potato virus X). Gene stability was analysed in parallel by three commonly-used dedicated algorithms: geNorm, NormFinder and BestKeeper. Statistical analysis revealed that the PP2A, F-BOX and L23 genes were the most stable overall, and that the combination of these three genes was sufficient for accurate normalisation. In addition, the suitability of PP2A, F-BOX and L23 as reference genes was illustrated by expression-level analysis of AGO2 and RdR6 in virus-infected N. benthamiana leaves. This is the first study to systematically examine and evaluate the stability of different reference genes in N. benthamiana. Our results not only provide researchers studying these viruses a shortlist of potential housekeeping genes to use as normalisers for qPCR experiments, but should also guide the selection of appropriate reference genes for gene expression studies of N. benthamiana under other biotic and abiotic stress conditions. PMID:23029521
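The geNorm stability measure used above can be sketched compactly: for each candidate gene, M is the mean standard deviation of log2 expression ratios against every other candidate across samples, with lower M indicating higher stability. The expression matrix here is synthetic, not study data.

```python
import numpy as np

# geNorm-style M-value sketch: mean pairwise variation of log2 ratios.
# Lower M = more stable reference gene. Expression values are synthetic.

def genorm_m(expr):
    """expr: (n_samples, n_genes) array of relative expression values."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        m[j] = ratios.std(axis=0, ddof=1).mean()   # mean pairwise variation
    return m

rng = np.random.default_rng(2)
stable = rng.lognormal(0.0, 0.05, size=(8, 2))     # two stable candidates
unstable = rng.lognormal(0.0, 0.8, size=(8, 1))    # one unstable candidate
m_values = genorm_m(np.hstack([stable, unstable]))
print(m_values.round(2))
```

The unstable gene receives the highest M and would be excluded first, which is exactly how geNorm iteratively ranks candidates such as PP2A, F-BOX and L23.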
Lin, Steven C; Heba, Elhamy; Wolfson, Tanya; Ang, Brandon; Gamst, Anthony; Han, Aiguo; Erdman, John W; O'Brien, William D; Andre, Michael P; Sirlin, Claude B; Loomba, Rohit
2015-07-01
Liver biopsy analysis is the standard method used to diagnose nonalcoholic fatty liver disease (NAFLD). Advanced magnetic resonance imaging is a noninvasive procedure that can accurately diagnose and quantify steatosis, but is expensive. Conventional ultrasound is more accessible but identifies steatosis with low levels of sensitivity, specificity, and quantitative accuracy, and results vary among technicians. A new quantitative ultrasound (QUS) technique can identify steatosis in animal models. We assessed the accuracy of QUS in the diagnosis and quantification of hepatic steatosis, comparing findings with those from magnetic resonance imaging proton density fat fraction (MRI-PDFF) analysis as a reference. We performed a prospective, cross-sectional analysis of a cohort of adults (N = 204) with NAFLD (MRI-PDFF, ≥5%) and without NAFLD (controls). Subjects underwent MRI-PDFF and QUS analyses of the liver on the same day at the University of California, San Diego, from February 2012 through March 2014. QUS parameters and backscatter coefficient (BSC) values were calculated. Patients were assigned randomly to training (n = 102; mean age, 51 ± 17 y; mean body mass index, 31 ± 7 kg/m²) and validation (n = 102; mean age, 49 ± 17 y; body mass index, 30 ± 6 kg/m²) groups; 69% of patients in each group had NAFLD. BSC (range, 0.00005-0.25 1/cm-sr) correlated with MRI-PDFF (Spearman ρ = 0.80; P < .0001). In the training group, the BSC analysis identified patients with NAFLD with an area under the curve value of 0.98 (95% confidence interval, 0.95-1.00; P < .0001). The optimal BSC cut-off value identified patients with NAFLD in the training and validation groups with 93% and 87% sensitivity, 97% and 91% specificity, 86% and 76% negative predictive values, and 99% and 95% positive predictive values, respectively. QUS measurements of BSC can accurately diagnose and quantify hepatic steatosis, based on a cross-sectional analysis that used MRI-PDFF as the reference.
With further validation, QUS could be an inexpensive, widely available method to screen the general or at-risk population for NAFLD. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
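Choosing a diagnostic cut-off and reporting sensitivity/specificity, as done above for the BSC, can be sketched as follows. The BSC values are simulated, and the Youden index is one common cut-off criterion used here as an assumption; the paper does not state its exact rule.

```python
import numpy as np

# Sketch of cut-off selection for a continuous marker (here a simulated
# backscatter coefficient, BSC) using the Youden index. All distributions
# and sample sizes are illustrative, not the study's data.

rng = np.random.default_rng(3)
bsc_controls = rng.lognormal(mean=-9.0, sigma=0.8, size=60)   # no NAFLD
bsc_cases = rng.lognormal(mean=-6.5, sigma=0.8, size=140)     # NAFLD
values = np.concatenate([bsc_controls, bsc_cases])
truth = np.concatenate([np.zeros(60, bool), np.ones(140, bool)])

def youden_cutoff(values, truth):
    """Cut-off maximising sensitivity + specificity - 1 over observed values."""
    best, best_j = None, -1.0
    for c in np.unique(values):
        pred = values >= c
        sens = (pred & truth).sum() / truth.sum()
        spec = (~pred & ~truth).sum() / (~truth).sum()
        if sens + spec - 1 > best_j:
            best, best_j = c, sens + spec - 1
    return best, best_j

cutoff, j = youden_cutoff(values, truth)
print(cutoff, round(j, 2))
```

In a real analysis the cut-off would be fixed on the training group and then applied unchanged to the validation group, as the abstract describes.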
Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André
2011-01-01
Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect in a bimodal mixture situation on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression and dichotomized case-control data sets from the extremes of the quantitative trait analysed by standard logistic regression. Using linear regression, markers with an effect in the extremes of the traits were almost undetectable, whereas analysing extremes by case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumption. Our findings support the idea to re-analyse the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes. Copyright © 2011 S. Karger AG, Basel.
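The study design comparison above can be sketched with a small simulation: a marker whose effect is confined to one component of a bimodal trait mixture is diluted in a full-sample linear regression, while an extremes-only case-control comparison uses the informative individuals directly. The effect size, allele frequency and mixture weight below are illustrative assumptions.

```python
import numpy as np

# Sketch: quantitative-trait linear regression vs. extremes case-control
# design for a marker acting only in one mixture component. All parameters
# are illustrative, not taken from the study.

rng = np.random.default_rng(4)
n = 4000
geno = rng.binomial(2, 0.3, n)                       # additive SNP, MAF 0.3
in_mixture = rng.random(n) < 0.15                    # upper mixture component
trait = rng.normal(0.0, 1.0, n)
trait[in_mixture] += 1.5 + 0.3 * geno[in_mixture]    # effect only there

# Full-sample linear regression: z statistic for the slope
g = geno - geno.mean()
y = trait - trait.mean()
beta = (g @ y) / (g @ g)
resid = y - beta * g
z_linear = beta / np.sqrt((resid @ resid) / (n - 2) / (g @ g))

# Extremes-only design: allele-frequency difference, top vs bottom decile
upper = trait > np.quantile(trait, 0.9)
lower = trait < np.quantile(trait, 0.1)
p_top = geno[upper].mean() / 2.0
p_bot = geno[lower].mean() / 2.0
p_all = (geno[upper].sum() + geno[lower].sum()) / (2.0 * (upper.sum() + lower.sum()))
se = np.sqrt(p_all * (1 - p_all) * (1 / (2 * upper.sum()) + 1 / (2 * lower.sum())))
z_extremes = (p_top - p_bot) / se
print(round(float(z_linear), 2), round(float(z_extremes), 2))
```

Repeating this over many replicates (varying the seed) would give the power curves the authors compare; a single run only illustrates the two test statistics.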
Quantitative Phase Imaging in a Volume Holographic Microscope
NASA Astrophysics Data System (ADS)
Waller, Laura; Luo, Yuan; Barbastathis, George
2010-04-01
We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport-of-intensity equation (TIE) and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.
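The TIE inversion at the heart of this approach can be sketched as follows. Assuming a uniform in-focus intensity I0, the TIE reduces to a Poisson equation, laplacian(phi) = -(k / I0) * dI/dz, solved here with a standard Fourier method; the test object, defocus spacing and wavelength are illustrative values, not parameters from the paper.

```python
import numpy as np

# Sketch of transport-of-intensity (TIE) phase recovery from two intensity
# planes, under a uniform-intensity assumption. All parameters illustrative.

def tie_phase(i_minus, i_plus, dz, wavelength, pixel):
    """Recover phase (radians) from two intensity planes at -dz and +dz."""
    k = 2.0 * np.pi / wavelength
    didz = (i_plus - i_minus) / (2.0 * dz)           # axial intensity derivative
    i0 = 0.5 * (i_plus + i_minus).mean()
    rhs = -(k / i0) * didz                           # = laplacian(phi)
    ny, nx = rhs.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    f2 = fx[None, :] ** 2 + fy[:, None] ** 2
    f2[0, 0] = np.inf                                # DC term is undetermined
    phi = np.fft.ifft2(np.fft.fft2(rhs) / (-4.0 * np.pi ** 2 * f2)).real
    return phi - phi.mean()

# Forward-simulate two defocused planes from a known phase, then recover it
n, pixel, wavelength, dz = 64, 1e-6, 500e-9, 2e-6
yy, xx = np.mgrid[0:n, 0:n]
phi_true = np.exp(-((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 60.0)
phi_true -= phi_true.mean()

fx = np.fft.fftfreq(n, d=pixel)
f2 = fx[None, :] ** 2 + fx[:, None] ** 2
lap = np.fft.ifft2(np.fft.fft2(phi_true) * (-4.0 * np.pi ** 2 * f2)).real
i_plus = 1.0 - (dz * wavelength / (2.0 * np.pi)) * lap   # I(z) = I0 - (I0 z/k) lap(phi)
i_minus = 1.0 + (dz * wavelength / (2.0 * np.pi)) * lap
phi_rec = tie_phase(i_minus, i_plus, dz, wavelength, pixel)
print(np.abs(phi_rec - phi_true).max())
```

In the VHM the two (or more) planes come from a single multiplexed-hologram exposure rather than mechanical refocusing, which is what makes single-shot TIE recovery possible.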
Wagner, David W; Reed, Matthew P; Chaffin, Don B
2010-11-01
Accurate prediction of foot placements in relation to hand locations during manual materials handling tasks is critical for prospective biomechanical analysis. To address this need, the effects of lifting task conditions and anthropometric variables on foot placements were studied in a laboratory experiment. In total, 20 men and women performed two-handed object transfers that required them to walk to a shelf, lift an object from the shelf at waist height and carry the object to a variety of locations. Five different changes in the direction of progression following the object pickup were used, ranging from 45° to 180° relative to the approach direction. Object weights of 1.0 kg, 4.5 kg and 13.6 kg were used. Whole-body motions were recorded using a 3-D optical retro-reflective marker-based camera system. A new parametric system for describing foot placements, the Quantitative Transition Classification System, was developed to facilitate the parameterisation of foot placement data. Foot placements chosen by the subjects during the transfer tasks appeared to facilitate a change in the whole-body direction of progression, in addition to aiding in performing the lift. Further analysis revealed that five different stepping behaviours accounted for 71% of the stepping patterns observed. More specifically, the most frequently observed behaviour revealed that the orientation of the lead foot during the actual lifting task was primarily affected by the amount of turn angle required after the lift (R² = 0.53). One surprising result was that the object mass (scaled by participant body mass) was not found to significantly affect any of the individual step placement parameters. Regression models were developed to predict the most prevalent step placements and are included in this paper to facilitate more accurate human motion simulations and ergonomics analyses of manual material lifting tasks.
STATEMENT OF RELEVANCE: This study proposes a method for parameterising the steps (foot placements) associated with manual material handling tasks. The influence of task conditions and subject anthropometry on the foot placements of the most frequently observed stepping pattern during a laboratory study is discussed. For prospective postural analyses conducted using digital human models, accurate prediction of the foot placements is critical to realistic postural analyses and improved biomechanical job evaluations.
NASA Astrophysics Data System (ADS)
Muhonda, P.; Mabiza, C.; Makurira, H.; Kujinga, K.; Nhapi, I.; Goldin, J.; Mashauri, D. A.
In recent years, the frequency of occurrence of floods has increased in Southern Africa. An increase in the frequency of extreme events is partly attributed to climate change. Floods negatively impact on livelihoods, especially those classified as poor, mainly by reducing livelihood options and also contributing to reduced crop yields. In response to these climatic events, governments within Southern Africa have formulated policies which try to mitigate the impacts of floods. Floods can be deadly, often occurring at short notice, lasting for short periods, and causing widespread damage to infrastructure. This study analysed institutional mechanisms in Mbire District of Zimbabwe which aim at mitigating the impact of floods. The study used both quantitative (i.e. questionnaires) and qualitative (i.e. key informant interviews, focus group discussions and observations) data collection methods. Secondary data such as policy and legislation documents and operational manuals of organisations that support communities affected by disasters were reviewed. Qualitative data were analysed using a thematic approach, with social network analysis performed using UCINET 6. Quantitative data were analysed using SPSS 19.0. The study found that an institutional framework has been developed at the national and local levels to support communities in the study area in responding to the impacts of floods. This is supported by various pieces of legislation that are housed in different government departments. However, the existing institutional framework does not effectively strengthen disaster management mechanisms at the local level. Lack of financial resources and appropriate training and skills to undertake flood management activities reduce the capacity of communities and disaster management organisations to effectively mitigate the impacts of floods. The study also found that there are inadequate hydro-meteorological stations to enable accurate forecasts.
Even in those cases where forecasts predicting extreme weather events have been made, communities have difficulties accessing and interpreting such forecasts due to inadequate communication systems. Such factors reduce the preparedness of communities to deal with extreme weather events.
Schulte, Friederike A; Lambers, Floor M; Mueller, Thomas L; Stauber, Martin; Müller, Ralph
2014-04-01
Time-lapsed in vivo micro-computed tomography is a powerful tool to analyse longitudinal changes in the bone micro-architecture. Registration can overcome problems associated with spatial misalignment between scans; however, it requires image interpolation which might affect the outcome of a subsequent bone morphometric analysis. The impact of the interpolation error itself, though, has not been quantified to date. Therefore, the purpose of this ex vivo study was to elaborate the effect of different interpolator schemes [nearest neighbour, tri-linear and B-spline (BSP)] on bone morphometric indices. None of the interpolator schemes led to significant differences between interpolated and non-interpolated images, with the lowest interpolation error found for BSPs (1.4%). Furthermore, depending on the interpolator, the processing order of registration, Gaussian filtration and binarisation played a role. Independent from the interpolator, the present findings suggest that the evaluation of bone morphometry should be done with images registered using greyscale information.
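The interpolation comparison described above can be sketched by shifting a smooth synthetic volume with each of the three schemes and measuring the error against the analytically shifted ground truth. The volume, shift and scipy-based implementation are our illustrative assumptions, not the study's micro-CT data or code.

```python
import numpy as np
from scipy import ndimage

# Sketch: interpolation error for nearest neighbour (spline order 0),
# tri-linear (order 1) and cubic B-spline (order 3) on a smooth synthetic
# volume shifted by 0.4 voxel. All data are synthetic.

def f(z, y, x):
    return np.sin(x / 4.0) * np.cos(y / 5.0) * np.sin(z / 6.0)

z, y, x = np.mgrid[0:32, 0:32, 0:32]
vol = f(z, y, x)
truth = f(z - 0.4, y - 0.4, x - 0.4)                 # exact shifted volume

errors = {}
core = (slice(4, -4),) * 3                           # ignore boundary effects
for order, name in [(0, "nearest"), (1, "tri-linear"), (3, "B-spline")]:
    shifted = ndimage.shift(vol, 0.4, order=order, mode="nearest")
    errors[name] = float(np.abs(shifted[core] - truth[core]).mean())
print({k: round(v, 5) for k, v in errors.items()})
```

For smooth data the error ordering (B-spline < tri-linear < nearest) matches the study's finding that B-splines gave the lowest interpolation error.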
Receptor-based 3D-QSAR in Drug Design: Methods and Applications in Kinase Studies.
Fang, Cheng; Xiao, Zhiyan
2016-01-01
Receptor-based 3D-QSAR strategy represents a superior integration of structure-based drug design (SBDD) and three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis. It combines the accurate prediction of ligand poses by the SBDD approach with the good predictability and interpretability of statistical models derived from the 3D-QSAR approach. Extensive efforts have been devoted to the development of receptor-based 3D-QSAR methods and two alternative approaches have been exploited. One involves computing the binding interactions between a receptor and a ligand to generate structure-based descriptors for QSAR analyses. The other concerns the application of various docking protocols to generate optimal ligand poses so as to provide reliable molecular alignments for the conventional 3D-QSAR operations. This review highlights new concepts and methodologies recently developed in the field of receptor-based 3D-QSAR, and in particular, covers its application in kinase studies.
Evaluation of the AMSR-E Data Calibration Over Land
NASA Technical Reports Server (NTRS)
Njoku, E.; Chan, T.; Crosson, W.; Limaye, A.
2004-01-01
Land observations by the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E), particularly of soil and vegetation moisture changes, have numerous applications in hydrology, ecology and climate. Quantitative retrieval of soil and vegetation parameters relies on accurate calibration of the brightness temperature measurements. Analyses of the spectral and polarization characteristics of early versions of the AMSR-E data revealed significant calibration biases over land at 6.9 GHz. The biases were estimated and removed in the current archived version of the data. Radio-frequency interference (RFI) observed at 6.9 GHz is, however, more difficult to quantify. A calibration analysis of AMSR-E data over land is presented in this paper for a complete annual cycle from June 2002 through September 2003. The analysis indicates the general high quality of the data for land applications (except for RFI), and illustrates seasonal trends of the data for different land surface types and regions.
Zhou, Lei; Wang, Rui; Yao, Chi; Li, Xiaomin; Wang, Chengli; Zhang, Xiaoyan; Xu, Congjian; Zeng, Aijun; Zhao, Dongyuan; Zhang, Fan
2015-04-24
The identification of potential diagnostic markers and target molecules among the plethora of tumour oncoproteins for cancer diagnosis requires facile technology that is capable of quantitatively analysing multiple biomarkers in tumour cells and tissues. Diagnostic and prognostic classifications of human tumours are currently based on the western blotting and single-colour immunohistochemical methods that are not suitable for multiplexed detection. Herein, we report a general and novel method to prepare single-band upconversion nanoparticles with different colours. The expression levels of three biomarkers in breast cancer cells were determined using single-band upconversion nanoparticles, western blotting and immunohistochemical technologies with excellent correlation. Significantly, the application of antibody-conjugated single-band upconversion nanoparticle molecular profiling technology can achieve the multiplexed simultaneous in situ biodetection of biomarkers in breast cancer cells and tissue specimens and produce more accurate results for the simultaneous quantification of proteins present at low levels compared with classical immunohistochemical technology.
Cartographic quality of ERTS-1 images
NASA Technical Reports Server (NTRS)
Welch, R. I.
1973-01-01
Analyses of simulated and operational ERTS images have provided initial estimates of resolution, ground resolution, detectability thresholds and other measures of image quality of interest to earth scientists and cartographers. Based on these values, including an approximate ground resolution of 250 meters for both RBV and MSS systems, the ERTS-1 images appear suited to the production and/or revision of planimetric and photo maps of 1:500,000 scale and smaller for which map accuracy standards are compatible with the imaged detail. Thematic mapping, although less constrained by map accuracy standards, will be influenced by measurement thresholds and errors which have yet to be accurately determined for ERTS images. This study also indicates the desirability of establishing a quantitative relationship between image quality values and map products which will permit both engineers and cartographers/earth scientists to contribute to the design requirements of future satellite imaging systems.
Spacecraft self-contamination due to back-scattering of outgas products
NASA Technical Reports Server (NTRS)
Robertson, S. J.
1976-01-01
The back-scattering of outgas contamination near an orbiting spacecraft due to intermolecular collisions was analyzed. Analytical tools were developed for making reasonably accurate quantitative estimates of the outgas contamination return flux, given a knowledge of the pertinent spacecraft and orbit conditions. Two basic collision mechanisms were considered: (1) collisions involving only outgas molecules (self-scattering) and (2) collisions between outgas molecules and molecules in the ambient atmosphere (ambient-scattering). For simplicity, the geometry was idealized to a uniformly outgassing sphere and to a disk oriented normal to the freestream. The method of solution involved an integration of an approximation of the Boltzmann kinetic equation known as the BGK (or Krook) model equation. Results were obtained in the form of simple equations relating outgas return flux to spacecraft and orbit parameters. Results were compared with previous analyses based on more simplistic models of the collision processes.
OpenMS: a flexible open-source software platform for mass spectrometry data analysis.
Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver
2016-08-30
High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.
Variability of manual ciliary muscle segmentation in optical coherence tomography images.
Chang, Yu-Cherng; Liu, Keke; Cabot, Florence; Yoo, Sonia H; Ruggeri, Marco; Ho, Arthur; Parel, Jean-Marie; Manns, Fabrice
2018-02-01
Optical coherence tomography (OCT) offers new options for imaging the ciliary muscle, allowing direct in vivo visualization. However, variation in image quality along the length of the muscle prevents accurate delineation and quantification of the muscle. Quantitative analyses of the muscle are accompanied by variability in segmentation between examiners and between sessions for the same examiner. In processes such as accommodation, where changes in muscle thickness may be tens of microns, the equivalent of a small number of image pixels, differences in segmentation can influence the magnitude and potentially the direction of thickness change. A detailed analysis of variability in ciliary muscle thickness measurements was performed to serve as a benchmark for the extent of this variability in studies on the ciliary muscle. Variation between sessions and examiners was found to be insignificant, but the magnitude of variation should be considered when interpreting ciliary muscle results.
Internet-based transfer of cardiac ultrasound images
NASA Technical Reports Server (NTRS)
Firstenberg, M. S.; Greenberg, N. L.; Garcia, M. J.; Morehead, A. J.; Cardon, L. A.; Klein, A. L.; Thomas, J. D.
2000-01-01
A drawback to large-scale multicentre studies is the time required for the centralized evaluation of diagnostic images. We evaluated the feasibility of digital transfer of echocardiographic images to a central laboratory for rapid and accurate interpretation. Ten patients undergoing trans-oesophageal echocardiographic scanning at three sites had representative single images and multiframe loops stored digitally. The images were analysed in the ordinary way. All images were then transferred via the Internet to a central laboratory and reanalysed by a different observer. The file sizes were 1.5-72 MByte and the transfer rates achieved were 0.6-4.8 Mbit/min. Quantitative measurements were similar between most on-site and central laboratory measurements (all P > 0.25), although measurements differed for left atrial width and pulmonary venous systolic velocities (both P < 0.05). Digital transfer of echocardiographic images and data to a central laboratory may be useful for multicentre trials.
Ono, Yohei; Kashihara, Rina; Yasojima, Nobutoshi; Kasahara, Hideki; Shimizu, Yuka; Tamura, Kenichi; Tsutsumi, Kaori; Sutherland, Kenneth; Koike, Takao; Kamishima, Tamotsu
2016-06-01
Accurate evaluation of joint space width (JSW) is important in the assessment of rheumatoid arthritis (RA). In clinical radiography of bilateral hands, the oblique incidence of X-rays is unavoidable, which may cause perceptual or measurement errors in JSW. The objective of this study was to examine whether tomosynthesis, a recently developed modality, can facilitate a more accurate evaluation of JSW than radiography under the condition of oblique incidence of X-rays. We investigated quantitative errors derived from the oblique incidence of X-rays by imaging phantoms simulating various finger joint spaces using radiographs and tomosynthesis images. We then compared the qualitative results of the modified total Sharp score of a total of 320 joints from 20 patients with RA between these modalities. A quantitative error was prominent when the location of the phantom was shifted along the JSW direction. Modified total Sharp scores of tomosynthesis images were significantly higher than those of radiography; that is, JSW was regarded as narrower in tomosynthesis than in radiography when finger joints were located where the oblique incidence of X-rays is expected in the JSW direction. Tomosynthesis can facilitate accurate evaluation of JSW in finger joints of patients with RA, even with oblique incidence of X-rays. Accurate evaluation of JSW is necessary for the management of patients with RA. Through phantom and clinical studies, we demonstrate that tomosynthesis may achieve more accurate evaluation of JSW.
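A toy geometric sketch (our illustration, not the paper's model) shows why oblique incidence can make the joint space look narrower: when the beam is tilted by theta relative to the joint-space axis, bone of depth t along the beam is projected over the gap, shrinking the apparent width of a true gap w by roughly t * tan(theta).

```python
import math

# Toy slit-projection model of apparent joint space width (JSW) under an
# oblique X-ray beam. The widths, depths and angles are illustrative.

def apparent_jsw(true_width_mm, bone_depth_mm, theta_deg):
    """Projected JSW under a beam tilted theta_deg from the joint axis."""
    shrink = bone_depth_mm * math.tan(math.radians(theta_deg))
    return max(true_width_mm - shrink, 0.0)

for theta in (0, 5, 10, 15):
    print(theta, round(apparent_jsw(2.0, 4.0, theta), 2))
```

Even a few degrees of obliquity produces sub-millimetre apparent narrowing, which is consistent with the abstract's observation that JSW appears narrower where oblique incidence is expected.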
Quantifying Forest Soil Physical Variables Potentially Important for Site Growth Analyses
John S. Kush; Douglas G. Pitt; Phillip J. Craul; William D. Boyer
2004-01-01
Accurate mean plot values of forest soil factors are required for use as independent variables in site-growth analyses. Adequate accuracy is often difficult to attain because soils are inherently widely variable. Estimates of the variability of appropriate soil factors influencing growth can be used to determine the sampling intensity required to secure accurate mean...
Quantitative Phase Microscopy for Accurate Characterization of Microlens Arrays
NASA Astrophysics Data System (ADS)
Grilli, Simonetta; Miccio, Lisa; Merola, Francesco; Finizio, Andrea; Paturzo, Melania; Coppola, Sara; Vespini, Veronica; Ferraro, Pietro
Microlens arrays are of fundamental importance in a wide variety of applications in optics and photonics. This chapter deals with an accurate digital holography-based characterization of both liquid and polymeric microlenses fabricated by an innovative pyro-electrowetting process. The actuation of liquid and polymeric films is obtained through the use of pyroelectric charges generated into polar dielectric lithium niobate crystals.
Automated selected reaction monitoring software for accurate label-free protein quantification.
Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik
2012-07-06
Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi
The aim of this study was to investigate whether the use of quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratios) in combination with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargement, and tonsillar herniation) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curves were used to evaluate the diagnostic performance of the quantitative metrics and qualitative signs, and optimum cutoff values of the quantitative metrics for the diagnosis of IH were determined by ROC analysis. Combined ROC curves were computed for combinations of quantitative metrics and qualitative signs; diagnostic accuracy, sensitivity, specificity, and positive and negative predictive values were determined, and the best combined model was identified. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH than in the control group (P < 0.001), the mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). Among qualitative signs, the highest individual discriminative power was dural enhancement, with an area under the ROC curve (AUC) of 0.838. Among quantitative metrics, the highest individual discriminative power was MPD, with an AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by the combination of dural enhancement, venous engorgement, and MPD, with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had a diagnostic accuracy of 100% for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be achieved by combining quantitative metrics with qualitative signs.
Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René
2005-09-01
Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. 
We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that highly accurate results can be obtained using the automatic data processing described here compared to conventional methods.
Exploring the gender gap in the conceptual survey of electricity and magnetism
NASA Astrophysics Data System (ADS)
Henderson, Rachel; Stewart, Gay; Stewart, John; Michaluk, Lynnette; Traxler, Adrienne
2017-12-01
The "gender gap" on various physics conceptual evaluations has been extensively studied. Men's average pretest scores on the Force Concept Inventory and Force and Motion Conceptual Evaluation are 13% higher than women's, and their post-test scores are on average 12% higher. This study analyzed the gender differences within the Conceptual Survey of Electricity and Magnetism (CSEM), in which the gender gap has been less well studied and is less consistent. In the current study, data collected from 1407 students (77% men, 23% women) in a calculus-based physics course over ten semesters showed that male students outperformed female students on the CSEM pretest (by 5%) and post-test (by 6%). Separate analyses were conducted for qualitative and quantitative problems on lab quizzes and course exams and showed that male students outperformed female students by 3% on qualitative quiz and exam problems. Male and female students performed equally on the quantitative course exam problems. The gender gaps within CSEM post-test scores, qualitative lab quiz scores, and qualitative exam scores were insignificant for students with a CSEM pretest score of 25% or less but grew as pretest scores increased. Structural equation modeling demonstrated that a latent variable, called Conceptual Physics Performance/Non-Quantitative (CPP/NonQnt), orthogonal to quantitative test performance, was useful in explaining the differences observed in qualitative performance; this variable was most strongly related to CSEM post-test scores. The CPP/NonQnt of male students was 0.44 standard deviations higher than that of female students. The CSEM pretest measured CPP/NonQnt much less accurately for women (R² = 4%) than for men (R² = 17%). The failure to detect a gender gap for students scoring 25% or less on the pretest suggests that the CSEM instrument itself is not gender biased.
The failure to find a performance difference in quantitative test performance while detecting a gap in qualitative performance suggests the qualitative differences do not result from psychological factors such as science anxiety or stereotype threat.
Dulohery, Kate; Papavdi, Asteria; Michalodimitrakis, Manolis; Kranioti, Elena F
2012-11-01
Coronary artery atherosclerosis is a hugely prevalent condition in the Western World and is often encountered during autopsy. Atherosclerotic plaques can cause luminal stenosis which, if it exceeds a significant level (75%), is considered to contribute to the cause of death. Estimation of stenosis can be performed macroscopically by forensic pathologists at the time of autopsy or by microscopic examination. This study compares macroscopic estimation with quantitative microscopic image analysis, with a particular focus on the assessment of significant stenosis (>75%). A total of 131 individuals were analysed. The sample consists of an atherosclerotic group (n=122) and a control group (n=9). The results of the two methods were significantly different from each other (p=0.001), and the macroscopic method gave a greater percentage stenosis by an average of 3.5%. Also, histological examination of coronary artery stenosis yielded a difference in significant stenosis in 11.5% of cases. The differences were attributed to underestimation by histological quantitative image analysis, overestimation by gross examination, or a combination of both. The underestimation may have come from tissue shrinkage during processing of the histological specimens. The overestimation in the macroscopic assessment can be attributed to the lumen shape, to observer error, or to a possible bias towards diagnosing coronary disease when no other cause of death is apparent. The results indicate that the macroscopic estimation is open to more biases and that only histological quantitative image analysis gives a precise assessment of stenosis ex vivo. Once tissue shrinkage, if any, is accounted for, histological quantitative image analysis will yield a more accurate assessment of in vivo stenosis. It may then be considered a complementary tool for the examination of coronary stenosis. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labuda, Aleksander; Proksch, Roger
An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity—a longstanding goal in the electromechanical community.
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index for measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
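As a back-of-the-envelope companion to this kind of analysis, the elementary shot-noise scaling (phase noise falling as 1/√N with the number of detected photoelectrons) can be sketched as below. This is the textbook scaling only, not the paper's Cramér-Rao or algorithmic derivation, and the numbers are illustrative:

```python
import math

def shot_noise_phase_sensitivity(n_electrons, visibility=1.0):
    """Shot-noise-limited phase standard deviation (rad) for a fringe
    built from n_electrons detected photoelectrons.
    Elementary scaling: sigma_phi ~ 1 / (V * sqrt(N))."""
    return 1.0 / (visibility * math.sqrt(n_electrons))

def pathlength_sensitivity_nm(n_electrons, wavelength_nm, visibility=1.0):
    # Convert phase noise to optical pathlength noise: dL = (lambda / 2*pi) * sigma_phi
    dphi = shot_noise_phase_sensitivity(n_electrons, visibility)
    return wavelength_nm / (2.0 * math.pi) * dphi

# e.g. ~1e5 photoelectrons per pixel at 633 nm gives sub-nanometre pathlength noise
print(pathlength_sensitivity_nm(100_000, 633.0))  # ~0.32 nm
```

The paper's bounds refine this scaling by accounting for the specific phase-shifting algorithm; the 1/√N dependence on collected charge is what makes full-well capacity a key design parameter.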
Ryan, C M; Yarmush, M L; Tompkins, R G
1992-04-01
Polyethylene glycol 3350 (PEG 3350) is useful as an orally administered probe to measure in vivo intestinal permeability to macromolecules. Previous methods to detect polyethylene glycol (PEG) excreted in the urine have been hampered by inherent inaccuracies associated with liquid-liquid extraction and turbidimetric analysis. For accurate quantitation by previous methods, radioactive labels were required. This paper describes a method to separate and quantitate PEG 3350 and PEG 400 in human urine that is independent of radioactive labels and is accurate in clinical practice. The method uses sized regenerated cellulose membranes and mixed ion-exchange resin for sample preparation and high-performance liquid chromatography with refractive index detection for analysis. The 24-h excretion for normal individuals after an oral dose of 40 g of PEG 3350 and 5 g of PEG 400 was 0.12 +/- 0.04% of the original dose of PEG 3350 and 26.3 +/- 5.1% of the original dose of PEG 400.
Funderburg, Rebecca; Arevalo, Ricardo; Locmelis, Marek; Adachi, Tomoko
2017-11-01
Laser ablation ICP-MS enables streamlined, high-sensitivity measurements of rare earth element (REE) abundances in geological materials. However, many REE isotope mass stations are plagued by isobaric interferences, particularly from diatomic oxides and argides. In this study, we compare REE abundances quantitated from mass spectra collected with low-resolution (m/Δm = 300 at 5% peak height) and medium-resolution (m/Δm = 2500) mass discrimination. A wide array of geological samples was analyzed, including USGS and NIST glasses ranging from mafic to felsic in composition, with NIST 610 employed as the bracketing calibrating reference material. The medium-resolution REE analyses are shown to be significantly more accurate and precise (at the 95% confidence level) than low-resolution analyses, particularly in samples characterized by low (<μg/g levels) REE abundances. A list of preferred mass stations that are least susceptible to isobaric interferences is reported. These findings impact the reliability of REE abundances derived from LA-ICP-MS methods, particularly those relying on mass analyzers that do not offer tuneable mass-resolution and/or collision cell technologies that can reduce oxide and/or argide formation.
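To see why the choice between m/Δm = 300 and 2500 matters, the resolving power needed to separate an analyte from an isobaric interference follows directly from the mass difference. The snippet below works through the classic 156Gd+ vs 140Ce16O+ oxide interference using approximate isotopic masses; this is an illustrative textbook case, not one drawn from this study:

```python
def required_resolution(m_analyte, m_interferent):
    """Resolving power m/dm needed to just separate two species
    at nominally the same mass."""
    dm = abs(m_analyte - m_interferent)
    return m_analyte / dm

# Approximate isotopic masses (u): 156Gd+ vs the 140Ce16O+ oxide interference
m_gd156 = 155.9221
m_ceo = 139.9054 + 15.9949  # 140Ce16O
print(required_resolution(m_gd156, m_ceo))  # ~7150
```

Note that this particular pair demands a resolving power well above even the medium-resolution setting, which is why selecting interference-free mass stations, rather than relying on resolution alone, remains important for oxide-prone REEs.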
Tu, Yiyou; Plotnikov, Elizaveta Y; Seidman, David N
2015-04-01
This study investigates the effects of the charge-state ratio of evaporated ions on the accuracy of local-electrode atom-probe (LEAP) tomographic compositional and structural analyses, which employs a picosecond ultraviolet pulsed laser. Experimental results demonstrate that the charge-state ratio is a better indicator of the best atom-probe tomography (APT) experimental conditions compared with laser pulse energy. The thermal tails in the mass spectra decrease significantly, and the mass resolving power (m/Δm) increases by 87.5 and 185.7% at full-width half-maximum and full-width tenth-maximum, respectively, as the laser pulse energy is increased from 5 to 30 pJ/pulse. The measured composition of this alloy depends on the charge-state ratio of the evaporated ions, and the most accurate composition is obtained when Ni2+/Ni+ is in the range of 0.3-20. The γ(f.c.c.)/γ'(L12) interface is quantitatively more diffuse when determined from the measured concentration profiles for higher laser pulse energies. Conclusions of the APT compositional and structural analyses utilizing the same suitable charge-state ratio are more comparable than those collected with the same laser pulse energy.
Yehia, Ali M; Arafa, Reham M; Abbas, Samah S; Amer, Sawsan M
2016-01-15
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering are ratio-manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL(-1). Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all of its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Experimentally designed 25 synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method. Copyright © 2015 Elsevier B.V. All rights reserved.
Tricarico, Carmela; Pinzani, Pamela; Bianchi, Simonetta; Paglierani, Milena; Distante, Vito; Pazzagli, Mario; Bustin, Stephen A; Orlando, Claudio
2002-10-15
Careful normalization is essential when using quantitative reverse transcription polymerase chain reaction assays to compare mRNA levels between biopsies from different individuals or cells undergoing different treatments. Generally this involves the use of internal controls, such as mRNA specified by a housekeeping gene, ribosomal RNA (rRNA), or accurately quantitated total RNA. The aim of this study was to compare these methods and determine which one provides the most accurate and biologically relevant quantitative results. Our results show significant variation in the expression levels of 10 commonly used housekeeping genes and 18S rRNA, both between individuals and between biopsies taken from the same patient. Furthermore, in 23 breast cancer samples, mRNA and protein levels of a regulated gene, vascular endothelial growth factor (VEGF), correlated only when normalized to total RNA, as did microvessel density. Finally, mRNA levels of VEGF and the most popular housekeeping gene, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), were significantly correlated in the colon. Our results suggest that the use of internal standards comprising single housekeeping genes or rRNA is inappropriate for studies involving tissue biopsies.
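The normalization choices compared here can be made concrete with the standard 2^-ΔΔCt arithmetic. The Ct values below are invented purely to show how variation in a housekeeping reference gene, of the kind this study reports, distorts the estimate relative to normalization against equal total RNA:

```python
def fold_change_ddct(ct_target_s, ct_ref_s, ct_target_c, ct_ref_c):
    """Standard 2^-(ddCt) relative quantification: the target gene is
    normalized to a reference (housekeeping) gene in sample and control."""
    ddct = (ct_target_s - ct_ref_s) - (ct_target_c - ct_ref_c)
    return 2.0 ** -ddct

def fold_change_total_rna(ct_target_s, ct_target_c):
    """Direct Ct comparison after loading equal, accurately quantitated
    amounts of total RNA: no reference gene involved."""
    return 2.0 ** -(ct_target_s - ct_target_c)

# Hypothetical Ct values: the target amplifies 2 cycles earlier in the sample,
# but the "housekeeping" reference itself drifts by 3 cycles.
print(fold_change_total_rna(24.0, 26.0))         # 4.0-fold up
print(fold_change_ddct(24.0, 18.0, 26.0, 21.0))  # 0.5 -- reference drift flips the call
```

With a perfectly stable reference both estimates agree; once the reference gene varies between biopsies, the ΔΔCt result can even invert the apparent direction of regulation, which is the practical hazard the study documents.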
Sexing chick mRNA: A protocol based on quantitative real-time polymerase chain reaction.
Wan, Z; Lu, Y; Rui, L; Yu, X; Li, Z
2017-03-01
The accurate identification of sex in birds is important for research on avian sex determination and differentiation. Polymerase chain reaction (PCR)-based methods have been widely applied for the molecular sexing of birds. However, these methods have used genomic DNA. Here, we present the first sexing protocol for chick mRNA based on real-time quantitative PCR. We demonstrate that this method can accurately determine sex using mRNA from chick gonads and other tissues, such as heart, liver, spleen, lung, and muscle. The strategy of this protocol also may be suitable for other species in which sex is determined by the inheritance of sex chromosomes (ZZ male and ZW female). © 2016 Poultry Science Association Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sercombe, W.J.; Smith, G.W.; Morse, J.D.
1996-01-01
The October field, a sub-salt giant in the extensional Gulf of Suez (Egypt) has been structurally reinterpreted for new reserve opportunities. Quantitative SCAT analyses of the wellbore dip data have been integrated with 3D seismic by using dip isogons to construct local structural sections. SCAT dip analysis was critical to the reinterpretation because SCAT revealed important structural information that previously was unresolvable using conventional tadpole plots. In gross aspect, the October Field is a homocline that trends NW-SE, dips to the NE, and is closed on the SW (updip) by the major Clysmic Normal Fault. SCAT accurately calculated the overall trend of the field, but also identified important structural anomalies near the Clysmic fault and in the northwest and southeast plunge ends. In the northwest plunge end, SCAT has identified new, south dipping blocks that are transitional to the structurally-higher North October field. The southeast plunge end has been reinterpreted with correct azimuthal trends and new fault-block prospects. These new SCAT results have successfully improved the 3D seismic interpretation by providing a foundation of accurate in-situ structural control in an area of poor-to-fair seismic quality below the Miocene salt package.
Box, Stephen E.; Bookstrom, Arthur A.; Ikramuddin, Mohammed; Lindsay, James
2001-01-01
(Fe), manganese (Mn), arsenic (As), and cadmium (Cd). In general inter-laboratory correlations are better for samples within the compositional range of the Standard Reference Materials (SRMs) from the National Institute of Standards and Technology (NIST). Analyses by EWU are the most accurate relative to the NIST standards (mean recoveries within 1% for Pb, Fe, Mn, and As, 3% for Zn and 5% for Cd) and are the most precise (within 7% of the mean at the 95% confidence interval). USGS-EDXRF is similarly accurate for Pb and Zn. XRAL and ACZ are relatively accurate for Pb (within 5-8% of certified NIST values), but were considerably less accurate for the other 5 elements of concern (10-25% of NIST values). However, analyses of sample splits by more than one laboratory reveal that, for some elements, XRAL (Pb, Mn, Cd) and ACZ (Pb, Mn, Zn, Fe) analyses were comparable to EWU analyses of the same samples (when values are within the range of NIST SRMs). These results suggest that, for some elements, XRAL and ACZ dissolutions are more effective on the matrix of the CdA samples than on the matrix of the NIST samples (obtained from soils around Butte, Montana). Splits of CdA samples analyzed by CHEMEX were the least accurate, yielding values 10-25% less than those of EWU.
Challenges in Higher Education Research: The Use of Quantitative Tools in Comparative Analyses
ERIC Educational Resources Information Center
Reale, Emanuela
2014-01-01
Although the value of the comparative perspective for the study of higher education is widely recognised, there is little consensus about specific methodological approaches. Quantitative tools have demonstrated their relevance for comparative analyses, since they are expected to reduce complexity and to identify and grade similarities…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-20
.... Please note that EPA's policy is that public comments, whether submitted electronically or in paper, will... learning to perform quantitative hot-spot analyses; new burden associated with using the MOVES model for..., adjustment for increased burden associated with quantitative hot-spot analyses, an adjustment for the...
ERIC Educational Resources Information Center
Ojala, Maria
2008-01-01
Theories about ambivalence, as well as quantitative and qualitative empirical approaches, are applied to obtain an understanding of recycling among young adults. A questionnaire was mailed to 422 Swedish young people. Regression analyses showed that a mix of negative emotions (worry) and positive emotions (hope and joy) about the environmental…
Spinks, Phillip Q; Thomson, Robert C; Shaffer, H Bradley
2014-05-01
As the field of phylogeography has matured, it has become clear that analyses of one or a few genes may reveal more about the history of those genes than the populations and species that are the targets of study. To alleviate these concerns, the discipline has moved towards larger analyses of more individuals and more genes, although little attention has been paid to the qualitative or quantitative gains that such increases in scale and scope may yield. Here, we increase the number of individuals and markers by an order of magnitude over previously published work to comprehensively assess the phylogeographical history of a well-studied declining species, the western pond turtle (Emys marmorata). We present a new analysis of 89 independent nuclear SNP markers and one mitochondrial gene sequence scored for rangewide sampling of >900 individuals, and compare these to smaller-scale, rangewide genetic and morphological analyses. Our enlarged SNP data fundamentally revise our understanding of evolutionary history for this lineage. Our results indicate that the gains from greatly increasing both the number of markers and individuals are substantial and worth the effort, particularly for species of high conservation concern such as the pond turtle, where accurate assessments of population history are a prerequisite for effective management. © 2014 John Wiley & Sons Ltd.
Design of primers and probes for quantitative real-time PCR methods.
Rodríguez, Alicia; Rodríguez, Mar; Córdoba, Juan J; Andrade, María J
2015-01-01
Design of primers and probes is one of the most crucial factors affecting the success and quality of quantitative real-time PCR (qPCR) analyses, since an accurate and reliable quantification depends on using efficient primers and probes. Design of primers and probes should meet several criteria to find potential primers and probes for specific qPCR assays. The formation of primer-dimers and other non-specific products should be avoided or reduced. This factor is especially important when designing primers for SYBR(®) Green protocols, but also in designing probes to ensure specificity of the developed qPCR protocol. To design primers and probes for qPCR, multiple software programs and websites are available, many of them free. These tools often consider the default requirements for primers and probes, although new research advances in primer and probe design should be progressively added to the different algorithm programs. After a proper design, a precise validation of the primers and probes is necessary. Specific considerations apply when designing primers and probes for multiplex qPCR and reverse transcription qPCR (RT-qPCR). This chapter provides guidelines for the design of suitable primers and probes and their subsequent validation through the development of singlex qPCR, multiplex qPCR, and RT-qPCR protocols.
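A minimal sketch of the kind of screening such design tools automate is shown below. It uses only rough rule-of-thumb formulas (the Wallace rule for short oligos and a common length-corrected Tm formula for longer ones); production tools use nearest-neighbour thermodynamics, and the primer sequence and thresholds here are purely illustrative:

```python
def gc_fraction(seq):
    """Fraction of G and C bases in an oligonucleotide."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)

def rough_tm(seq):
    """Rule-of-thumb melting temperature (Celsius): Wallace rule
    (2*AT + 4*GC) for short oligos, a common length-corrected
    formula otherwise. Not a substitute for nearest-neighbour models."""
    s = seq.upper()
    gc = s.count("G") + s.count("C")
    at = s.count("A") + s.count("T")
    if len(s) < 14:
        return float(2 * at + 4 * gc)
    return 64.9 + 41.0 * (gc - 16.4) / len(s)

def passes_basic_checks(seq, gc_lo=0.40, gc_hi=0.60, tm_lo=58.0, tm_hi=62.0):
    # A typical qPCR screening window; the thresholds are illustrative only
    return gc_lo <= gc_fraction(seq) <= gc_hi and tm_lo <= rough_tm(seq) <= tm_hi

primer = "AGCGTACCTTGACCTGATGC"  # hypothetical 20-mer, not from the chapter
print(gc_fraction(primer), round(rough_tm(primer), 1))
```

Real design software layers many further checks on top of this (primer-dimer and hairpin prediction, 3' stability, amplicon length, template specificity), which is why validation of the designed primers remains essential.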
A spatiotemporal characterization method for the dynamic cytoskeleton.
Alhussein, Ghada; Shanti, Aya; Farhat, Ilyas A H; Timraz, Sara B H; Alwahab, Noaf S A; Pearson, Yanthe E; Martin, Matthew N; Christoforou, Nicolas; Teo, Jeremy C M
2016-05-01
The significant gap between quantitative and qualitative understanding of cytoskeletal function is a pressing problem; microscopy and labeling techniques have improved qualitative investigations of localized cytoskeleton behavior, whereas quantitative analyses of whole cell cytoskeleton networks remain challenging. Here we present a method that accurately quantifies cytoskeleton dynamics. Our approach digitally subdivides cytoskeleton images using interrogation windows, within which box-counting is used to infer a fractal dimension (Df) to characterize spatial arrangement, and gray value intensity (GVI) to determine actin density. A partitioning algorithm further obtains cytoskeleton characteristics from the perinuclear, cytosolic, and periphery cellular regions. We validated our measurement approach on Cytochalasin-treated cells using transgenically modified dermal fibroblast cells expressing fluorescent actin cytoskeletons. This method differentiates between normal and chemically disrupted actin networks, and quantifies rates of cytoskeletal degradation. Furthermore, GVI distributions were found to be inversely proportional to Df, having several biophysical implications for cytoskeleton formation/degradation. We additionally demonstrated detection sensitivity of differences in Df and GVI for cells seeded on substrates with varying degrees of stiffness, and coated with different attachment proteins. This general approach can be further implemented to gain insights on dynamic growth, disruption, and structure of the cytoskeleton (and other complex biological morphology) due to biological, chemical, or physical stimuli. © 2016 Wiley Periodicals, Inc.
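The box-counting step described above can be sketched generically as follows. This is a minimal illustration of box-counting on a binary image, not the authors' implementation; the interrogation-window subdivision, GVI measurement, and region-partitioning algorithm are omitted:

```python
import numpy as np

def box_count_dimension(img, box_sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension Df of a binary image by box
    counting: the slope of log(occupied boxes) vs log(1 / box size)."""
    counts = []
    for s in box_sizes:
        h = (img.shape[0] // s) * s   # trim so the image tiles exactly
        w = (img.shape[1] // s) * s
        tiles = img[:h, :w].reshape(h // s, s, w // s, s)
        # A box is "occupied" if any pixel inside it carries signal
        counts.append(tiles.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes, dtype=float)),
                          np.log(counts), 1)
    return slope

# Sanity check: a filled square is 2-dimensional
img = np.zeros((64, 64), dtype=bool)
img[16:48, 16:48] = True
print(box_count_dimension(img))  # ~2.0
```

Applied per interrogation window on a thresholded actin image, this yields the spatial-arrangement measure that the method pairs with GVI-based density.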
A spatiotemporal characterization method for the dynamic cytoskeleton
Alhussein, Ghada; Shanti, Aya; Farhat, Ilyas A. H.; Timraz, Sara B. H.; Alwahab, Noaf S. A.; Pearson, Yanthe E.; Martin, Matthew N.; Christoforou, Nicolas
2016-01-01
The significant gap between quantitative and qualitative understanding of cytoskeletal function is a pressing problem; microscopy and labeling techniques have improved qualitative investigations of localized cytoskeleton behavior, whereas quantitative analyses of whole cell cytoskeleton networks remain challenging. Here we present a method that accurately quantifies cytoskeleton dynamics. Our approach digitally subdivides cytoskeleton images using interrogation windows, within which box‐counting is used to infer a fractal dimension (Df) to characterize spatial arrangement, and gray value intensity (GVI) to determine actin density. A partitioning algorithm further obtains cytoskeleton characteristics from the perinuclear, cytosolic, and periphery cellular regions. We validated our measurement approach on Cytochalasin‐treated cells using transgenically modified dermal fibroblast cells expressing fluorescent actin cytoskeletons. This method differentiates between normal and chemically disrupted actin networks, and quantifies rates of cytoskeletal degradation. Furthermore, GVI distributions were found to be inversely proportional to Df, having several biophysical implications for cytoskeleton formation/degradation. We additionally demonstrated detection sensitivity of differences in Df and GVI for cells seeded on substrates with varying degrees of stiffness, and coated with different attachment proteins. This general approach can be further implemented to gain insights on dynamic growth, disruption, and structure of the cytoskeleton (and other complex biological morphology) due to biological, chemical, or physical stimuli. © 2016 The Authors. Cytoskeleton Published by Wiley Periodicals, Inc. PMID:27015595
TFBSshape: a motif database for DNA shape features of transcription factor binding sites.
Yang, Lin; Zhou, Tianyin; Dror, Iris; Mathelier, Anthony; Wasserman, Wyeth W; Gordân, Raluca; Rohs, Remo
2014-01-01
Transcription factor binding sites (TFBSs) are most commonly characterized by the nucleotide preferences at each position of the DNA target. Whereas these sequence motifs are quite accurate descriptions of DNA binding specificities of transcription factors (TFs), proteins recognize DNA as a three-dimensional object. DNA structural features refine the description of TF binding specificities and provide mechanistic insights into protein-DNA recognition. Existing motif databases contain extensive nucleotide sequences identified in binding experiments based on their selection by a TF. To utilize DNA shape information when analysing the DNA binding specificities of TFs, we developed a new tool, the TFBSshape database (available at http://rohslab.cmb.usc.edu/TFBSshape/), for calculating DNA structural features from nucleotide sequences provided by motif databases. The TFBSshape database can be used to generate heat maps and quantitative data for DNA structural features (i.e., minor groove width, roll, propeller twist and helix twist) for 739 TF datasets from 23 different species derived from the motif databases JASPAR and UniPROBE. As demonstrated for the basic helix-loop-helix and homeodomain TF families, our TFBSshape database can be used to compare, qualitatively and quantitatively, the DNA binding specificities of closely related TFs and, thus, uncover differential DNA binding specificities that are not apparent from nucleotide sequence alone.
TFBSshape: a motif database for DNA shape features of transcription factor binding sites
Yang, Lin; Zhou, Tianyin; Dror, Iris; Mathelier, Anthony; Wasserman, Wyeth W.; Gordân, Raluca; Rohs, Remo
2014-01-01
Transcription factor binding sites (TFBSs) are most commonly characterized by the nucleotide preferences at each position of the DNA target. Whereas these sequence motifs are quite accurate descriptions of DNA binding specificities of transcription factors (TFs), proteins recognize DNA as a three-dimensional object. DNA structural features refine the description of TF binding specificities and provide mechanistic insights into protein–DNA recognition. Existing motif databases contain extensive nucleotide sequences identified in binding experiments based on their selection by a TF. To utilize DNA shape information when analysing the DNA binding specificities of TFs, we developed a new tool, the TFBSshape database (available at http://rohslab.cmb.usc.edu/TFBSshape/), for calculating DNA structural features from nucleotide sequences provided by motif databases. The TFBSshape database can be used to generate heat maps and quantitative data for DNA structural features (i.e., minor groove width, roll, propeller twist and helix twist) for 739 TF datasets from 23 different species derived from the motif databases JASPAR and UniPROBE. As demonstrated for the basic helix-loop-helix and homeodomain TF families, our TFBSshape database can be used to compare, qualitatively and quantitatively, the DNA binding specificities of closely related TFs and, thus, uncover differential DNA binding specificities that are not apparent from nucleotide sequence alone. PMID:24214955
Ochs, Matthias; Mühlfeld, Christian
2013-07-01
The growing awareness of the importance of accurate morphometry in lung research has recently motivated the publication of guidelines set forth by a combined task force of the American Thoracic Society and the European Respiratory Society (20). This official ATS/ERS Research Policy Statement provides general recommendations on which stereological methods are to be used in quantitative microscopy of the lung. However, to integrate stereology into a particular experimental study design, investigators are left with the problem of how to implement this in practice. Specifically, different animal models of human lung disease require the use of different stereological techniques and may determine the mode of lung fixation, tissue processing, preparation of sections, and other things. Therefore, the present companion articles were designed to allow a short practically oriented introduction into the concepts of design-based stereology (Part 1) and to provide recommendations for choosing the most appropriate methods to investigate a number of important disease models (Part 2). Worked examples with illustrative images will facilitate the practical performance of equivalent analyses. Study algorithms provide comprehensive surveys to ensure that no essential step gets lost during the multistage workflow. Thus, with this review, we hope to close the gap between theory and practice and enhance the use of stereological techniques in pulmonary research.
Czechowski, Tomasz; Stitt, Mark; Altmann, Thomas; Udvardi, Michael K.; Scheible, Wolf-Rüdiger
2005-01-01
Gene transcripts with invariant abundance during development and in the face of environmental stimuli are essential reference points for accurate gene expression analyses, such as RNA gel-blot analysis or quantitative reverse transcription-polymerase chain reaction (PCR). An exceptionally large set of data from Affymetrix ATH1 whole-genome GeneChip studies provided the means to identify a new generation of reference genes with very stable expression levels in the model plant species Arabidopsis (Arabidopsis thaliana). Hundreds of Arabidopsis genes were found that outperform traditional reference genes in terms of expression stability throughout development and under a range of environmental conditions. Most of these were expressed at much lower levels than traditional reference genes, making them very suitable for normalization of gene expression over a wide range of transcript levels. Specific and efficient primers were developed for 22 genes and tested on a diverse set of 20 cDNA samples. Quantitative reverse transcription-PCR confirmed superior expression stability and lower absolute expression levels for many of these genes, including genes encoding a protein phosphatase 2A subunit, a coatomer subunit, and an ubiquitin-conjugating enzyme. The developed PCR primers or hybridization probes for the novel reference genes will enable better normalization and quantification of transcript levels in Arabidopsis in the future. PMID:16166256
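One simple way to rank candidate reference genes by expression stability is the coefficient of variation across samples, sketched here with hypothetical normalized signals (the study itself screened genome-wide Affymetrix data and used more sophisticated stability criteria; gene names and values below are illustrative):

```python
import statistics as st

def cv(values):
    """Coefficient of variation: relative spread of expression
    across samples; lower means more stable."""
    return st.stdev(values) / st.mean(values)

expression = {              # hypothetical normalized signal per sample
    "PP2A-subunit": [102, 98, 101, 99, 100],
    "UBC":          [55, 54, 56, 55, 55],
    "ACTIN2":       [800, 620, 910, 700, 1050],
}
ranked = sorted(expression, key=lambda g: cv(expression[g]))
print(ranked[0])  # most stable gene under this criterion
```

Note how the traditional high-abundance gene ("ACTIN2" here) can be both highly expressed and unstable, which is exactly the situation the new low-expression reference genes address.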
Clinical and physiological assessments for elucidating falls risk in Parkinson's disease.
Latt, Mark D; Lord, Stephen R; Morris, John G L; Fung, Victor S C
2009-07-15
The study aims were to devise (1) a fall risk screen for people with PD using routine clinical measures and (2) an explanatory (physiological) fall risk assessment for guiding fall prevention interventions. One hundred thirteen people with PD (mean age 66 years, 95% CI ± 1.6 years) underwent clinical assessments and quantitative tests of sway, gait, strength, reaction time, and lower limb sensation. Participants were then followed up for 12 months to determine fall incidence. In the follow-up year, 51 participants (45%) fell one or more times whereas 62 participants (55%) did not fall. Multivariate analyses of routine clinical measures revealed that a fall in the past year, abnormal axial posture, cognitive impairment, and freezing of gait were independent risk factors for falls and predicted 38/51 fallers (75%) and 45/62 non-fallers (73%). A multivariate model combining clinical and physiological measures that elucidate the pathophysiology of falls identified abnormal posture, freezing of gait, frontal impairment, poor leaning balance, and leg weakness as independent risk factors. This model correctly classified 39/51 fallers (77%) and 51/62 non-fallers (82%). Patients with PD at risk of falls can be identified accurately with routine clinical assessments and quantitative physiological tests. Many of the risk factors identified are amenable to targeted intervention. 2009 Movement Disorder Society.
Bahmanabadi, L; Akhgari, M; Jokar, F; Sadeghi, H B
2017-02-01
Methamphetamine abuse is one of the most serious medical and social problems facing many countries. In spite of the ban on the use of methamphetamine, it is widely available on Iran's drug black market. There are many analytical methods for the detection of methamphetamine in biological specimens. Oral fluid has become a popular specimen for testing for the presence of methamphetamine. The purpose of the present study was to develop a method for the extraction and detection of methamphetamine in oral fluid samples using liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). An analytical study was designed in which blank and 50 authentic oral fluid samples were collected, first extracted by LLE and subsequently analysed by GC/MS. The method was fully validated and showed excellent intra- and inter-assay precision (relative standard deviation < 10%) for external quality control samples. Recovery with the LLE method was 96%. Limit of detection and limit of quantitation were 5 and 15 ng/mL, respectively. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. The introduced method was sensitive, accurate and precise enough for the extraction of methamphetamine from oral fluid samples in forensic toxicology laboratories.
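Limit-of-detection and limit-of-quantitation figures of this kind are commonly derived from a calibration line via the 3.3·σ/S and 10·σ/S conventions, where σ is the residual standard deviation and S the slope. A sketch with invented standards (not the paper's data):

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    intercept = my - slope * mx
    return slope, intercept

def lod_loq(conc, signal):
    """LOD = 3.3*sigma/S and LOQ = 10*sigma/S from calibration residuals."""
    slope, intercept = linfit(conc, signal)
    resid = [s - (slope * c + intercept) for c, s in zip(conc, signal)]
    sigma = (sum(r * r for r in resid) / (len(resid) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10 * sigma / slope

conc = [15, 50, 100, 200, 400]            # ng/mL, hypothetical standards
signal = [0.31, 1.02, 2.05, 3.98, 8.03]   # detector response, invented
lod, loq = lod_loq(conc, signal)
print(round(lod, 1), round(loq, 1))
```

Other conventions exist (e.g. signal-to-noise of 3 and 10 on blank injections); which one a forensic laboratory reports should be stated alongside the figures.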
Xuan, Tong; Zhang, J Allen; Ahmad, Imran
2006-05-03
A simple HPLC method was developed for quantification of SN-38, 7-ethyl-10-hydroxycamptothecin, in a novel liposome-based formulation (LE-SN38). The chromatographic separation was achieved on an Agilent Zorbax SB-C18 (4.6 mm × 250 mm, 5 µm) analytical column using a mobile phase consisting of a mixture of NaH2PO4 (pH 3.1, 25 mM) and acetonitrile (50:50, v/v). SN-38 was detected at a UV wavelength of 265 nm and quantitatively determined using an external calibration method. The limit of detection (LOD) and limit of quantitation (LOQ) were found to be 0.05 and 0.25 µg/mL, respectively. The individual spike recovery of SN-38 ranged from 100 to 101%. The percent of relative standard deviation (%R.S.D.) of intra-day and inter-day analyses was less than 1.6%. The method validation results confirmed that the method is specific, linear, accurate, precise, robust and sensitive for its intended use. The current method was successfully applied to the determination of SN-38 content and drug entrapment efficiency in the liposome-based formulation LE-SN38 during early stage formulation development.
Zhang, Qi-Lin; Zhu, Qian-Hua; Liao, Xin; Wang, Xiu-Qiang; Chen, Tao; Xu, Han-Ting; Wang, Juan; Yuan, Ming-Long; Chen, Jun-Yuan
2016-01-01
Amphioxus is the closest living proxy to the common ancestor of cephalochordates and vertebrates, and a key animal for novel understanding of the evolutionary origin of the vertebrate body plan, genome, tissues and immune system. Reliable analyses using quantitative real-time PCR (qRT-PCR) for answering these scientific questions are heavily dependent on reliable reference genes (RGs). In this study, we evaluated the stability of thirteen candidate RGs in qRT-PCR for different developmental stages and tissues of amphioxus by four independent algorithms (geNorm, NormFinder, BestKeeper and deltaCt) and one comparative algorithm (RefFinder). The results showed that the top two stable RGs were the following: (1) S20 and 18S in thirteen developmental stages, (2) EF1A and ACT in seven normal tissues, (3) S20 and L13 in both intestine and hepatic caecum challenged with lipopolysaccharide (LPS), and (4) S20 and EF1A in gill challenged with LPS. The expression profiles of two target genes (EYA and HHEX) in thirteen developmental stages were used to confirm the reliability of the chosen RGs. This study identified optimal RGs that can be used to accurately measure gene expression under these conditions, which will benefit evolutionary and functional genomics studies in amphioxus. PMID:27869224
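A minimal sketch of the geNorm stability measure M referenced above: for each candidate, the mean standard deviation of pairwise log2 expression ratios against every other candidate; lower M indicates a more stable reference gene. Expression values below are made up for illustration, not the study's data:

```python
import math
import statistics as st

def genorm_m(expr):
    """geNorm-style M value per gene from a dict of
    gene -> expression values across the same samples."""
    genes = list(expr)
    m = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            # log2 ratio of the two genes in each sample
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
            sds.append(st.stdev(ratios))
        m[g] = sum(sds) / len(sds)
    return m

expr = {                    # hypothetical expression across 4 samples
    "S20":  [10.0, 10.2, 9.9, 10.1],
    "EF1A": [20.1, 20.3, 19.8, 20.2],
    "ACT":  [5.0, 9.0, 4.0, 12.0],
}
m = genorm_m(expr)
worst = max(m, key=m.get)
print(worst)  # least stable candidate under this measure
```

The published geNorm procedure additionally removes the worst gene iteratively and recomputes M; the pairwise-ratio core shown here is the part all implementations share.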
Unnikrishnan, Ginu U.; Morgan, Elise F.
2011-01-01
Inaccuracies in the estimation of material properties and errors in the assignment of these properties into finite element models limit the reliability, accuracy, and precision of quantitative computed tomography (QCT)-based finite element analyses of the vertebra. In this work, a new mesh-independent, material mapping procedure was developed to improve the quality of predictions of vertebral mechanical behavior from QCT-based finite element models. In this procedure, an intermediate step, called the material block model, was introduced to determine the distribution of material properties based on bone mineral density, and these properties were then mapped onto the finite element mesh. A sensitivity study was first conducted on a calibration phantom to understand the influence of the size of the material blocks on the computed bone mineral density. It was observed that varying the material block size produced only marginal changes in the predictions of mineral density. Finite element (FE) analyses were then conducted on a square column-shaped region of the vertebra and also on the entire vertebra in order to study the effect of material block size on the FE-derived outcomes. The predicted values of stiffness for the column and the vertebra decreased with decreasing block size. When these results were compared to those of a mesh convergence analysis, it was found that the influence of element size on vertebral stiffness was less than that of the material block size. This mapping procedure allows the material properties in a finite element study to be determined based on the block size required for an accurate representation of the material field, while the size of the finite elements can be selected independently and based on the required numerical accuracy of the finite element solution. 
The mesh-independent, material mapping procedure developed in this study could be particularly helpful in improving the accuracy of finite element analyses of vertebroplasty and spine metastases, as these analyses typically require mesh refinement at the interfaces between distinct materials. Moreover, the mapping procedure is not specific to the vertebra and could thus be applied to many other anatomic sites. PMID:21823740
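The block-then-map idea can be illustrated with a toy 1D density profile and an assumed density-modulus power law (the constants and block size below are placeholders for illustration, not the paper's calibration):

```python
def block_average(density, block):
    """Average a 1D density profile over non-overlapping material blocks,
    mimicking the intermediate 'material block model' step."""
    return [sum(density[i:i + block]) / len(density[i:i + block])
            for i in range(0, len(density), block)]

def modulus(rho, a=8.0, b=1.5):
    """Hypothetical density-modulus power law E = a * rho^b (GPa);
    real studies calibrate a and b experimentally."""
    return a * rho ** b

rho = [0.2, 0.3, 0.25, 0.35, 0.4, 0.3]   # g/cm^3, toy voxel densities
blocks = block_average(rho, 2)           # block size chosen independently
print([round(modulus(r), 3) for r in blocks])
```

The point of the procedure is visible even in this toy: the block size controls how the density field is represented, while the finite element size (not shown) can then be refined independently for numerical accuracy.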
Enjilela, Esmaeil; Lee, Ting-Yim; Hsieh, Jiang; Wisenberg, Gerald; Teefy, Patrick; Yadegari, Andrew; Bagur, Rodrigo; Islam, Ali; Branch, Kelley; So, Aaron
2018-03-01
We implemented and validated a compressed sensing (CS) based algorithm for reconstructing dynamic contrast-enhanced (DCE) CT images of the heart from sparsely sampled X-ray projections. DCE CT imaging of the heart was performed on five normal and ischemic pigs after contrast injection. DCE images were reconstructed with filtered backprojection (FBP) and CS from all projections (984-view) and 1/3 of all projections (328-view), and with CS from 1/4 of all projections (246-view). Myocardial perfusion (MP) measurements with each protocol were compared to those with the reference 984-view FBP protocol. Both the 984-view CS and 328-view CS protocols were in good agreements with the reference protocol. The Pearson correlation coefficients of 984-view CS and 328-view CS determined from linear regression analyses were 0.98 and 0.99 respectively. The corresponding mean biases of MP measurement determined from Bland-Altman analyses were 2.7 and 1.2ml/min/100g. When only 328 projections were used for image reconstruction, CS was more accurate than FBP for MP measurement with respect to 984-view FBP. However, CS failed to generate MP maps comparable to those with 984-view FBP when only 246 projections were used for image reconstruction. DCE heart images reconstructed from one-third of a full projection set with CS were minimally affected by aliasing artifacts, leading to accurate MP measurements with the effective dose reduced to just 33% of conventional full-view FBP method. The proposed CS sparse-view image reconstruction method could facilitate the implementation of sparse-view dynamic acquisition for ultra-low dose CT MP imaging. Copyright © 2017 Elsevier B.V. All rights reserved.
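The Bland-Altman mean bias reported above is simply the average of the paired differences between the test and reference measurements, usually reported with 95% limits of agreement. A sketch with invented perfusion values (not the study's data):

```python
import statistics as st

def bland_altman(ref, test):
    """Mean bias and 95% limits of agreement (bias ± 1.96 SD)
    between paired measurements."""
    diffs = [t - r for r, t in zip(ref, test)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

ref  = [80, 95, 110, 70, 120]   # MP, ml/min/100g (reference protocol)
test = [82, 97, 109, 73, 122]   # MP from sparse-view reconstruction
bias, limits = bland_altman(ref, test)
print(round(bias, 1))
```

A small bias with narrow limits of agreement, as reported for the 328-view CS protocol, is the quantitative justification for calling the two protocols interchangeable.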
Buhot, M C; Chapuis, N; Scardigli, P; Herrmann, T
1991-07-01
The behaviour of sham-operated rats and rats with damage to the dorsal hippocampus was compared in a complex spatial problem-solving task using a 'hub-spoke-rim' wheel type maze. Compared to the classical Olton 8-arm radial maze and Morris water maze, this apparatus presents the animal with a series of possible alternative routes both direct and indirect to the goal (food). The task included 3 main stages: exploration, feeding and testing, as do the classic problem-solving tasks. During exploration, hippocampal rats were found to be more active than sham rats. Nevertheless, they displayed habituation and a relatively efficient circumnavigation, though, in both cases, different from those of sham rats. During test trials, hippocampal rats were characterized as being less accurate, making more errors than sham rats. Nevertheless, both groups increased their accuracy of first choices over trials. The qualitative analyses of test trial performance indicated that hippocampal rats were less accurate in terms of the initial error's deviation from the goal, and less efficient in terms of corrective behaviour than sham rats, which used either the periphery or the spokes to reach the goal economically. Surprisingly, hippocampal rats were not limited to a taxon type orientation but learned to use the periphery, a tendency which developed over time. Seemingly, for sham rats, the problem-solving process took the form of updating information during transit. For hippocampal rats, the use of the periphery reflected both an ability to discriminate its usefulness in reaching the goal via a taxis type behaviour, and some sparing of the ability to generalize the closeness and the location of the goal. These results, especially the strategic correction patterns, are discussed in the light of Sutherland and Rudy's 'configurational association theory'.
A molecular computational model improves the preoperative diagnosis of thyroid nodules
2012-01-01
Background Thyroid nodules with indeterminate cytological features on fine needle aspiration (FNA) cytology have a 20% risk of thyroid cancer. The aim of the current study was to determine the diagnostic utility of an 8-gene assay to distinguish benign from malignant thyroid neoplasm. Methods The mRNA expression level of 9 genes (KIT, SYNGR2, C21orf4, Hs.296031, DDI2, CDH1, LSM7, TC1, NATH) was analysed by quantitative PCR (q-PCR) in 93 FNA cytological samples. To evaluate the diagnostic utility of all the genes analysed, we assessed the area under the curve (AUC) for each gene individually and in combination. BRAF exon 15 status was determined by pyrosequencing. An 8-gene computational model (Neural Network Bayesian Classifier) was built and a multiple-variable analysis was then performed to assess the correlation between the markers. Results The AUC for each significant marker ranged between 0.625 and 0.900, thus all the significant markers, alone and in combination, can be used to distinguish between malignant and benign FNA samples. The classifier made up of KIT, CDH1, LSM7, C21orf4, DDI2, TC1, Hs.296031 and BRAF had a predictive power of 88.8%. It proved to be useful for risk stratification of the most critical cytological group of the indeterminate lesions for which there is the greatest need of accurate diagnostic markers. Conclusion The genetic classification obtained with this model is highly accurate at differentiating malignant from benign thyroid lesions and might be a useful adjunct in the preoperative management of patients with thyroid nodules. PMID:22958914
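The per-marker AUC can be computed rank-wise as the Mann-Whitney probability that a randomly chosen malignant sample scores higher than a randomly chosen benign one; this is equivalent to the area under the ROC curve. A sketch with hypothetical marker scores (not the study's data):

```python
def auc(benign, malignant):
    """Rank-based AUC: fraction of benign/malignant pairs in which the
    malignant sample outscores the benign one (ties count half)."""
    wins = 0.0
    for m in malignant:
        for b in benign:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(benign) * len(malignant))

benign    = [0.1, 0.4, 0.35, 0.8]   # hypothetical expression scores
malignant = [0.9, 0.5, 0.7, 0.85]
print(auc(benign, malignant))  # 1.0 would mean perfect separation
```

An AUC of 0.5 corresponds to a non-informative marker, which is why the reported range of 0.625-0.900 supports using each marker alone and in combination.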
Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan
2008-09-01
In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system has been demonstrated for qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified with multiplex PCR simultaneously. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminus through an amino modification and then analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and the simulated samples. The evaluation in terms of quantitative analysis of RRS provided by this new method was confirmed by comparing our assay results with those of standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye. The results showed good coherence between the two methods. This approach demonstrated the possibility of accurate qualitative and quantitative detection of GM plants in a single run.
NASA Technical Reports Server (NTRS)
Whalen, Robert T.; Napel, Sandy; Yan, Chye H.
1996-01-01
Progress in development of the methods required to study bone remodeling as a function of time is reported. The following topics are presented: 'A New Methodology for Registration Accuracy Evaluation', 'Registration of Serial Skeletal Images for Accurately Measuring Changes in Bone Density', and 'Precise and Accurate Gold Standard for Multimodality and Serial Registration Method Evaluations.'
Leadership and Culture-Building in Schools: Quantitative and Qualitative Understandings.
ERIC Educational Resources Information Center
Sashkin, Marshall; Sashkin, Molly G.
Understanding effective school leadership as a function of culture building through quantitative and qualitative analyses is the purpose of this paper. The two-part quantitative phase of the research focused on statistical measures of culture and leadership behavior directed toward culture building in the school. The first quantitative part…
Lau, Darryl; Hervey-Jumper, Shawn L; Han, Seunggu J; Berger, Mitchel S
2018-05-01
OBJECTIVE There is ample evidence that extent of resection (EOR) is associated with improved outcomes for glioma surgery. However, it is often difficult to accurately estimate EOR intraoperatively, and surgeon accuracy has yet to be reviewed. In this study, the authors quantitatively assessed the accuracy of intraoperative perception of EOR during awake craniotomy for tumor resection. METHODS A single-surgeon experience of performing awake craniotomies for tumor resection over a 17-year period was examined. Retrospective review of operative reports for quantitative estimation of EOR was recorded. Definitive EOR was based on postoperative MRI. Analysis of accuracy of EOR estimation was examined both as a general outcome (gross-total resection [GTR] or subtotal resection [STR]), and quantitatively (within 5% of EOR on postoperative MRI). Patient demographics, tumor characteristics, and surgeon experience were examined. The effects of accuracy on motor and language outcomes were assessed. RESULTS A total of 451 patients were included in the study. Overall accuracy of intraoperative perception of whether GTR or STR was achieved was 79.6%, and overall accuracy of quantitative perception of resection (within 5% of postoperative MRI) was 81.4%. There was a significant difference (p = 0.049) in accuracy for gross perception over the 17-year period, with improvement over the later years: 1997-2000 (72.6%), 2001-2004 (78.5%), 2005-2008 (80.7%), and 2009-2013 (84.4%). Similarly, there was a significant improvement (p = 0.015) in accuracy of quantitative perception of EOR over the 17-year period: 1997-2000 (72.2%), 2001-2004 (69.8%), 2005-2008 (84.8%), and 2009-2013 (93.4%). This improvement in accuracy is demonstrated by the significantly higher odds of correctly estimating quantitative EOR in the later years of the series on multivariate logistic regression.
Insular tumors were associated with the highest accuracy of gross perception (89.3%; p = 0.034), but lowest accuracy of quantitative perception (61.1% correct; p < 0.001) compared with tumors in other locations. Even after adjusting for surgeon experience, this particular trend for insular tumors remained true. The absence of 1p19q co-deletion was associated with higher quantitative perception accuracy (96.9% vs 81.5%; p = 0.051). Tumor grade, recurrence, diagnosis, and isocitrate dehydrogenase-1 (IDH-1) status were not associated with accurate perception of EOR. Overall, new neurological deficits occurred in 8.4% of cases, and 42.1% of those new neurological deficits persisted after the 3-month follow-up. Correct quantitative perception was associated with lower postoperative motor deficits (2.4%) compared with incorrect perceptions (8.0%; p = 0.029). There were no detectable differences in language outcomes based on perception of EOR. CONCLUSIONS The findings from this study suggest that there is a learning curve associated with the ability to accurately assess intraoperative EOR during glioma surgery, and it may take more than a decade to be truly proficient. Understanding the factors associated with this ability to accurately assess EOR will provide safer surgeries while maximizing tumor resection.
Boggess, Andrew; Crump, Stephen; Gregory, Clint; ...
2017-12-06
Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automation and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties of these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curve. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.
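Quadratic calibration of the kind found optimal here can be sketched by interpolating response standards. For brevity this fits the quadratic exactly through three points via Lagrange interpolation, a simplification of the least-squares regression a validated method would use; the calibration points below are invented:

```python
def lagrange(pts, x):
    """Value at x of the unique quadratic through three (x, y)
    calibration points, via Lagrange basis polynomials."""
    total = 0.0
    for i, (xi, yi) in enumerate(pts):
        li = 1.0
        for j, (xj, _) in enumerate(pts):
            if i != j:
                li *= (x - xj) / (xi - xj)
        total += yi * li
    return total

# hypothetical concentration (µg/mL) -> detector response standards;
# the upward curvature mimics the 'parabolic' behaviour described above
pts = [(1.0, 0.9), (5.0, 5.5), (10.0, 12.0)]
print(round(lagrange(pts, 7.0), 2))
```

With more than three standards, one would instead minimize squared residuals over the quadratic's three coefficients and invert the fitted curve to read concentrations from responses.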
Company-Arumí, Dolors; Figueras, Mercè; Salvadó, Victoria; Molinas, Marisa; Serra, Olga; Anticó, Enriqueta
2016-11-01
Protective plant lipophilic barriers such as suberin and cutin, with their associated waxes, are complex fatty acyl derived polyesters. Their precise chemical composition is valuable to understand the specific role of each compound in the physiological function of the barrier. To develop a method for the compositional analysis of suberin and associated waxes by gas chromatography (GC) coupled to ion trap-mass spectrometry (IT-MS) using N-(tert-butyldimethylsilyl)-N-methyl-trifluoroacetamide (MTBSTFA) as silylating reagent, and apply it to compare the suberin of the root and tuber periderm of potato (Solanum tuberosum). Waxes and suberin monomers from root and periderm were extracted subsequently using organic solvents and by methanolysis, and subjected to MTBSTFA derivatisation. GC analyses of periderm extracts were used to optimise the chromatographic method and the compound identification. Quantitative data were obtained using external calibration curves. The method was fully validated and applied for suberin composition analyses of roots and periderm. Wax and suberin compounds were successfully separated and compound identification was based on the specific (M-57) and non-specific ions in mass spectra. The use of calibration curves built with different external standards provided accurate quantitative data and showed that suberin from root contains shorter chained fatty acyl derivatives and a relative predominance of α,ω-alkanedioic acids compared to that of the periderm. We present a method for the analysis of suberin and their associated waxes based on MTBSTFA derivatisation. Moreover, the characteristic root suberin composition may be the adaptive response to its specific regulation of permeability to water and gases. Copyright © 2016 John Wiley & Sons, Ltd.
Miloudi, Lynda; Bonnier, Franck; Bertrand, Dominique; Byrne, Hugh J; Perse, Xavier; Chourpa, Igor; Munnier, Emilie
2017-07-01
Core-shell nanocarriers are increasingly being adapted in cosmetic and dermatological fields, aiming to provide an increased penetration of the active pharmaceutical or cosmetic ingredients (API and ACI) through the skin. In the final form, the nanocarriers (NC) are usually prepared in hydrogels, conferring desired viscous properties for topical application. Combined with the high chemical complexity of the encapsulating system itself, which involves numerous ingredients to form a stable core, quantifying the NC and/or the encapsulated active without labor-intensive and destructive methods remains challenging. In this respect, the specific molecular fingerprint obtained from vibrational spectroscopy analysis could unambiguously overcome current obstacles in the development of fast and cost-effective quality control tools for NC-based products. The present study demonstrates the feasibility of delivering accurate quantification of the concentrations of curcumin (ACI)-loaded alginate nanocarriers in hydrogel matrices, coupling partial least squares regression (PLSR) to infrared (IR) absorption and Raman spectroscopic analyses. With respective root mean square errors of 0.1469 ± 0.0175% w/w and 0.4462 ± 0.0631% w/w, both approaches offer acceptable precision. Further investigation of the PLSR results allowed us to highlight the different selectivity of each approach, indicating that only IR analysis delivers direct monitoring of the NC through quantification of the Labrafac®, the main NC ingredient. Raman analyses are rather dominated by the contribution of the ACI, which opens numerous perspectives to quantify the active molecules without interference from the complex core-shell encapsulating systems, thus positioning the technique as a powerful analytical tool for industrial screening of cosmetic and pharmaceutical products. Graphical abstract: Quantitative analysis of encapsulated active molecules in hydrogel-based samples by means of infrared and Raman spectroscopy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggess, Andrew; Crump, Stephen; Gregory, Clint
Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradation products in radiologically contaminated soils by gas chromatography/mass spectrometry via sonication extraction, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradation product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil into methylene chloride by sonication. Administrative and engineering controls, as well as instrument automation and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receipt through disposition. Though thermal instability increased uncertainties for these compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curvature. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix-spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical laboratory.
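A quadratic calibration of the kind the abstract reports optimal can be sketched as below. The standards and detector responses are hypothetical, chosen only to show curvature at higher concentrations; the fit is inverted to back-calculate a concentration from a measured response:

```python
import numpy as np

# Hypothetical calibration standards (µg/mL) and detector responses
# with slight downward curvature, as described for the less
# hydrophobic analytes.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
resp = np.array([610, 1190, 2320, 5550, 10400, 19200, 42000])

a, b, c0 = np.polyfit(conc, resp, deg=2)  # quadratic calibration curve

def invert(response):
    """Solve a*x**2 + b*x + c0 = response for the physical root x >= 0."""
    roots = np.roots([a, b, c0 - response])
    real = roots[np.isreal(roots)].real
    return float(real[real >= 0].min())

measured = invert(5550.0)
print(f"back-calculated concentration: {measured:.2f} µg/mL")
```

Taking the smaller non-negative root keeps the answer on the rising branch of the parabola, where the calibration is actually used.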
Nolte, Tom M; Ragas, Ad M J
2017-03-22
Many organic chemicals are ionizable by nature. After use and release into the environment, various fate processes determine their concentrations, and hence exposure to aquatic organisms. In the absence of suitable data, such fate processes can be estimated using Quantitative Structure-Property Relationships (QSPRs). In this review we compiled available QSPRs from the open literature and assessed their applicability towards ionizable organic chemicals. Using quantitative and qualitative criteria we selected the 'best' QSPRs for sorption, (a)biotic degradation, and bioconcentration. The results indicate that many suitable QSPRs exist, but some critical knowledge gaps remain. Specifically, future focus should be directed towards the development of QSPR models for biodegradation in wastewater and sediment systems, direct photolysis and reaction with singlet oxygen, as well as additional reactive intermediates. Adequate QSPRs for bioconcentration in fish exist, but more accurate assessments can be achieved using pharmacologically based toxicokinetic (PBTK) models. No adequate QSPRs exist for bioconcentration in non-fish species. Due to the high variability of chemical and biological species as well as environmental conditions in QSPR datasets, accurate predictions for specific systems and inter-dataset conversions are problematic, for which standardization is needed. For all QSPR endpoints, additional data requirements involve supplementing the current chemical space covered and accurately characterizing the test systems used.
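In its simplest form, a QSPR of the kind reviewed above is a regression from a molecular descriptor to a fate property. A minimal sketch with an assumed (hypothetical) training set, using the classic one-descriptor form log BCF = m·log Kow + b for bioconcentration in fish:

```python
import numpy as np

# Hypothetical descriptor/property pairs: octanol-water partition
# coefficients (log Kow) and measured bioconcentration factors (log BCF).
log_kow = np.array([1.5, 2.1, 2.8, 3.4, 4.0, 4.6, 5.2])
log_bcf = np.array([0.9, 1.3, 1.9, 2.4, 2.8, 3.3, 3.7])

m, b = np.polyfit(log_kow, log_bcf, deg=1)   # fit the linear QSPR
pred = m * np.array([3.0]) + b               # predict for a new chemical
print(f"log BCF = {m:.2f} * log Kow + {b:.2f}; "
      f"predicted at log Kow 3.0: {pred[0]:.2f}")
```

Real QSPRs use many descriptors and validated applicability domains; this only illustrates the model form being discussed.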
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yi; Hu, Dehong; Markillie, Lye Meng
Quantitative gene expression analysis in intact single cells can be achieved using single-molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. The precision of smFISH is therefore often compromised by partial or nonspecific binding of sub-probes and by tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on the blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, which are distinct from the blinking patterns emitted by partially or nonspecifically bound sub-probes and by autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.
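The core discrimination idea — counting switching events in an intensity trace — can be sketched as follows. The traces, intensity threshold, and decision threshold are all assumed for illustration; a spot bound by many sub-probes blinks far more often than a nonspecifically bound single probe:

```python
import numpy as np

# Binarize an intensity trace and count ON/OFF switching events.
def blink_count(trace, on_threshold):
    on = np.asarray(trace) > on_threshold
    return int(np.count_nonzero(on[1:] != on[:-1]))  # number of switches

true_spot = [5, 0, 6, 0, 5, 7, 0, 6, 0, 5, 0, 6]   # frequent blinking
background = [5, 5, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]  # one burst, then dark

min_switches = 6  # hypothetical decision threshold
for name, trace in [("true_spot", true_spot), ("background", background)]:
    n = blink_count(trace, on_threshold=2)
    print(name, n, "accepted" if n >= min_switches else "rejected")
```

The frequency-based criterion accepts the first trace and rejects the second even though both reach the same peak intensity, which is exactly what an intensity-only threshold cannot do.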
The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.
ERIC Educational Resources Information Center
Xu, Xia; Veenstra, Timothy D.
2012-01-01
The list of physiological events in which sex steroids play a role continues to increase. To decipher the roles that sex steroids play in any condition requires high quality cohorts of samples and assays that provide highly accurate quantitative measures. Liquid and gas chromatography coupled with mass spectrometry (LC-MS and GC-MS) have…
Abdulhay, Enas; Khnouf, Ruba; Haddad, Shireen; Al-Bashir, Areen
2017-08-04
Improvement of the medical content in Biomedical Engineering (BME) curricula based on a qualitative assessment process, or on comparison with another high-standard program, has been approached by a number of studies. However, quantitative assessment tools have not been emphasized. Quantitative assessment tools can be more accurate and robust in challenging multidisciplinary fields like Biomedical Engineering, which mixes biomedical elements with technological aspects. The major limitations of previous research are a high dependence on surveys or purely qualitative approaches, as well as the absence of a strong focus on medical outcomes without implicit confusion with technical ones. The proposed work presents the development and evaluation of an accurate, robust quantitative approach to improving the medical content of the challenging multidisciplinary BME curriculum. The work presents quantitative assessment tools and the subsequent improvement of curriculum medical content, applied as an illustrative example to the ABET (Accreditation Board for Engineering and Technology, USA) accredited BME department at Jordan University of Science and Technology. The quantitative results of assessment of the curriculum/courses, capstone, exit exam, and course assessment by student (CAS), as well as of surveys filled in by alumni, seniors, employers, and training supervisors, were first mapped to the expected student outcomes related to the medical field (SOsM). The collected data were then analyzed and discussed to find curriculum weak points by tracking shortcomings in the degree of achievement of every outcome. Finally, actions were taken to fill the gaps in the curriculum. Actions were also mapped to the students' medical outcomes (SOsM). Weighted averages of the obtained quantitative values, mapped to SOsM, accurately indicated the achievement levels of all outcomes as well as the improvements necessary in the curriculum.
Mapping the improvements to SOsM also helps in the assessment of the following cycle. The suggested assessment tools can be generalized and extended to any other BME department. Robust improvement of medical content in BME curriculum can subsequently be achieved.
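The weighted-average aggregation step can be sketched as below. All tool names, scores, weights, and the target level are assumed for illustration; the idea is simply that each assessment tool contributes an achievement score for an outcome (SOsM), combined by assumed reliability weights:

```python
# Achievement scores (0-100) for one medical outcome, per assessment tool.
scores = {"course_assessment": 78, "capstone": 85, "exit_exam": 70,
          "alumni_survey": 88, "employer_survey": 82}
# Hypothetical weights reflecting the assumed reliability of each tool.
weights = {"course_assessment": 0.30, "capstone": 0.25, "exit_exam": 0.25,
           "alumni_survey": 0.10, "employer_survey": 0.10}

achievement = sum(scores[k] * weights[k] for k in scores)
target = 75.0  # hypothetical minimum achievement level for the outcome
print(f"SOsM achievement: {achievement:.1f} "
      f"({'met' if achievement >= target else 'gap'})")
```

Outcomes whose weighted achievement falls below the target are the "weak points" that drive the curriculum actions described above.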
Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F
2016-08-03
Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. 
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
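Digital PCR's calibration-free quantification rests on Poisson statistics over partitions, which is why it suits reference-material value assignment. A minimal sketch (partition counts and partition volume below are assumed, not the study's data):

```python
import math

# Convert dPCR partition counts to a copy concentration via Poisson
# statistics: lambda = -ln(fraction of negative partitions).
def dpcr_copies_per_ul(negative, total, partition_volume_nl=0.85):
    lam = -math.log(negative / total)          # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)  # copies per µL of reaction

conc = dpcr_copies_per_ul(negative=12000, total=20000)
print(f"{conc:.0f} copies/µL")
```

Because the estimate depends only on counting negative partitions, no external calibrator is needed — the property that makes dPCR results comparable across laboratories and instruments.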
ERIC Educational Resources Information Center
Plonsky, Luke
2013-01-01
This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning and Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…
Industrial ecology: Quantitative methods for exploring a lower carbon future
NASA Astrophysics Data System (ADS)
Thomas, Valerie M.
2015-03-01
Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
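The engineering-economics metrics mentioned above reduce to simple discounted sums. A sketch of net present value and the levelized cost of energy for a hypothetical generator (all inputs assumed):

```python
# Net present value: discount yearly cashflows (year 0 first).
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

capex = 1_000_000.0          # $ capital cost at year 0 (assumed)
annual_cost = 20_000.0       # $/year operations and maintenance (assumed)
annual_energy = 2_500_000.0  # kWh/year generation (assumed)
rate, life = 0.05, 20        # discount rate and lifetime in years

# LCOE = discounted lifetime cost / discounted lifetime energy.
cost_pv = npv(rate, [capex] + [annual_cost] * life)
energy_pv = npv(rate, [0.0] + [annual_energy] * life)
lcoe = cost_pv / energy_pv   # $/kWh
print(f"LCOE = {lcoe:.4f} $/kWh")
```

Discounting the energy term as well as the cost term is the standard convention; it makes the LCOE the constant price per kWh at which the project exactly breaks even at the chosen discount rate.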
2001-10-25
analyses of electroencephalogram at half-closed eye and fully closed eye. This study aimed at quantitatively estimating the rest rhythm of horses by the...analyses of eyeball movement. A mask fitted with a miniature CCD camera was newly developed. The continuous images of the horse eye for about 24...eyeball area were calculated. As for the results, the fluctuating status of eyeball area was analyzed quantitatively, and the rest rhythm of horses was
Monitoring cell morphology during necrosis and apoptosis by quantitative phase imaging
NASA Astrophysics Data System (ADS)
Mugnano, Martina; Calabuig, Alejandro; Grilli, Simonetta; Miccio, Lisa; Ferraro, Pietro
2015-05-01
Cellular morphology changes and volume alterations play significant roles in many biological processes and mirror cell functions. In this paper, we propose the digital holographic (DH) microscope as a non-invasive imaging technique for rapid and accurate extraction of morphological information related to cell death. In particular, we investigate the morphological variations that occur during necrosis and apoptosis. The study of necrosis is extremely important because it is often associated with unwarranted loss of cells in human pathologies such as ischemia, trauma, and some forms of neurodegeneration; a better elucidation in terms of cell morphological changes could therefore pave the way for new treatments. Apoptosis is also extremely important because it is involved in cancer, both in its formation and in medical treatments: because the inability to initiate apoptosis enhances tumour formation, current cancer treatments target this pathway. Within this framework, we have developed a transmission off-axis DH apparatus integrated with a micro-incubator for the investigation of living cells in a temperature- and CO2-controlled environment. We employ DH to analyse necrotic cell death induced by laser light (wavelength 473 nm, light power 4 mW). We chose NIH 3T3 mouse embryonic fibroblasts as the cellular model because their adhesive features, such as morphological changes and the time needed to adhere and spread, have been well characterized in the literature. We monitored cell volume changes and morphological alterations in real time in order to study the necrosis process accurately and quantitatively. Cell volume changes were evaluated from the measured phase changes of light transmitted through the cells. Our digital holographic experiments showed that after exposure of cells to laser light for 90-120 min, they swell and take on a balloon-like shape until the plasma membrane ruptures and the cell volume finally decreases.
Furthermore, we present a preliminary study on the variation of morphological parameters in case of cell apoptosis induced by exposure to 10 μM cadmium chloride. We employ the same cell line, monitoring the process for 18 hours. In the vast group of environmental pollutants, the toxic heavy metal cadmium is considered a likely candidate as a causative agent of several types of cancers. Widely distributed and used in industry, and with a broad range of target organs and a long half-life (10-30 years) in the human body, this element has been long known for its multiple adverse effects on human health, through occupational or environmental exposure. In apoptosis, we measure cell volume decrease and cell shrinking. Both data of apoptosis and necrosis were analysed by means of a Sigmoidal Statistical Distribution function, which allows several quantitative data to be established, such as swelling and cell death time, flux of intracellular material from inside to outside the cell, initial and final volume versus time. In addition, we can quantitatively study the cytoplasmatic granularity that occurs during necrosis. As a future application, DH could be employed as a non-invasive and label-free method to distinguish between apoptosis and necrosis in terms of morphological parameters.
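The sigmoidal-fit analysis described above can be sketched as a least-squares fit of a logistic function to a volume-versus-time trace. The data here are synthetic and the parameter scales assumed, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic model for the swelling phase:
# V(t) = v0 + dv / (1 + exp(-(t - t50) / tau))
def sigmoid(t, v0, dv, t50, tau):
    return v0 + dv / (1.0 + np.exp(-(t - t50) / tau))

t = np.linspace(0, 120, 25)                  # minutes
true = sigmoid(t, 1800.0, 900.0, 70.0, 8.0)  # fL, assumed scale
rng = np.random.default_rng(1)
v = true + rng.normal(0, 15.0, t.size)       # noisy "measurements"

popt, _ = curve_fit(sigmoid, t, v, p0=[1500, 500, 60, 10])
v0, dv, t50, tau = popt
print(f"half-swelling time t50 = {t50:.1f} min, "
      f"volume increase dv = {dv:.0f} fL")
```

The fitted parameters give exactly the kind of quantities the abstract extracts from the sigmoidal distribution: swelling time, initial and final volume, and the rate of volume change.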
Coltharp, Carla; Kessler, Rene P.; Xiao, Jie
2012-01-01
Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. 
This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and allows a variety of quantitative measurements tailored to specific needs of different biological systems. PMID:23251611
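The grouping step can be sketched as a simple spatiotemporal clustering pass. The threshold names follow the text (dThresh, tThresh), but the data and threshold values are assumed: localizations within dThresh nanometres and tThresh frames of an existing cluster are merged, so repeated blinks of one fluorophore count as a single molecule:

```python
# Group localizations so photoblinking does not inflate molecule counts.
def cluster_localizations(locs, d_thresh=50.0, t_thresh=5):
    """locs: list of (x_nm, y_nm, frame), sorted by frame."""
    clusters = []  # each cluster: [x_last, y_last, frame_last]
    count = 0
    for x, y, f in locs:
        for c in clusters:
            close = (x - c[0]) ** 2 + (y - c[1]) ** 2 <= d_thresh ** 2
            recent = f - c[2] <= t_thresh
            if close and recent:
                c[0], c[1], c[2] = x, y, f   # another blink of this molecule
                break
        else:
            clusters.append([x, y, f])       # a new molecule
            count += 1
    return count

# Two molecules: the first blinks three times, the second once.
locs = [(100, 100, 1), (103, 98, 3), (400, 400, 4), (101, 102, 6)]
print(cluster_localizations(locs))
```

With thresholds chosen by the selection method the authors describe, four raw localizations collapse to two molecules rather than being reported as four.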
Daniel, Hubert Darius J; Fletcher, John G; Chandy, George M; Abraham, Priya
2009-01-01
Sensitive nucleic acid testing for the detection and accurate quantitation of hepatitis B virus (HBV) is necessary to reduce transmission through blood and blood products and for monitoring patients on antiviral therapy. The aim of this study is to standardize an "in-house" real-time HBV polymerase chain reaction (PCR) for accurate quantitation and screening of HBV. The "in-house" real-time assay was compared with a commercial assay using 30 chronically infected individuals and 70 blood donors who are negative for hepatitis B surface antigen, hepatitis C virus (HCV) antibody and human immunodeficiency virus (HIV) antibody. Further, 30 HBV-genotyped samples were tested to evaluate the "in-house" assay's capacity to detect genotypes prevalent among individuals attending this tertiary care hospital. The lower limit of detection of this "in-house" HBV real-time PCR was assessed against the WHO international standard and found to be 50 IU/mL. The interassay and intra-assay coefficient of variation (CV) of this "in-house" assay ranged from 1.4% to 9.4% and 0.0% to 2.3%, respectively. Virus loads as estimated with this "in-house" HBV real-time assay correlated well with the commercial artus HBV RG PCR assay ( r = 0.95, P < 0.0001). This assay can be used for the detection and accurate quantitation of HBV viral loads in plasma samples. This assay can be employed for the screening of blood donations and can potentially be adapted to a multiplex format for simultaneous detection of HBV, HIV and HCV to reduce the cost of testing in blood banks.
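The assay-precision figures quoted above (intra- and inter-assay CV) come from standard coefficient-of-variation arithmetic, sketched here on assumed replicate values rather than the study's data:

```python
import statistics

# CV% = 100 * sample standard deviation / mean.
def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Replicate viral-load measurements (log10 IU/mL) of one sample.
run1 = [4.95, 5.02, 5.00]  # three replicates within one run
run2 = [5.10, 5.05, 5.08]  # three replicates in a second run

intra = cv_percent(run1)                                     # within-run
inter = cv_percent([statistics.mean(run1),
                    statistics.mean(run2)])                  # between-run
print(f"intra-assay CV {intra:.1f}%, inter-assay CV {inter:.1f}%")
```

Intra-assay CV reflects repeatability within a single run; inter-assay CV, computed over run means, captures day-to-day reproducibility.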
De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio
2018-04-13
Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons, to select the most effective approaches for brain resections. Although standard microdissection is a validated technique to investigate the different white matter (WM) pathways and to verify the results of tractography, the possibility of interactive exploration of the specimens and reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing an accurate metrology on highly defined three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique for supporting the 3D exploration and the quantitative analysis on the cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle were exposed using the Klingler technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen. All the acquisitions were coregistered into an open-source model. We analyzed 5 steps, including the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicle, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point clouds models was highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during the postdissection analysis. These results open many new promising neuroscientific and educational perspectives and also optimize the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.
Pulmonary MRA: differentiation of pulmonary embolism from truncation artefact.
Bannas, Peter; Schiebler, Mark L; Motosugi, Utaroh; François, Christopher J; Reeder, Scott B; Nagle, Scott K
2014-08-01
Truncation artefact (Gibbs ringing) causes central signal drop within vessels in pulmonary magnetic resonance angiography (MRA) that can be mistaken for emboli, reducing diagnostic accuracy for pulmonary embolism (PE). We propose a quantitative approach to differentiate truncation artefact from PE. Twenty-eight patients who underwent pulmonary computed tomography angiography (CTA) for suspected PE were recruited for pulmonary MRA. Signal intensity drops within pulmonary arteries that persisted on both arterial-phase and delayed-phase MRA were identified. The percent signal loss between the vessel lumen and central drop was measured. CTA served as the reference standard for presence of pulmonary emboli. A total of 65 signal intensity drops were identified on MRA. Of these, 48 (74%) were artefacts and 17 (26%) were PE, as confirmed by CTA. Truncation artefacts had a significantly lower median signal drop than PE on both arterial-phase (26% [range 12-58%] vs. 85% [range 53-91%]) and delayed-phase MRA (26% [range 11-55%] vs. 77% [range 47-89%]), p < 0.0001 for both. Receiver operating characteristic (ROC) analyses revealed a threshold value of 51% (arterial phase) and 47% signal drop (delayed phase) to differentiate between truncation artefact and PE with 100% sensitivity and greater than 90% specificity. Quantitative signal drop is an objective tool to help differentiate truncation artefact and pulmonary embolism in pulmonary MRA. • Inexperienced readers may mistake truncation artefacts for emboli on pulmonary MRA • Pulmonary emboli have non-uniform signal drop • 51% (arterial phase) and 47% (delayed phase) cut-off differentiates truncation artefact from PE • Quantitative signal drop measurement enables more accurate pulmonary embolism diagnosis with MRA.
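The quantitative criterion above is a single percent-drop measurement compared against the reported arterial-phase cut-off of 51%. A sketch with assumed intensity values:

```python
# Percent signal drop between mean lumen intensity and the central
# low-signal region of the vessel.
def percent_drop(lumen, center):
    return 100.0 * (lumen - center) / lumen

def classify(lumen, center, cutoff=51.0):
    """Arterial-phase rule: drops at or above the cutoff suggest PE."""
    return "PE" if percent_drop(lumen, center) >= cutoff else "truncation artefact"

print(classify(800.0, 120.0))  # deep central drop, consistent with embolus
print(classify(800.0, 590.0))  # shallow drop, consistent with Gibbs ringing
```

For delayed-phase images the same rule applies with the 47% cut-off reported in the abstract.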
The use of semi-structured interviews for the characterisation of farmer irrigation practices
NASA Astrophysics Data System (ADS)
O'Keeffe, Jimmy; Buytaert, Wouter; Mijic, Ana; Brozović, Nicholas; Sinha, Rajiv
2016-05-01
For the development of sustainable and realistic water security, generating information on the behaviours, characteristics, and drivers of users, as well as on the resource itself, is essential. In this paper we present a methodology for collecting qualitative and quantitative data on water use practices through semi-structured interviews. This approach facilitates the collection of detailed information on actors' decisions in a convenient and cost-effective manner. Semi-structured interviews are organised around a topic guide, which helps lead the conversation in a standardised way while allowing sufficient opportunity for relevant issues to emerge. In addition, they can be used to obtain certain types of quantitative data. While not as accurate as direct measurements, they can provide useful information on local practices and users' insights. We present an application of the methodology on farmer water use in two districts in the state of Uttar Pradesh in northern India. By means of 100 farmer interviews, information was collected on various aspects of irrigation practices, including irrigation water volumes, irrigation cost, water source, and their spatial variability. Statistical analyses of the information, along with data visualisation, are also presented, indicating a significant variation in irrigation practices both within and between districts. Our application shows that semi-structured interviews are an effective and efficient method of collecting both qualitative and quantitative information for the assessment of drivers, behaviours, and their outcomes in a data-scarce region. The collection of this type of data could significantly improve insights on water resources, leading to more realistic management options and increased water security in the future.
Hunter, Margaret; Dorazio, Robert M.; Butterfield, John S.; Meigs-Friend, Gaia; Nico, Leo; Ferrante, Jason A.
2017-01-01
A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species’ presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty – indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications also could benefit from a standardized LOD such as GMO food analysis, and forensic and clinical diagnostics.
Quantitative Hydrocarbon Energies from the PMO Method.
ERIC Educational Resources Information Center
Cooper, Charles F.
1979-01-01
Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)
Phommasone, Koukeo; Althaus, Thomas; Souvanthong, Phonesavanh; Phakhounthong, Khansoudaphone; Soyvienvong, Laxoy; Malapheth, Phatthaphone; Mayxay, Mayfong; Pavlicek, Rebecca L; Paris, Daniel H; Dance, David; Newton, Paul; Lubell, Yoel
2016-02-04
C-Reactive Protein (CRP) has been shown to be an accurate biomarker for discriminating bacterial from viral infections in febrile patients in Southeast Asia. Here we investigate the accuracy of existing rapid qualitative and semi-quantitative tests as compared with a quantitative reference test to assess their potential for use in remote tropical settings. Blood samples were obtained from consecutive patients recruited to a prospective fever study at three sites in rural Laos. At each site, one of three rapid qualitative or semi-quantitative tests was performed, as well as a corresponding quantitative NycoCard Reader II as a reference test. We estimate the sensitivity and specificity of the three tests against a threshold of 10 mg/L and kappa values for the agreement of the two semi-quantitative tests with the results of the reference test. All three tests showed high sensitivity, specificity and kappa values as compared with the NycoCard Reader II. With a threshold of 10 mg/L the sensitivity of the tests ranged from 87-98 % and the specificity from 91-98 %. The weighted kappa values for the semi-quantitative tests were 0.7 and 0.8. The use of CRP rapid tests could offer an inexpensive and effective approach to improve the targeting of antibiotics in remote settings where health facilities are basic and laboratories are absent. This study demonstrates that accurate CRP rapid tests are commercially available; evaluations of their clinical impact and cost-effectiveness at point of care is warranted.
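The accuracy statistics reported above reduce to counts in a 2×2 table against the 10 mg/L reference threshold. A sketch using assumed counts (not the study's data), including unweighted Cohen's kappa:

```python
# Sensitivity and specificity from a 2x2 confusion table.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

# Unweighted Cohen's kappa: observed vs. chance-expected agreement.
def cohens_kappa(tp, fn, tn, fp):
    n = tp + fn + tn + fp
    po = (tp + tn) / n
    pe = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n ** 2
    return (po - pe) / (1 - pe)

sens, spec = sens_spec(tp=94, fn=6, tn=92, fp=8)
kappa = cohens_kappa(tp=94, fn=6, tn=92, fp=8)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, kappa {kappa:.2f}")
```

The study's semi-quantitative tests used a weighted kappa, which additionally down-weights near-miss category disagreements; the unweighted form shown here is the simplest member of that family.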
Wei, Cong; Grace, James E; Zvyaga, Tatyana A; Drexler, Dieter M
2012-08-01
The polar nucleoside drug ribavirin (RBV) combined with IFN-α is a front-line treatment for chronic hepatitis C virus infection. RBV acts as a prodrug and exerts its broad antiviral activity primarily through its active phosphorylated metabolite ribavirin 5′-triphosphate (RTP), and possibly also through ribavirin 5′-monophosphate (RMP). To study RBV transport, diffusion, metabolic clearance, and its impact on drug-metabolizing enzymes, an LC-MS method is needed to simultaneously quantify RBV and its phosphorylated metabolites (RTP, ribavirin 5′-diphosphate, and RMP). In a recombinant human UGT1A1 assay, the assay buffer components uridine and its phosphorylated derivatives are isobaric with RBV and its phosphorylated metabolites, leading to significant interference when analyzed by LC-MS in the nominal mass resolution mode. Presented here is an LC-MS method employing LC coupled with full-scan high-resolution accurate-mass analysis for the simultaneous quantitative determination of RBV, RMP, ribavirin 5′-diphosphate, and RTP, differentiating RBV and its phosphorylated metabolites from uridine and its phosphorylated derivatives by accurate mass and thus avoiding interference. The developed method allows quantitation of RBV and its phosphorylated metabolites while eliminating the interferences from uridine and its phosphorylated derivatives in recombinant human UGT1A1 assays.
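Why accurate mass resolves the interference can be seen from the elemental formulas: ribavirin (C8H12N4O5) and uridine (C9H12N2O6) share nominal mass 244 but differ in exact monoisotopic mass. A sketch using standard monoisotopic atomic masses:

```python
# Monoisotopic masses of the most abundant isotopes (Da).
MASS = {"C": 12.0, "H": 1.00782503, "N": 14.00307401, "O": 15.99491462}

def monoisotopic(formula):
    """formula: dict mapping element symbol to atom count."""
    return sum(MASS[el] * n for el, n in formula.items())

ribavirin = monoisotopic({"C": 8, "H": 12, "N": 4, "O": 5})
uridine = monoisotopic({"C": 9, "H": 12, "N": 2, "O": 6})
delta_mda = (ribavirin - uridine) * 1000.0
print(f"ribavirin {ribavirin:.4f} Da, uridine {uridine:.4f} Da, "
      f"difference {delta_mda:.1f} mDa")
```

A difference of roughly 11 mDa at m/z 244 is invisible to nominal-mass instruments but easily resolved by the full-scan high-resolution analysis the method employs.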
Hu, Valerie W.; Addington, Anjene; Hyman, Alexander
2011-01-01
The heterogeneity of symptoms associated with autism spectrum disorders (ASDs) has presented a significant challenge to genetic analyses. Even when associations with genetic variants have been identified, it has been difficult to associate them with a specific trait or characteristic of autism. Here, we report that quantitative trait analyses of ASD symptoms combined with case-control association analyses using distinct ASD subphenotypes identified on the basis of symptomatic profiles result in the identification of highly significant associations with 18 novel single nucleotide polymorphisms (SNPs). The symptom categories included deficits in language usage, non-verbal communication, social development, and play skills, as well as insistence on sameness or ritualistic behaviors. Ten of the trait-associated SNPs, or quantitative trait loci (QTL), were associated with more than one subtype, providing partial replication of the identified QTL. Notably, none of the novel SNPs is located within an exonic region, suggesting that these hereditary components of ASDs are more likely related to gene regulatory processes (or gene expression) than to structural or functional changes in gene products. Seven of the QTL reside within intergenic chromosomal regions associated with rare copy number variants that have been previously reported in autistic samples. Pathway analyses of the genes associated with the QTL identified in this study implicate neurological functions and disorders associated with autism pathophysiology. This study underscores the advantage of incorporating both quantitative traits as well as subphenotypes into large-scale genome-wide analyses of complex disorders. PMID:21556359
NASA Astrophysics Data System (ADS)
Hutchison, Keith D.; Etherton, Brian J.; Topping, Phillip C.
1996-12-01
Quantitative assessments of the performance of automated cloud analysis algorithms require the creation of highly accurate, manual cloud, no cloud (CNC) images from multispectral meteorological satellite data. In general, the methodology to create ground truth analyses for the evaluation of cloud detection algorithms is relatively straightforward. However, when focus shifts toward quantifying the performance of automated cloud classification algorithms, the task of creating ground truth images becomes much more complicated, since these CNC analyses must differentiate between water and ice cloud tops while ensuring that inaccuracies in automated cloud detection are not propagated into the results of the cloud classification algorithm. The process of creating these ground truth CNC analyses may become particularly difficult when little or no spectral signature is evident between a cloud and its background, as appears to be the case when thin cirrus is present over snow-covered surfaces. In this paper, procedures are described that enhance the researcher's ability to manually interpret and differentiate between thin cirrus clouds and snow-covered surfaces in daytime AVHRR imagery. The methodology uses data in up to six AVHRR spectral bands, including an additional band derived from the daytime 3.7 micron channel, which has proven invaluable for the manual discrimination between thin cirrus clouds and snow. It is concluded that the 1.6 micron channel remains essential for differentiating between thin ice clouds and snow; however, this capability may be lost if the 3.7 micron data switch to a nighttime-only transmission with the launch of future NOAA satellites.
ERIC Educational Resources Information Center
Brückner, Sebastian; Pellegrino, James W.
2016-01-01
The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…
NASA Technical Reports Server (NTRS)
1986-01-01
Digital imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost-efficient.
Yankson, Kweku K.; Steck, Todd R.
2009-01-01
We present a simple strategy for isolating and accurately enumerating target DNA from high-clay-content soils: desorption with buffers, an optional magnetic capture hybridization step, and quantitation via real-time PCR. With the developed technique, μg quantities of DNA were extracted from mg samples of pure kaolinite and a field clay soil. PMID:19633108
Hindle, Ralph; Noestheden, Matthew; Peru, Kerry; Headley, John
2013-04-19
This study details the development of a routine method for quantitative analysis of oil sands naphthenic acids, which are a complex class of compounds found naturally and as contaminants in oil sands process waters from Alberta's Athabasca region. Expanding beyond classical naphthenic acids (CnH2n-zO2), those compounds conforming to the formula CnH2n-zOx (where 2 ≤ x ≤ 4) were examined in commercial naphthenic acid and environmental water samples. HPLC facilitated a five-fold reduction in ion suppression when compared to the more commonly used flow injection analysis. A comparison of 39 model naphthenic acids revealed significant variability in response factors, demonstrating the necessity of using naphthenic acid mixtures for quantitation, rather than model compounds. It was also demonstrated that naphthenic acid heterogeneity (commercial and environmental) necessitates establishing a single NA mix as the standard against which all quantitation is performed. The authors present the first ISO 17025 accredited method for the analysis of naphthenic acids in water using HPLC with high-resolution accurate-mass time-of-flight mass spectrometry. The method detection limit was 1 mg/L total oxy-naphthenic acids (Sigma technical mix). Copyright © 2013 Elsevier B.V. All rights reserved.
Chen, Ming; Wu, Si; Lu, Haidong D.; Roe, Anna W.
2013-01-01
Interpreting population responses in the primary visual cortex (V1) remains a challenge, especially with the advent of techniques measuring activations of large cortical areas simultaneously with high precision. For successful interpretation, a quantitatively precise model prediction is of great importance. In this study, we investigate how accurately a spatiotemporal filter (STF) model predicts average response profiles to coherently drifting random dot motion obtained by optical imaging of intrinsic signals in V1 of anesthetized macaques. We establish that orientation difference maps, obtained by subtracting responses to orthogonal axes of motion, invert with increasing drift speeds, consistent with the motion streak effect. Consistent with perception, the speed at which the map inverts (the critical speed) depends on cortical eccentricity and systematically increases from foveal to parafoveal. We report that critical speeds and response maps to drifting motion are reproduced excellently by the STF model. Our study thus suggests that the STF model is quantitatively accurate enough to be used as a first model of choice for interpreting responses obtained with intrinsic imaging methods in V1. We show further that this good quantitative correspondence opens the possibility of inferring otherwise not easily accessible population receptive field properties from responses to complex stimuli, such as drifting random dot motions. PMID:23197457
Quantitative Live-Cell Confocal Imaging of 3D Spheroids in a High-Throughput Format.
Leary, Elizabeth; Rhee, Claire; Wilks, Benjamin T; Morgan, Jeffrey R
2018-06-01
Accurately predicting the human response to new compounds is critical to a wide variety of industries. Standard screening pipelines (including both in vitro and in vivo models) often lack predictive power. Three-dimensional (3D) culture systems of human cells, a more physiologically relevant platform, could provide a high-throughput, automated means to test the efficacy and/or toxicity of novel substances. However, the challenge of obtaining high-magnification, confocal z stacks of 3D spheroids and understanding their respective quantitative limitations must be overcome first. To address this challenge, we developed a method to form spheroids of reproducible size at precise spatial locations across a 96-well plate. Spheroids of variable radii were labeled with four different fluorescent dyes and imaged with a high-throughput confocal microscope. 3D renderings of the spheroid had a complex bowl-like appearance. We systematically analyzed these confocal z stacks to determine the depth of imaging and the effect of spheroid size and dyes on quantitation. Furthermore, we have shown that the depth-dependent loss of fluorescence can be addressed through the use of ratio imaging. Overall, understanding both the limitations of confocal imaging and the tools to correct for these limits is critical for developing accurate quantitative assays using 3D spheroids.
Quantitative characterization of surface topography using spectral analysis
NASA Astrophysics Data System (ADS)
Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars
2017-03-01
Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
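As a minimal illustration of the one-dimensional case discussed above, the PSD of a topography line scan can be estimated with a periodogram. Normalization conventions vary across the literature; the convention below (one-sided, as a function of angular wavevector q) is only one common choice, and the random line scan is a stand-in for measured data:

```python
import numpy as np

def psd_1d(h, dx):
    """One-sided periodogram estimate of the 1D PSD of a line scan h(x),
    indexed by angular wavevector q. Units: [h]^2 * [length]."""
    n = len(h)
    H = np.fft.rfft(h - h.mean())              # remove mean (zero-wavevector term)
    q = 2 * np.pi * np.fft.rfftfreq(n, d=dx)   # angular wavevectors
    C = (dx / n) * np.abs(H) ** 2              # periodogram
    C[1:-1] *= 2                               # fold in negative wavevectors
    return q, C

# sanity check via Parseval: integrating the PSD recovers the mean-square roughness
rng = np.random.default_rng(0)
h = rng.standard_normal(4096)                  # stand-in for a measured line scan
q, C = psd_1d(h, dx=1.0)
hrms_sq = np.sum(C) * (q[1] - q[0]) / (2 * np.pi)   # ≈ np.var(h)
```

The Parseval check is the practical way to verify that whatever normalization one adopts is self-consistent, which matters when comparing PSDs reconstructed from measurements at different scan sizes and pixel resolutions.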
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
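freeQuant's exact algorithm is not spelled out in the abstract, but the core idea of spectral-count quantification normalized by protein sequence length can be illustrated with the standard normalized spectral abundance factor (NSAF); the counts and lengths below are hypothetical:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor for each protein:
    (SpC / L) divided by the sum of SpC / L over all proteins,
    so longer proteins are not over-counted simply because they
    yield more peptides."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# hypothetical spectral counts and sequence lengths for three proteins
abund = nsaf([100, 50, 10], [500, 250, 100])
# relative abundances summing to 1
```

Note that NSAF alone does not handle shared peptides or low-abundance proteins; the abstract indicates freeQuant adds separate mechanisms (shared-peptide apportioning and total ion count) for those cases.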
Analysis of ribosomal RNA stability in dead cells of wine yeast by quantitative PCR.
Sunyer-Figueres, Merce; Wang, Chunxiao; Mas, Albert
2018-04-02
During wine production, some yeasts enter a Viable But Not Culturable (VBNC) state, which may influence the quality and stability of the final wine through remnant metabolic activity or by resuscitation. Culture-independent techniques are used for obtaining an accurate estimation of the number of live cells, and quantitative PCR could be the most accurate technique. As a marker of cell viability, rRNA was evaluated by analyzing its stability in dead cells. The species-specific stability of rRNA was tested in Saccharomyces cerevisiae, as well as in three species of non-Saccharomyces yeast (Hanseniaspora uvarum, Torulaspora delbrueckii and Starmerella bacillaris). High temperature and antimicrobial dimethyl dicarbonate (DMDC) treatments were efficient in lysing the yeast cells. rRNA gene and rRNA (as cDNA) were analyzed over 48 h after cell lysis by quantitative PCR. The results confirmed the stability of rRNA for 48 h after the cell lysis treatments. To sum up, rRNA may not be a good marker of cell viability in the wine yeasts that were tested. Copyright © 2018 Elsevier B.V. All rights reserved.
Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.
Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W
2017-01-01
Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.
Electron Probe Microanalysis | Materials Science | NREL
surveys of the area of interest before performing a more accurate quantitative analysis with WDS. WDS - Four spectrometers with ten diffracting crystals. The use of a single-channel analyzer allows much
NASA Astrophysics Data System (ADS)
Neuland, Maike Brigitte; Grimaudo, Valentine; Mezger, Klaus; Moreno-García, Pavel; Riedo, Andreas; Tulej, Marek; Wurz, Peter
2016-04-01
The chemical composition of planetary bodies, moons, comets and asteroids is a key to understand their origin and evolution [Wurz,2009]. Measurements of the elemental and isotopic composition of rocks yield information about the formation of the planetary body, its evolution and following processes shaping the planetary surface. From the elemental composition, conclusions about modal mineralogy and petrology can be drawn. Isotope ratios are a sensitive indicator for past events on the planetary body and yield information about origin and transformation of the matter, back to events that occurred in the early solar system. Finally, measurements of radiogenic isotopes make it possible to carry out dating analyses. All these topics, particularly in situ dating analyses, quantitative elemental and highly accurate isotopic composition measurements, are top priority scientific questions for future lunar missions. An instrument for precise measurements of chemical composition will be a key element in scientific payloads of future landers or rovers on lunar surface. We present a miniature laser ablation mass spectrometer (LMS) designed for in situ research in planetary and space science and optimised for measurements of the chemical composition of rocks and soils on a planetary surface. By means of measurements of standard reference materials we demonstrate that LMS is a suitable instrument for in situ measurements of elemental and isotopic composition with high precision and accuracy. Measurements of soil standards are used to confirm known sensitivity coefficients of the instrument and to prove the power of LMS for quantitative elemental analyses [Neuland,2016]. For demonstration of the capability of LMS to measure the chemical composition of extraterrestrial material we use a sample of Allende meteorite [Neuland,2014]. 
Investigations of layered samples confirm the high spatial resolution in the vertical direction of LMS [Grimaudo,2015], which allows in situ study of past surface processes on a planetary surface. Analyses of Pb isotopes show that the statistical uncertainty for age determination by LMS is about ±100 Myr if the abundances of 206Pb and 207Pb are 20 ppm and 2 ppm, respectively [Riedo,2013]. These Pb isotopes have abundances of tens to hundreds of ppm in lunar KREEP [Nemchin,2008]. We demonstrate the measurement capabilities of LMS for petrographic and mineralogical analyses, for isotopic studies and dating analyses, which are key topics for future missions to the Moon. Having the LMS instrument installed on a lunar rover would allow measuring the chemical composition of many rock and soil samples distributed over a certain area, inside the South Pole-Aitken Basin for example. LMS measurements would yield valuable conclusions about age and mineralogy. References: [Wurz,2009] Wurz, P. et al. 2009, AIP Conf. Proc., CP1144:70-75. [Grimaudo,2015] Grimaudo, V. et al. 2015, Anal. Chem. 87:2037-2041. [Neuland,2014] Neuland, M.B. et al. 2014, Planet. Space Sci. 101:196-209. [Neuland,2016] Neuland, M.B. et al. 2016, Meas. Sci. Technol., submitted. [Riedo,2013] Riedo, A. et al. 2013, Planet. Space Sci. 87:1-13. [Nemchin,2008] Nemchin et al. 2008, Geochim. Cosmochim. Acta 72:668-689.
Toward Accurate and Quantitative Comparative Metagenomics
Nayfach, Stephen; Pollard, Katherine S.
2016-01-01
Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341
Reducing misfocus-related motion artefacts in laser speckle contrast imaging.
Ringuette, Dene; Sigal, Iliya; Gad, Raanan; Levi, Ofer
2015-01-01
Laser Speckle Contrast Imaging (LSCI) is a flexible, easy-to-implement technique for measuring blood flow speeds in vivo. In order to obtain reliable quantitative data from LSCI, the object must remain in the focal plane of the imaging system for the duration of the measurement session. However, since LSCI suffers from inherent frame-to-frame noise, it often requires a moving-average filter to produce quantitative results. This frame-to-frame noise also makes the implementation of a rapid autofocus system challenging. In this work, we demonstrate an autofocus method and system based on a novel measure of misfocus which serves as an accurate and noise-robust feedback mechanism. This measure of misfocus is shown to enable the localization of best focus with sub-depth-of-field sensitivity, yielding more accurate estimates of blood flow speeds and blood vessel diameters.
Marcelín-Jiménez, Gabriel; Contreras, Leticia; Esquivel, Javier; Ávila, Óscar; Batista, Dany; Ángeles, Alionka P; García-González, Alberto
2017-03-01
Cinitapride (CIN) is a benzamide-derived molecule used for the treatment of gastroesophageal reflux and dyspepsia. Its pharmacokinetics are controversial due to the use of supratherapeutic doses and the lack of sensitive methodology. Therefore, a sensitive and accurate micromethod was developed for its quantitation in human plasma. CIN was extracted from 300 µl of heparinized plasma by liquid-liquid extraction using cisapride as the internal standard, and analyzed by ultra-performance liquid chromatography employing positive-ion multiple reaction monitoring MS. The method proved to be rapid, accurate and stable within a range of 50 to 2000 pg/ml and was successfully validated and applied in a pharmacokinetic interaction trial, where it was demonstrated that oral co-administration of simethicone does not modify the bioavailability of CIN.
Cui, Yi; Hu, Dehong; Markillie, Lye Meng; ...
2017-10-04
Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as few as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.
Pulverer, Walter; Hofner, Manuela; Preusser, Matthias; Dirnberger, Elisabeth; Hainfellner, Johannes A; Weinhaeusel, Andreas
2014-01-01
MGMT promoter methylation is associated with favorable prognosis and chemosensitivity in glioblastoma multiforme (GBM), especially in elderly patients. We aimed to develop a simple methylation-sensitive restriction enzyme (MSRE)-based quantitative PCR (qPCR) assay, allowing the quantification of MGMT promoter methylation. DNA was extracted from non-neoplastic brain (n = 24) and GBM samples (n = 20) under 3 different sample conservation conditions (-80 °C; formalin-fixed and paraffin-embedded (FFPE); RCL2-fixed). We evaluated the suitability of each fixation method with respect to the MSRE-coupled qPCR methylation analyses. Methylation data were validated by MALDI-TOF. qPCR was used for evaluation of alternative tissue conservation procedures. DNA from FFPE tissue failed reliable testing; DNA from both RCL2-fixed and fresh frozen tissues performed equally well and was further used for validation of the quantitative MGMT methylation assay (limit of detection (LOD): 19.58 pg), using the individual's undigested sample DNA for calibration. MGMT methylation analysis in non-neoplastic brain identified a background methylation of 0.10 ± 0.11%, which we used to define a cut-off of 0.32% for patient stratification. Of the GBM patients, 9 were MGMT methylation-positive (range: 0.56-91.95%) and 11 tested negative. MALDI-TOF measurements resulted in a concordant classification of 94% of GBM samples in comparison to qPCR. The presented methodology allows quantitative MGMT promoter methylation analyses. An amount of 200 ng DNA is sufficient for triplicate analyses including control reactions and individual calibration curves, thus excluding any DNA quality-derived bias. The combination of RCL2 fixation and quantitative methylation analyses improves routine pathological examination when histological and molecular analyses on limited amounts of tumor samples are necessary for patient stratification.
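The MSRE-qPCR readout described above can be illustrated with the standard ΔCt relation: templates methylated at the enzyme's recognition sites survive digestion, so the methylated fraction follows from the Ct shift between the digested aliquot and the undigested calibration aliquot. The function and the numbers below are illustrative sketches, not the authors' implementation:

```python
def methylation_percent(ct_digested, ct_undigested, efficiency=2.0):
    """Percent of templates resistant to methylation-sensitive digestion.
    Assumes an amplification efficiency of `efficiency` per cycle
    (2.0 = perfect doubling); real assays calibrate this per target."""
    delta_ct = ct_digested - ct_undigested
    return 100.0 * efficiency ** (-delta_ct)

# illustrative values: a 3-cycle delay after digestion implies 2^-3 = 12.5% methylation
m = methylation_percent(ct_digested=28.0, ct_undigested=25.0)  # 12.5
```

Using each individual's undigested DNA for calibration, as the authors do, cancels sample-to-sample differences in input amount and DNA quality, which is what makes the small cut-off of 0.32% meaningful.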
NASA Astrophysics Data System (ADS)
Lelièvre, Peter G.; Grey, Melissa
2017-08-01
Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
2017-01-01
Real-time quantitative PCR (qPCR) is the most reliable and accurate technique for analyses of gene expression. Endogenous reference genes are used to normalize qPCR data even though their expression may vary under different conditions and in different tissues. Nonetheless, verification of the expression of reference genes in the selected tissue is essential in order to accurately assess the level of expression of target genes of interest. Therefore, in this study, we examined six commonly used reference genes in order to identify the gene expressed most constantly under the influence of testosterone in the kidneys and hypothalamus. The reference genes include glyceraldehyde-3-phosphate dehydrogenase (GAPDH), actin beta (ACTB), beta-2 microglobulin (B2m), hypoxanthine phosphoribosyltransferase 1 (HPRT), peptidylprolyl isomerase A (Ppia) and hydroxymethylbilane synthase (Hmbs). The cycle threshold (Ct) value for each gene was determined and the data obtained were analyzed using the software programs NormFinder, geNorm, BestKeeper, and rank aggregation. Results showed that the Hmbs and Ppia genes were the most stably expressed in the hypothalamus. Meanwhile, in the kidneys, Hmbs and GAPDH appeared to be the most constant genes. In conclusion, variations in expression levels of reference genes occur in the kidneys and hypothalamus under similar conditions; thus, it is important to verify reference gene levels in these tissues prior to commencing any studies. PMID:28591185
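Stability tools such as BestKeeper essentially rank candidate reference genes by the variability of their Ct values across samples. A simplified sketch of that idea follows; the gene names match the abstract but the Ct values are hypothetical, and this is not the actual NormFinder, geNorm, or BestKeeper algorithm:

```python
import statistics

def rank_reference_genes(ct_values):
    """Rank candidate reference genes by the sample standard deviation of
    their Ct values across samples (lower = more stable expression),
    a BestKeeper-style criterion."""
    sd = {gene: statistics.stdev(cts) for gene, cts in ct_values.items()}
    return sorted(sd, key=sd.get)

# hypothetical Ct values across four samples from the same tissue
order = rank_reference_genes({
    "Hmbs":  [24.1, 24.2, 24.0, 24.1],
    "GAPDH": [18.5, 19.4, 18.1, 19.0],
    "ACTB":  [16.2, 17.9, 16.8, 18.4],
})
# most stable gene first
```

Because Ct is a log2-scale quantity, a standard deviation of 1 Ct already corresponds to roughly a two-fold spread in apparent expression, which is why even modest Ct variability disqualifies a reference gene.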
Machine learning in a graph framework for subcortical segmentation
NASA Astrophysics Data System (ADS)
Guo, Zhihui; Kashyap, Satyananda; Sonka, Milan; Oguz, Ipek
2017-02-01
Automated and reliable segmentation of subcortical structures from human brain magnetic resonance images is of great importance for volumetric and shape analyses in quantitative neuroimaging studies. However, poor boundary contrast and variable shape of these structures make automated segmentation a tough task. We propose a 3D graph-based machine learning method, called LOGISMOS-RF, to segment the caudate and the putamen from brain MRI scans in a robust and accurate way. An atlas-based tissue classification and bias-field correction method is applied to the images to generate an initial segmentation for each structure. Then a 3D graph framework is utilized to construct a geometric graph for each initial segmentation. A locally trained random forest classifier is used to assign a cost to each graph node. The max-flow algorithm is applied to solve the segmentation problem. Evaluation was performed on a dataset of T1-weighted MRIs of 62 subjects, with 42 images used for training and 20 images for testing. For comparison, FreeSurfer, FSL and BRAINSCut approaches were also evaluated using the same dataset. Dice overlap coefficients and surface-to-surface distances between the automated and expert manual segmentations indicate that the results of our method are statistically significantly more accurate than those of the other three methods, for both the caudate (Dice: 0.89 +/- 0.03) and the putamen (0.89 +/- 0.03).
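The Dice overlap coefficient used for evaluation is twice the intersection of the two masks divided by the sum of their sizes; a minimal sketch on toy binary masks:

```python
import numpy as np

def dice(a, b):
    """Dice overlap coefficient between two binary segmentation masks:
    2 * |A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a, b = np.asarray(a, dtype=bool), np.asarray(b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# toy masks standing in for automated vs. expert manual segmentations
auto   = np.array([[1, 1, 0], [1, 0, 0]])
manual = np.array([[1, 1, 0], [0, 0, 1]])
d = dice(auto, manual)  # 2*2 / (3+3) ≈ 0.667
```

A Dice of 0.89, as reported for both structures, thus means the automated and manual masks share almost 90% of their combined volume.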
Hu, Tangao; Liu, Jiahong; Zheng, Gang; Li, Yao; Xie, Bin
2018-05-09
Accurate and timely information describing urban wetland resources and their changes over time, especially in rapidly urbanizing areas, is becoming increasingly important. We applied object-based image analysis and a nearest-neighbour classifier to map and monitor land use/cover change in an urban wetland area (Hangzhou Xixi Wetland) using multi-temporal high-spatial-resolution satellite imagery from 2000, 2005, 2007, 2009 and 2013. The overall eight-class classification accuracy averaged 84.47% across the five dates. The maps showed that between 2000 and 2013 the non-wetland (urban) area increased by approximately 100%. Herbaceous cover (32.22%), forest (29.57%) and pond (23.85%) were the main land-cover types converted to non-wetland, followed by cropland (6.97%), marsh (4.04%) and river (3.35%). In addition, the maps of change patterns showed that urban wetland loss is concentrated in the west and southeast of the study area owing to real estate development, and that the greatest loss of urban wetlands occurred from 2007 to 2013. The results demonstrate the advantages of multi-temporal high-spatial-resolution satellite imagery as an accurate, economical means of mapping and analysing land use/cover change over time, and the usefulness of the results as inputs to urban wetland management and policy decisions.
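Overall classification accuracy of the kind reported here (84.47%) is conventionally computed from a confusion matrix; a minimal sketch with a hypothetical two-class matrix (not data from this study):

```python
import numpy as np

def overall_accuracy(confusion):
    """Overall accuracy from a square confusion matrix
    (rows = reference class, columns = mapped class)."""
    c = np.asarray(confusion, dtype=float)
    return np.trace(c) / c.sum()

def producers_users_accuracy(confusion):
    """Per-class producer's accuracy (diagonal over row sums) and
    user's accuracy (diagonal over column sums)."""
    c = np.asarray(confusion, dtype=float)
    return np.diag(c) / c.sum(axis=1), np.diag(c) / c.sum(axis=0)
```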
Lelong, Camille C. D.; Burger, Philippe; Jubelin, Guillaume; Roux, Bruno; Labbé, Sylvain; Baret, Frédéric
2008-01-01
This paper outlines how light Unmanned Aerial Vehicles (UAVs) can be used in remote sensing for precision farming. It focuses on the combination of simple digital photographic cameras with spectral filters, designed to provide multispectral images in the visible and near-infrared domains. In 2005, these instruments were fitted to a powered glider and a parachute and flown on six dates staggered over the crop season. We monitored ten varieties of wheat grown in trial micro-plots in south-west France. At each date, we acquired multiple views in four spectral bands corresponding to blue, green, red and near-infrared. We then performed accurate corrections of image vignetting, geometric distortions and radiometric bidirectional effects. For each experimental micro-plot, we derived several vegetation indexes relevant to vegetation analyses, and finally sought relationships between these indexes and field-measured biophysical parameters, both generic and date-specific. We established robust and stable generic relationships between, on the one hand, leaf area index and NDVI and, on the other hand, nitrogen uptake and GNDVI. Owing to a high amount of noise in the data, it was not possible to obtain a more accurate model for each date independently. A validation protocol showed that a precision level of 15% in the biophysical parameter estimates can be expected when using these relationships. PMID:27879893
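The two indexes used in the relationships above, NDVI and GNDVI, are simple normalized band ratios; a minimal sketch (the reflectance values in the test are illustrative):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI: (NIR - G) / (NIR + G)."""
    nir, green = np.asarray(nir, dtype=float), np.asarray(green, dtype=float)
    return (nir - green) / (nir + green)
```

Both functions accept scalars or whole reflectance bands as arrays, so an index image can be computed in one call.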
ADM guidance-Ceramics: guidance to the use of fractography in failure analysis of brittle materials.
Scherrer, Susanne S; Lohbauer, Ulrich; Della Bona, Alvaro; Vichi, Alessandro; Tholey, Michael J; Kelly, J Robert; van Noort, Richard; Cesar, Paulo Francisco
2017-06-01
To provide background information and guidance on how to use fractography, a powerful tool for failure analysis of dental ceramic structures, accurately. An extended palette of qualitative and quantitative fractography is provided for both in vivo and in vitro fracture surface analyses. As visual support, this guidance document provides micrographs of typical critical ceramic processing flaws, differentiating between pre- versus post-sintering cracks, grinding-damage-related failures, occlusal contact wear origins, and failures due to surface degradation. The documentation emphasizes good labeling of crack features, precise indication of the direction of crack propagation (dcp), identification of the fracture origin, and the use of fractographic photomontages of critical flaws or flaw labeling on strength data graphics. A compilation of recommendations for specific applications of fractography in dentistry is also provided. This guidance document will contribute to a more accurate use of fractography and help researchers better identify, describe and understand the causes of failure, in both clinical and laboratory-scale situations. If adequately performed at a large scale, fractography will assist in optimizing the processing and design of restorative materials and components. Clinical failures may be better understood, and consequently reduced, by sending out the correct message regarding the fracture origin in clinical trials. Copyright © 2017 The Academy of Dental Materials. All rights reserved.
NASA Astrophysics Data System (ADS)
Drinia, Hara; Antonarakou, Assimina; Tsourou, Theodora; Kontakiotis, George; Psychogiou, Maria; Anastasakis, George
2016-09-01
The South Evoikos Basin is a marginal basin in the Aegean Sea that receives little terrigenous supply and whose sedimentation is dominated by hemipelagic processes. Late Quaternary benthic and planktonic foraminifera from core PAG-155 are investigated in order to understand their response to the glacial-interglacial cycles in this region. Quantitative analysis of the planktonic foraminifera, coupled with accelerator mass spectrometry (14C-AMS) radiocarbon dates, provides an integrated chrono-stratigraphic framework over the last 90 ka (the interval between late Marine Isotopic Stage 5 and Stage 1; MIS 5-MIS 1). The temporary appearances and disappearances, as well as several abundance peaks, in the quantitative distribution of selected climate-sensitive planktonic species allowed the identification of several eco-bioevents, useful for accurately marking the boundaries of the eco-biozones widely recognized in Mediterranean records and used for large-scale correlations. The established bio-ecozonation scheme allows a detailed palaeoecological reconstruction of the late Pleistocene archive in the central Aegean and, furthermore, provides a notable contribution to palaeoclimatic studies, facilitating intercorrelations between various oceanographic basins. The quantitative analyses of benthic foraminifera identify four distinct assemblages (biofacies): an Elphidium spp.-Haynesina spp. biofacies, characterized by neritic species, dominated during the transition from MIS 5 to MIS 4; a Cassidulina laevigata/carinata biofacies dominated until 42 ka (transgressive trend from MIS 4 to MIS 3); a Bulimina gibba biofacies dominated from 42 ka to 9.5 ka (extensive MIS 3-2 regression through lowstand and early transgression; beginning of MIS 1); and a Bulimina marginata-Uvigerina spp. biofacies dominated from 9.5 ka to the present (late transgression through early highstand; MIS 1). This study showed that the South Evoikos Basin, with its critical depths and connections to the open sea and the small-volume water masses that nourished its foraminiferal assemblages, accurately records 5th- to 4th-order sea-level and climatic fluctuations. In particular, the basin's limited communication with the open ocean implies that climatic signals are recorded in an amplified fashion; this heightened sensitivity to climate variability further underlines the prominent role of such marginal basins in understanding global climatic evolution.
Quantitative analysis of drugs in hair by UHPLC high resolution mass spectrometry.
Kronstrand, Robert; Forsman, Malin; Roman, Markus
2018-02-01
Liquid chromatographic methods coupled to high resolution mass spectrometry are increasingly used to identify compounds in various matrices, including hair, but there are few recommendations regarding the parameters, and their criteria, for identifying a compound. In this study we present a method for the identification and quantification of a range of drugs and discuss the parameters used to identify a compound by high resolution mass spectrometry. Drugs were extracted from hair by incubation in a buffer:solvent mixture at 37°C for 18 h. Analysis was performed on an Agilent 6550 QTOF coupled to a 1290 Infinity UHPLC system. High resolution accurate-mass data were acquired in All Ions mode and exported into the MassHunter Quantitative software for quantitation and identification using qualifier fragment ions. Validation included selectivity, matrix effects, calibration range, and within-day and between-day precision and accuracy. The analytes were 7-amino-flunitrazepam, 7-amino-clonazepam, 7-amino-nitrazepam, acetylmorphine, alimemazine, alprazolam, amphetamine, benzoylecgonine, buprenorphine, diazepam, ethylmorphine, fentanyl, hydroxyzine, ketobemidone, codeine, cocaine, MDMA, methadone, methamphetamine, morphine, oxycodone, promethazine, propiomazine, propoxyphene, tramadol, zaleplone, zolpidem, and zopiclone. As proof of concept, hair from 29 authentic post mortem cases was analysed. The calibration range was 0.05-5.0 ng/mg for all analytes except fentanyl (0.02-2.0), buprenorphine (0.04-2.0) and ketobemidone (0.05-4.0), as well as alimemazine, amphetamine, cocaine, methadone and promethazine (0.10-5.0). For all analytes, the accuracy of the fortified pooled hair matrix was 84-108% at the low level and 89-106% at the high level. Within-series precisions were between 1.4 and 6.7% and between-series precisions between 1.4 and 10.1%.
From the 29 autopsy cases, 121 positive findings were obtained for 23 of the analytes, in concentrations similar to those previously published. We conclude that the developed method is precise and accurate and performs sufficiently well for detecting regular drug use or treatment with prescription drugs. To identify a compound, we recommend the use of ion ratios as a complement to the instrument software's "matching scores". Copyright © 2018 Elsevier B.V. All rights reserved.
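The recommended ion-ratio check can be sketched as comparing the qualifier/quantifier peak-area ratio of a sample against the ratio observed in calibration standards; the ±30% relative window below is a hypothetical tolerance for illustration, not the study's criterion:

```python
def ion_ratio_ok(quant_area, qual_area, ref_ratio, rel_tol=0.3):
    """Accept an identification if the qualifier/quantifier peak-area
    ratio of the sample falls within +/- rel_tol (relative) of the
    reference ratio measured in calibration standards.

    rel_tol=0.3 is a hypothetical window; acceptance criteria in
    practice depend on the guideline being followed.
    """
    if quant_area <= 0:
        return False
    ratio = qual_area / quant_area
    return abs(ratio - ref_ratio) <= rel_tol * ref_ratio
```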
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gritsenko, Marina A.; Xu, Zhe; Liu, Tao
Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D
2016-01-01
Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
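Isobaric-label quantification ultimately reduces to ratios of reporter-ion intensities between channels; a minimal sketch of ratio-to-reference conversion and median normalization (the intensities are hypothetical, and the study's actual processing pipeline is not specified here):

```python
import math

def reporter_log2_ratios(intensities, reference_channel=0):
    """Log2 ratios of isobaric reporter-ion intensities for one peptide
    spectrum, relative to a chosen reference channel."""
    ref = intensities[reference_channel]
    return [math.log2(i / ref) for i in intensities]

def median_normalize(log_ratio_rows):
    """Subtract each channel's median log-ratio across all rows so that
    channel medians become 0 (corrects unequal total protein loading)."""
    n_channels = len(log_ratio_rows[0])
    medians = []
    for ch in range(n_channels):
        col = sorted(row[ch] for row in log_ratio_rows)
        mid = len(col) // 2
        medians.append(col[mid] if len(col) % 2 else 0.5 * (col[mid - 1] + col[mid]))
    return [[v - medians[ch] for ch, v in enumerate(row)]
            for row in log_ratio_rows]
```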
NASA Astrophysics Data System (ADS)
Blanke, Bruno; Speich, Sabrina; Rusciano, Emanuela
2015-01-01
We use the tracer and velocity fields of a climatological ocean model to investigate the ability of Argo-like data to estimate accurately water mass movements and transformations, in the style of analyses commonly applied to the output of ocean general circulation models. To this end, we introduce an algorithm for reconstructing a fully non-divergent three-dimensional velocity field from knowledge of the model vertical density profiles and the 1000-m horizontal velocity components alone. The technique is validated by comparing the resulting pathways of Antarctic Intermediate Water in the South Atlantic Ocean to equivalent reference results based on the full model velocity and tracer information. We show that including wind-induced Ekman pumping and a well-thought-out expression for the vertical velocity at the level of the intermediate waters is essential for reliably reproducing quantitative Lagrangian analyses. Neglecting the seasonal variability of the velocity and tracer fields is not a significant source of error, at least well below the permanent thermocline. These results give us confidence that the algorithm can be adapted successfully to true gridded Argo data for investigating the dynamics of flows in the ocean interior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webb-Robertson, Bobbie-Jo; Kim, Young -Mo; Zink, Erika M.
Urease pre-treatment of urine has been used since the early 1960s to remove high levels of urea from samples prior to further processing and analysis by gas chromatography-mass spectrometry (GC-MS). Aside from the obvious depletion or elimination of urea, the effect, if any, of urease pre-treatment on the urinary metabolome has not been studied in detail. Here, we report the results of three separate but related experiments designed to assess possible indirect effects of urease pre-treatment on the urinary metabolome as measured by GC-MS. In total, 235 GC-MS analyses were performed, and over 106 identified and 200 unidentified metabolites were quantified across the three experiments. The results showed that data from urease pre-treated samples 1) had the same or lower coefficients of variation among reproducibly detected metabolites, 2) more accurately reflected quantitative differences and the expected ratios among different urine volumes, and 3) yielded more metabolite identifications. Altogether, we observed no negative consequences of urease pre-treatment. On the contrary, urease pre-treatment enhanced the ability to distinguish between volume-based and biological sample types compared with no treatment. Taken together, these results show that urease pre-treatment of urine offers multiple benefits that outweigh any artifacts it may introduce into urinary metabolomics data.
Identification of Reference Genes for RT-qPCR Data Normalization in Cannabis sativa Stem Tissues.
Mangeot-Peter, Lauralie; Legay, Sylvain; Hausman, Jean-Francois; Esposito, Sergio; Guerriero, Gea
2016-09-15
Gene expression profiling via quantitative real-time PCR is a robust technique widely used in the life sciences to compare gene expression patterns in, e.g., different tissues, growth conditions, or after specific treatments. In the field of plant science, real-time PCR is the gold standard for studying the dynamics of gene expression and is used to validate the results generated with high-throughput techniques, e.g., RNA-Seq. Accurate relative quantification of gene expression relies on the identification of appropriate reference genes, which must be determined for each experimental set-up and plant tissue studied. Here, we identify suitable reference genes for expression profiling in stems of textile hemp (Cannabis sativa L.), whose tissues (isolated bast fibres and core) show remarkable differences in cell wall composition. We additionally validate the reference genes by analysing the expression of putative candidate genes involved in the non-oxidative phase of the pentose phosphate pathway and in the first step of the shikimate pathway. The goal is to describe the possible regulation pattern of genes involved in providing the precursors needed for lignin biosynthesis in the different hemp stem tissues. The results shown here are useful for designing future studies focused on gene expression analyses in hemp.
Development of a reference material of a single DNA molecule for the quality control of PCR testing.
Mano, Junichi; Hatano, Shuko; Futo, Satoshi; Yoshii, Junji; Nakae, Hiroki; Naito, Shigehiro; Takabatake, Reona; Kitta, Kazumi
2014-09-02
We developed a reference material consisting of a single DNA molecule with a specific nucleotide sequence. A double-stranded linear DNA carrying PCR target sequences at both ends was prepared as the reference molecule; we named the PCR target on one side the confirmation sequence and that on the other side the standard sequence. A highly diluted solution of the reference molecule was dispensed into the 96 wells of a plastic PCR plate so that the average number of molecules per well was below one. The presence or absence of the reference molecule in each well was then checked by real-time PCR targeting the confirmation sequence. After enzymatic digestion of the PCR products in the reaction mixtures of the positive wells, the resulting solution was used as the reference material of a single DNA molecule with the standard sequence. PCR analyses revealed that the prepared samples included only one reference molecule with high probability. The single-molecule reference material developed in this study will be useful for the absolute evaluation of the detection limit of PCR-based testing methods, the quality control of PCR analyses, performance evaluation of PCR reagents and instruments, and the preparation of accurate calibration curves for real-time PCR quantitation.
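The limiting-dilution design relies on Poisson statistics: when the mean number of molecules per well is well below one, almost every PCR-positive well contains exactly one molecule. A minimal sketch of that conditional probability (the mean values in the test are illustrative, not the study's dilution):

```python
import math

def p_single_given_positive(mean_molecules_per_well):
    """Probability that a PCR-positive well contains exactly one template
    molecule, assuming Poisson-distributed molecule counts per well."""
    lam = mean_molecules_per_well
    p_one = lam * math.exp(-lam)       # P(N = 1)
    p_positive = 1.0 - math.exp(-lam)  # P(N >= 1)
    return p_one / p_positive
```

At a mean of 0.1 molecules per well, roughly 95% of positive wells hold a single molecule, and the fraction rises as the solution is diluted further.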
Arku, Raphael E; Birch, Aaron; Shupler, Matthew; Yusuf, Salim; Hystad, Perry; Brauer, Michael
2018-05-01
Household air pollution (HAP) from the combustion of solid fuels is an important contributor to disease burden in low- and middle-income countries (LICs and MICs). However, current HAP disease burden estimates are based on integrated exposure-response curves that are not yet informed by quantitative HAP studies in LICs and MICs. While there is adequate evidence supporting causal relationships between HAP and respiratory disease, large cohort studies specifically examining relationships between quantitative measures of HAP exposure and cardiovascular disease are lacking. We aim to improve upon exposure proxies based on fuel type, and to reduce exposure misclassification, by quantitatively measuring exposure across varying cooking fuel types and conditions in diverse geographies and socioeconomic settings. We leverage technological advances to estimate household and personal PM2.5 (particles below 2.5 μm in aerodynamic diameter) exposure within the large (N~250,000) multi-country (N~26) Prospective Urban and Rural Epidemiological (PURE) cohort study. Here, we detail the study protocol, the innovative methodologies being used to characterize HAP exposures, and their application in epidemiologic analyses. This study characterizes HAP PM2.5 exposures for participants in rural communities in ten PURE countries with >10% solid fuel use at baseline (Bangladesh, Brazil, Chile, China, Colombia, India, Pakistan, South Africa, Tanzania, and Zimbabwe). PM2.5 monitoring includes 48-h cooking area measurements in 4500 households and simultaneous personal monitoring of male and female pairs from 20% of the selected households. Repeat measurements occur in 20% of households to assess the impact of seasonality. Monitoring began in 2017 and will continue through 2019. Sampling uses the Ultrasonic Personal Aerosol Sampler (UPAS), a novel, robust and inexpensive filter-based monitor that is programmable through a dedicated mobile phone application.
A pilot field evaluation of cooking area measurements indicated high correlation between the UPAS and reference Harvard Impactors (r = 0.91; 95% CI: 0.84, 0.95; slope = 0.95). To facilitate tracking and to minimize contamination and analytical error, the samplers use barcoded filters and filter cartridges that are weighed pre- and post-sampling with a fully automated weighing system. Pump flow and pressure measurements, temperature and RH, GPS coordinates, and semi-quantitative continuous particle mass concentrations based on filter differential pressure are uploaded to a central server automatically whenever the mobile phone is connected to the internet, and the sampled data are automatically screened against quality control parameters. A short survey is administered during the 48-h monitoring period. Post-weighed filters are further analyzed to estimate black carbon concentrations through a semi-automated, rapid, cost-effective image analysis approach. The measured PM2.5 data will then be combined with PURE survey information on household characteristics and behaviours, collected at baseline and during follow-up, to develop quantitative HAP models of PM2.5 exposure for all rural PURE participants (~50,000) and across different cooking fuel types within the 10 index countries. Both the measured (in the subset) and the modelled exposures will be used in separate longitudinal epidemiologic analyses to assess associations with cardiopulmonary mortality and disease incidence. The collected data and resulting characterization of cooking area and personal PM2.5 exposures in multiple rural communities from 10 countries will better inform exposure assessment as well as future epidemiologic analyses of the relationships between quantitative estimates of chronic HAP exposure and adult mortality and incident cardiovascular and respiratory disease. This will provide refined and more accurate exposure estimates for global CVD-related exposure-response analyses.
Copyright © 2018 Elsevier Ltd. All rights reserved.
He, Wei; Kularatne, Sumith A; Kalli, Kimberly R; Prendergast, Franklyn G; Amato, Robert J; Klee, George G; Hartmann, Lynn C; Low, Philip S
2008-10-15
Quantitation of circulating tumor cells (CTCs) can provide information on the stage of a malignancy, the onset of disease progression and the response to therapy. In an effort to quantitate CTCs more accurately, we have synthesized fluorescent conjugates of two high-affinity tumor-specific ligands (folate-AlexaFluor 488 and DUPA-FITC) that bind tumor cells >20-fold more efficiently than fluorescent antibodies. Here we determine whether these tumor-specific dyes can be exploited for quantitation of CTCs in peripheral blood samples from cancer patients. A CTC-enriched fraction was isolated from the peripheral blood of ovarian and prostate cancer patients by an optimized density gradient centrifugation protocol and labeled with the aforementioned fluorescent ligands, and CTCs were then quantitated by flow cytometry. CTCs were detected in 18 of 20 ovarian cancer patients (mean 222 CTCs/ml; median 15 CTCs/ml; maximum 3,118 CTCs/ml), whereas CTC numbers in 16 gender-matched normal volunteers were negligible (mean 0.4 CTCs/ml; median 0.3 CTCs/ml; maximum 1.5 CTCs/ml; p < 0.001, chi-square test). CTCs were also detected in 10 of 13 prostate cancer patients (mean 26 CTCs/ml; median 14 CTCs/ml; maximum 94 CTCs/ml) but not in 18 gender-matched healthy donors (mean 0.8 CTCs/ml; median 1; maximum 3 CTCs/ml; p < 0.0026, chi-square test). Tumor-specific fluorescent antibodies were much less efficient at quantitating CTCs because of their lower CTC labeling efficiency. The use of tumor-specific fluorescent ligands to label CTCs in peripheral blood can thus provide a simple, accurate and sensitive method for determining the number of cancer cells circulating in the bloodstream.
A quantitative evaluation of the three dimensional reconstruction of patients' coronary arteries.
Klein, J L; Hoff, J G; Peifer, J W; Folks, R; Cooke, C D; King, S B; Garcia, E V
1998-04-01
Through extensive training and experience, angiographers learn to mentally reconstruct the three-dimensional (3D) relationships of the coronary arterial branches. Graphic computer technology can help angiographers visualize the 3D coronary structure more quickly from limited initial views and then help determine additional useful views by predicting subsequent angiograms before they are obtained. A new computer method for facilitating 3D reconstruction and visualization of human coronary arteries was evaluated by reconstructing biplane left coronary angiograms from 30 patients. The accuracy of the reconstruction was assessed in two ways: 1) by comparing the vessel centerlines of the actual angiograms with the centerlines of a 2D projection of the 3D model projected at the exact angle of the actual angiogram; and 2) by comparing two 3D models generated from different simultaneous pairs of angiograms. The inter- and intraobserver variability of the reconstruction was evaluated by mathematically comparing the 3D model centerlines of repeated reconstructions. The average absolute corrected displacement of 14,662 vessel centerline points in 2D from 30 patients was 1.64 +/- 2.26 mm. The average absolute corrected displacement between 3D models generated from different biplane pairs was 7.08 +/- 3.21 mm. The intraobserver variability of absolute corrected 3D displacement was 5.22 +/- 3.39 mm; the interobserver variability was 6.6 +/- 3.1 mm. The centerline analyses show that the reconstruction algorithm is mathematically accurate and reproducible. The figures presented in this report put these measurement errors into clinical perspective, showing that they yield an accurate representation of the clinically relevant information seen on the actual angiograms. These data show that this technique can be clinically useful by accurately displaying in three dimensions the complex relationships of the branches of the coronary arterial tree.
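Centerline displacement of the kind reported (e.g., 1.64 +/- 2.26 mm) can be sketched as the mean distance from each point of one centerline to the nearest point of the other; a minimal illustration with hypothetical 2D points (the study's corrected-displacement definition may differ in detail):

```python
import numpy as np

def mean_absolute_displacement(centerline_a, centerline_b):
    """Mean distance from each point of centerline A to the nearest
    point of centerline B (points given as rows of coordinates)."""
    a = np.asarray(centerline_a, dtype=float)
    b = np.asarray(centerline_b, dtype=float)
    # all pairwise Euclidean distances, shape (len(a), len(b))
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return float(d.min(axis=1).mean())
```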
Non-contact measurement of linear external dimensions of the mouse eye
Wisard, Jeffrey; Chrenek, Micah A.; Wright, Charles; Dalal, Nupur; Pardue, Machelle T.; Boatright, Jeffrey H.; Nickerson, John M.
2010-01-01
Biometric analyses of quantitative traits in eyes of mice can reveal abnormalities related to refractive or ocular development. Due to the small size of the mouse eye, highly accurate and precise measurements are needed to detect meaningful differences. We sought a non-contact measuring technique to obtain highly accurate and precise linear dimensions of the mouse eye. Laser micrometry was validated with gauge block standards. Simple procedures to measure eye dimensions on three axes were devised. Mouse eyes from C57BL/6J and rd10 on a C57BL/6J background were dissected and extraocular muscle and fat removed. External eye dimensions of axial length (anterior-posterior (A-P) axis) and equatorial diameter (superior-inferior (S-I) and nasal-temporal (N-T) axes) were obtained with a laser micrometer. Several approaches to prevent or ameliorate evaporation due to room air were employed. The resolution of the laser micrometer was less than 0.77 microns, and it provided accurate and precise non-contact measurements of eye dimensions on three axes. External dimensions of the eye strongly correlated with eye weight. The N-T and S-I dimensions of the eye correlated with each other most closely from among the 28 pair-wise combinations of the several parameters that were collected. The equatorial axis measurements correlated well from the right and left eye of each mouse. The A-P measurements did not correlate or correlated poorly in each pair of eyes. The instrument is well suited for the measurement of enucleated eyes and other structures from most commonly used species in experimental vision research and ophthalmology. PMID:20067806
Non-contact measurement of linear external dimensions of the mouse eye.
Wisard, Jeffrey; Chrenek, Micah A; Wright, Charles; Dalal, Nupur; Pardue, Machelle T; Boatright, Jeffrey H; Nickerson, John M
2010-03-30
Biometric analyses of quantitative traits in eyes of mice can reveal abnormalities related to refractive or ocular development. Due to the small size of the mouse eye, highly accurate and precise measurements are needed to detect meaningful differences. We sought a non-contact measuring technique to obtain highly accurate and precise linear dimensions of the mouse eye. Laser micrometry was validated with gauge block standards. Simple procedures to measure eye dimensions on three axes were devised. Mouse eyes from C57BL/6J and rd10 on a C57BL/6J background were dissected and extraocular muscle and fat removed. External eye dimensions of axial length (anterior-posterior (A-P) axis) and equatorial diameter (superior-inferior (S-I) and nasal-temporal (N-T) axes) were obtained with a laser micrometer. Several approaches to prevent or ameliorate evaporation due to room air were employed. The resolution of the laser micrometer was less than 0.77 μm, and it provided accurate and precise non-contact measurements of eye dimensions on three axes. External dimensions of the eye strongly correlated with eye weight. The N-T and S-I dimensions of the eye correlated with each other most closely from among the 28 pair-wise combinations of the several parameters that were collected. The equatorial axis measurements correlated well from the right and left eye of each mouse. The A-P measurements did not correlate or correlated poorly in each pair of eyes. The instrument is well suited for the measurement of enucleated eyes and other structures from most commonly used species in experimental vision research and ophthalmology. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Vincent, Delphine; Elkins, Aaron; Condina, Mark R.; Ezernieks, Vilnis; Rochfort, Simone
2016-01-01
Cow’s milk is an important source of protein in human nutrition, containing on average 3.5% protein. The most abundant proteins in bovine milk are the caseins and some of the whey proteins, namely beta-lactoglobulin, alpha-lactalbumin, and serum albumin. A number of allelic variants and post-translationally modified forms of these proteins have been identified; their occurrence varies with breed, individual, stage of lactation, and the health and nutritional status of the animal. It is therefore essential to have reliable methods for their detection and quantitation. Traditionally, the major milk proteins are quantified using liquid chromatography (LC) with ultraviolet (UV) detection. However, as these protein variants co-elute to some degree, another dimension of separation is beneficial for accurately measuring their amounts, and mass spectrometry (MS) offers such a tool. In this study, we tested several RP-HPLC and MS parameters to optimise the analysis of intact bovine milk proteins. From these tests, we developed an optimised method comprising a 20-28-40% phase B gradient with 0.02% TFA in both mobile phases at a 0.2 mL/min flow rate, a C8 column temperature of 75°C, and scanning every 3 s over a 600-3000 m/z window. The optimisation was performed using commercially purchased external standards, for which ionisation efficiency, linearity of calibration, LOD, LOQ, sensitivity, selectivity, precision, reproducibility and mass accuracy were demonstrated. From the MS analysis, extracted ion chromatograms (EICs) of specific ion series of known proteins can be integrated within a defined retention time (RT) window for quantitation. This optimised quantitative method was successfully applied to bulk milk samples from two breeds, Holstein-Friesian and Jersey, to assess differences in protein variant levels. PMID:27749892
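Quantitation from EICs as described reduces to integrating intensity over the defined RT window; a minimal trapezoid-rule sketch with hypothetical chromatogram values:

```python
import numpy as np

def eic_peak_area(times, intensities, rt_window):
    """Trapezoid-rule area of an extracted ion chromatogram (EIC)
    restricted to a retention-time window [t0, t1]."""
    t = np.asarray(times, dtype=float)
    y = np.asarray(intensities, dtype=float)
    t0, t1 = rt_window
    mask = (t >= t0) & (t <= t1)
    tw, yw = t[mask], y[mask]
    # sum of trapezoid areas between consecutive in-window points
    return float(np.sum(0.5 * (yw[1:] + yw[:-1]) * np.diff(tw)))
```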
NASA Astrophysics Data System (ADS)
García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel
2018-05-01
In this work, a Total Reflection X-ray Fluorescence (TXRF) spectrometry based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids from old building materials and their degradation products, sampled from an early 20th century building of high historic and cultural value in Getxo (Basque Country, North of Spain). This quantification strategy is faster than traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. To avoid the acid extraction step, direct TXRF measurement of the powdered solids suspended in water was also studied. To this end, parameters such as deposition volume and measuring time were optimised for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF methodologies were around 0.01-1.2 mg/L for liquid extracts and 2-200 mg/L for solids. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to more classic quantification techniques (i.e. ICP-MS), accurate enough to characterise the acid-soluble part of the materials and their degradation products. For the solid samples measured as suspensions, it was difficult to obtain stable and repeatable suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on quantitative ED-XRF results were calculated to improve the accuracy of the TXRF results.
van der Put, Robert M F; de Haan, Alex; van den IJssel, Jan G M; Hamidi, Ahd; Beurret, Michel
2015-11-27
Due to the rapidly increasing introduction of Haemophilus influenzae type b (Hib) and other conjugate vaccines worldwide during the last decade, reliable and robust analytical methods are needed for the quantitative monitoring of intermediate samples generated during fermentation (upstream processing, USP) and purification (downstream processing, DSP) of polysaccharide vaccine components. This study describes the quantitative characterization of in-process control (IPC) samples generated during the fermentation and purification of the capsular polysaccharide (CPS), polyribosyl-ribitol-phosphate (PRP), derived from Hib. Reliable quantitative methods are necessary for all stages of production; otherwise, accurate process monitoring and validation are not possible. Prior to the availability of high performance anion exchange chromatography methods, this polysaccharide was predominantly quantified either with immunochemical methods or with the colorimetric orcinol method, which shows interference from fermentation medium components and reagents used during purification. In addition to an improved high performance anion exchange chromatography-pulsed amperometric detection (HPAEC-PAD) method using a modified gradient elution, both the orcinol assay and high performance size exclusion chromatography (HPSEC) analyses were evaluated. For DSP samples, the correlation between the results obtained by HPAEC-PAD specific quantification of the PRP monomeric repeat unit released by alkaline hydrolysis and those from the orcinol method was high (R² = 0.8762); the correlation between HPAEC-PAD and HPSEC results was lower. Additionally, HPSEC analysis of USP samples yielded surprisingly comparable results to those obtained by HPAEC-PAD. In the early part of the fermentation, medium components interfered with the different types of analysis, but quantitative HPSEC data could still be obtained, although lacking the specificity of the HPAEC-PAD method. Thus, the HPAEC-PAD method has the advantage of giving a specific response compared to the orcinol assay and HPSEC, and does not show interference from the various components that can be present in intermediate and purified PRP samples. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Landry-Boozer, Kristine L.
Traditional cognitive-developmental researchers have provided a large body of evidence supporting the stage-like progression of children's cognitive development. Further, this body of research shows that children's understanding of HIV/AIDS develops in much the same way as their understanding of other illness-related concepts. Researchers from a newer perspective assert that biological concepts develop from intuitive theories. In general, as children are exposed to relevant content and have opportunities to organize this information, their theories become more accurate and differentiated. According to this perspective, there are no broad structural constraints on developing concepts, as asserted by cognitive-developmental theorists. The purpose of the current study was two-fold: to provide support for both theoretical perspectives and, at the same time, to explore children's conceptualizations of the immune system, which had not previously been done in the cognitive-developmental literature. One hundred ninety children ranging in age from 4 through 11 years, and a group of adults, participated. Each participant was interviewed regarding health concepts and the body's function in maintaining health. Participants were also asked to report whether they had had certain experiences that would have led to relevant content exposure. Qualitative analyses were used to code the interviews with rubrics based on both theoretical perspectives. Quantitative analyses consisted of a series of univariate ANOVAs (with post hoc tests when appropriate) examining all three coding variables (accuracy, differentiation, and developmental level) across various age-group combinations and exposure groups. Results of these analyses provided support for both theoretical perspectives. When the data were analyzed for developmental level across all ages, a stage-like progression consistent with Piagetian stages emerged. When accuracy and differentiation were examined (intuitive theories perspective), discrete groups could not be formed; instead, a gradual increase in accuracy and differentiation was observed. Additional support for this perspective was found in that participants with additional exposure provided responses that were more accurate, differentiated, and sophisticated than those of participants without. Theoretical and educational implications of these findings are discussed.
NASA Astrophysics Data System (ADS)
Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Kozioł, Paweł E.; Stepak, Bogusz; Abramski, Krzysztof M.
2014-08-01
Laser-induced breakdown spectroscopy (LIBS) is a fast, fully optical method that requires little or no sample preparation. In this technique, qualitative and quantitative analysis is comparative: composition is generally determined by constructing a calibration curve of LIBS signal versus analyte concentration. Typically, certified reference materials of known elemental composition are used to calibrate the system. However, because such reference samples differ in overall composition from the complex inorganic materials under study, they can significantly degrade accuracy. Further factors, such as optical absorption, surface structure, and thermal conductivity, can also introduce imprecision. This paper presents a calibration procedure based on specially prepared pellets of the tested materials, whose composition was defined beforehand. We also propose post-processing methods that mitigate matrix effects and enable reliable and accurate analysis. The technique was applied to the determination of trace elements in industrial copper concentrates standardized by conventional flame atomic absorption spectroscopy. A series of copper flotation concentrate samples was analyzed for the contents of three elements: silver, cobalt and vanadium. It has been shown that the described technique can be used for qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates.
Park, Sang-Je; Huh, Jae-Won; Kim, Young-Hyun; Lee, Sang-Rae; Kim, Sang-Hyun; Kim, Sun-Uk; Kim, Heui-Soo; Kim, Min Kyu; Chang, Kyu-Tae
2013-05-01
Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is a specific and sensitive technique for quantifying gene expression. To analyze qRT-PCR data accurately, suitable reference genes that show consistent expression patterns across different tissues and experimental conditions should be selected. The objective of this study was to obtain the most stable reference genes in dogs, using samples from 13 different brain tissues and 10 other organs. Sixteen well-known candidate reference genes were analyzed by the geNorm, NormFinder, and BestKeeper programs. Brain tissues were derived from several different anatomical regions, including the forebrain, cerebrum, diencephalon, hindbrain, and metencephalon, and grouped accordingly. Combining the three different analyses clearly indicated that the ideal reference genes are ribosomal protein S5 (RPS5) in whole brain, RPL8 and RPS5 in whole body tissues, RPS5 and RPS19 in the forebrain and cerebrum, RPL32 and RPS19 in the diencephalon, GAPDH and RPS19 in the hindbrain, and MRPS7 and RPL13A in the metencephalon. These genes were identified as ideal for the normalization of qRT-PCR results in the respective tissues. These findings indicate more suitable and stable reference genes for future studies of canine gene expression.
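The stability ranking produced by tools such as geNorm rests on a simple idea: a good reference gene shows little variation in its log-ratio against every other candidate. A minimal sketch of such a gene-stability measure (simplified from geNorm's M-value; the gene names and expression values below are invented for illustration, not data from this study):

```python
import math
import statistics

def stability_m(expr):
    """Simplified geNorm-style M-value: for each gene, the mean standard
    deviation of its log2 expression ratios against all other candidate
    genes. Lower M = more stable reference gene. `expr` maps each gene
    name to a list of relative expression values, one per sample."""
    m = {}
    for gene, vals in expr.items():
        sds = []
        for other_gene, other_vals in expr.items():
            if other_gene == gene:
                continue
            log_ratios = [math.log2(a / b) for a, b in zip(vals, other_vals)]
            sds.append(statistics.stdev(log_ratios))
        m[gene] = sum(sds) / len(sds)
    return m

# Invented toy data: geneA and geneB co-vary perfectly across 4 samples,
# while geneC fluctuates erratically, so geneC should rank least stable.
expr = {
    "geneA": [1.0, 2.0, 4.0, 8.0],
    "geneB": [2.0, 4.0, 8.0, 16.0],
    "geneC": [1.0, 8.0, 2.0, 16.0],
}
m_values = stability_m(expr)
ranked = sorted(m_values, key=m_values.get)  # most stable first
```

In practice geNorm additionally iterates, discarding the least stable gene and recomputing M, which is how pairs of reference genes such as those reported above are selected.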
Bayesian B-spline mapping for dynamic quantitative traits.
Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong
2012-04-01
Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expression in the RR framework, B-splines have proved successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms interval mapping based on maximum likelihood; (2) for a simulated dataset with a complicated growth curve generated by B-splines, Legendre polynomial-based Bayesian mapping cannot identify the designed QTLs accurately, even when higher-order Legendre polynomials are considered; and (3) for a simulated dataset based on Legendre polynomials, Bayesian B-spline mapping finds the same QTLs as those identified by the Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-splines in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.
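The flexibility of B-splines over global polynomials comes from the Cox-de Boor recursion: each basis function is non-zero only over a few knot spans, yet the basis still sums to one everywhere. A self-contained sketch of that recursion (not the authors' mapping code; the knot vector and degree are arbitrary illustrative choices):

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the i-th B-spline basis function
    of degree k at point t, for the given non-decreasing knot vector."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k] != knots[i]:  # guard against repeated (clamped) knots
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

# Clamped quadratic basis on [0, 3]: len(knots) - degree - 1 = 5 functions.
knots = [0, 0, 0, 1, 2, 3, 3, 3]
degree = 2
n_basis = len(knots) - degree - 1
values = [bspline_basis(i, degree, 1.5, knots) for i in range(n_basis)]
```

A time-varying QTL effect is then modelled as a weighted sum of these local basis functions, which is why B-splines can track a complicated growth curve where a single global Legendre polynomial cannot.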
Chaturvedi, Palak; Doerfler, Hannes; Jegadeesan, Sridharan; Ghatak, Arindam; Pressman, Etan; Castillejo, Maria Angeles; Wienkoop, Stefanie; Egelhofer, Volker; Firon, Nurit; Weckwerth, Wolfram
2015-11-06
Recently, we have developed a quantitative shotgun proteomics strategy called mass accuracy precursor alignment (MAPA). The MAPA algorithm uses high mass accuracy to bin mass-to-charge (m/z) ratios of precursor ions from LC-MS analyses, determines their intensities, and extracts a quantitative sample versus m/z ratio data alignment matrix from a multitude of samples. Here, we introduce a novel feature of this algorithm that allows the extraction and alignment of proteotypic peptide precursor ions or any other target peptide from complex shotgun proteomics data for accurate quantification of unique proteins. This strategy circumvents the problem of confusing the quantification of proteins due to indistinguishable protein isoforms by a typical shotgun proteomics approach. We applied this strategy to a comparison of control and heat-treated tomato pollen grains at two developmental stages, post-meiotic and mature. Pollen is a temperature-sensitive tissue involved in the reproductive cycle of plants and plays a major role in fruit setting and yield. By LC-MS-based shotgun proteomics, we identified more than 2000 proteins in total for all different tissues. By applying the targeted MAPA data-processing strategy, 51 unique proteins were identified as heat-treatment-responsive protein candidates. The potential function of the identified candidates in a specific developmental stage is discussed.
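The alignment step described above can be pictured as binning precursor m/z values at high mass accuracy and accumulating intensities per sample. A hypothetical sketch of that data structure (the bin width, sample names, and peak lists are invented; the actual MAPA algorithm is considerably more involved):

```python
def mapa_align(runs, bin_width=0.001):
    """Bin precursor m/z values and build a sample-versus-m/z-bin
    intensity matrix, the kind of alignment matrix a MAPA-style
    workflow extracts. `runs` maps a sample name to a list of
    (mz, intensity) pairs from its LC-MS analysis."""
    def to_bin(mz):
        return round(mz / bin_width)

    bins = sorted({to_bin(mz) for peaks in runs.values() for mz, _ in peaks})
    matrix = {sample: {b: 0.0 for b in bins} for sample in runs}
    for sample, peaks in runs.items():
        for mz, intensity in peaks:
            matrix[sample][to_bin(mz)] += intensity
    return bins, matrix

# Invented example: the same precursor measured at 500.4201 and 500.4203
# m/z in two runs falls into one shared bin; 500.4310 is a separate ion.
runs = {
    "control": [(500.4201, 1.2e6), (500.4310, 3.0e5)],
    "heat":    [(500.4203, 2.4e6)],
}
bins, matrix = mapa_align(runs)
```

Restricting the binned precursors to proteotypic peptides, as the abstract describes, is what lets the matrix quantify unique proteins rather than ambiguous isoform groups.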
Recent applications of gas chromatography with high-resolution mass spectrometry.
Špánik, Ivan; Machyňáková, Andrea
2018-01-01
Gas chromatography coupled to high-resolution mass spectrometry is a powerful analytical method that combines the excellent separation power of gas chromatography with improved identification based on accurate mass measurement. These features make gas chromatography with high-resolution mass spectrometry the first choice for identification and structure elucidation of unknown volatile and semi-volatile organic compounds. Quantitative analysis by gas chromatography with high-resolution mass spectrometry was previously focused on the determination of dioxins and related compounds using magnetic sector analyzers, a standing requirement of many international standards. The introduction of the quadrupole high-resolution time-of-flight mass analyzer broadened interest in this method, and novel applications were developed, especially for multi-target screening purposes. This review covers the development and the most interesting applications of gas chromatography coupled to high-resolution mass spectrometry for the analysis of environmental matrices, biological fluids, and food safety since 2010. Particular attention is paid to the various approaches and applications of gas chromatography coupled to high-resolution mass spectrometry for non-target screening to identify contaminants and to characterize the chemical composition of environmental, food, and biological samples. The most interesting quantitative applications, where a significant contribution of gas chromatography with high-resolution mass spectrometry over currently used methods is expected, are discussed as well. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Structural Characterization of Ginsenosides from Flower Buds of Panax ginseng by RRLC-Q-TOF MS.
Wu, Wei; Lu, Ziyan; Teng, Yaran; Guo, Yingying; Liu, Shuying
2016-02-01
Ginseng flower bud, a part of Panax ginseng, has received much attention as a valuable functional food with medicinal potential, yet few studies have systematically and comprehensively examined its major ingredients. This study aims to rapidly characterize ginsenosides in ginseng flower buds and to provide a scientific basis for developing functional food, exploiting pharmaceutical effects and making full use of ginseng resources. A rapid resolution liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (RRLC-Q-TOF-MS) method was developed for rapid qualitative and quantitative analysis of ginsenosides in ginseng flower buds. The compounds were identified by comparing retention times with those of reference standards, accurate mass measurement and the fragment ions obtained from RRLC-Q-TOF-MS/MS analyses. A total of 14 ginsenosides were identified, and 5 malonyl-ginsenosides were tentatively identified in ginseng flower buds for the first time. Ten main ginsenosides were quantitatively analyzed. The developed RRLC-Q-TOF-MS method was demonstrated to be an effective analytical means for rapid characterization of the ginsenosides in flower buds of P. ginseng. The results are valuable for quality control, assessment of authenticity and stability evaluation of ginseng flower buds. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Ferreira, Fabiano Guerra; Barbosa, Igor Bastos; Scelza, Pantaleo; Montagnana, Marcello Bulhões; Russano, Daniel; Neff, John; Scelza, Miriam Zaccaro
2017-09-28
The aim of this study was to undertake a qualitative and quantitative assessment of nanoscale alterations and wear on the surfaces of nickel-titanium (NiTi) endodontic instruments, before and after use, through a high-resolution, noncontact, three-dimensional optical profiler, and to verify the accuracy of the evaluation method. Cutting blade surfaces of two different brands of NiTi endodontic instruments, Reciproc R25 (n = 5) and WaveOne Primary (n = 5), were examined and compared before and after two uses in simulated root canals made in clear resin blocks. The analyses were performed on three-dimensional images obtained from surface areas measuring 211 × 211 µm, located 3 mm from the instrument tips. The quantitative evaluation of the samples was conducted before and after the first and second use, by recording three amplitude parameters. The data were subjected to statistical analysis at a 5% level of significance. The results revealed statistically significant increases in the surface wear of both instrument groups after the second use. Irregularities were found in the surface topography of all the instruments, before and after use. Regardless of the evaluation stage, most of the defects were observed in the WaveOne instruments. The three-dimensional technique was suitable and effective for the accurate investigation of the same instrument surfaces at different time points.
Direct quantification of rare earth doped titania nanoparticles in individual human cells
NASA Astrophysics Data System (ADS)
Jeynes, J. C. G.; Jeynes, C.; Palitsin, V.; Townley, H. E.
2016-07-01
There are many possible biomedical applications for titania nanoparticles (NPs) doped with rare earth elements (REEs), from dose enhancement and diagnostic imaging in radiotherapy, to biosensing. However, there are concerns that the NPs could disintegrate in the body thus releasing toxic REE ions to undesired locations. As a first step, we investigate how accurately the Ti/REE ratio from the NPs can be measured inside human cells. A quantitative analysis of whole, unsectioned, individual human cells was performed using proton microprobe elemental microscopy. This method is unique in being able to quantitatively analyse all the elements in an unsectioned individual cell with micron resolution, while also scanning large fields of view. We compared the Ti/REE signal inside cells to NPs that were outside the cells, non-specifically absorbed onto the polypropylene substrate. We show that the REE signal in individual cells co-localises with the titanium signal, indicating that the NPs have remained intact. Within the uncertainty of the measurement, there is no difference between the Ti/REE ratio inside and outside the cells. Interestingly, we also show that there is considerable variation in the uptake of the NPs from cell-to-cell, by a factor of more than 10. We conclude that the NPs enter the cells and remain intact. The large heterogeneity in NP concentrations from cell-to-cell should be considered if they are to be used therapeutically.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-19
... Review Draft. These two draft assessment documents describe the quantitative analyses the EPA is... NAAQS,\\3\\ the Agency is conducting quantitative assessments characterizing the: (1) Health risks... present the initial key results, observations, and related uncertainties associated with the quantitative...
Text mixing shapes the anatomy of rank-frequency distributions
NASA Astrophysics Data System (ADS)
Williams, Jake Ryland; Bagrow, James P.; Danforth, Christopher M.; Dodds, Peter Sheridan
2015-05-01
Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this "law" of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.
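Zipf's law as stated here is easy to check directly: rank the word counts and the product rank × frequency should stay roughly constant. A minimal sketch on an invented token list (real corpora, as the abstract notes, show breaks in this scaling that the text-mixing hypothesis explains):

```python
from collections import Counter

def rank_frequency(tokens):
    """Return word frequencies sorted into rank order (rank 1 = most
    frequent): the empirical distribution Zipf's law describes."""
    return sorted(Counter(tokens).values(), reverse=True)

# Invented corpus whose counts follow f(r) ~ C / r with C = 100.
tokens = ["the"] * 100 + ["of"] * 50 + ["and"] * 33 + ["to"] * 25
freqs = rank_frequency(tokens)

# For Zipfian data, rank * frequency is approximately constant.
products = [rank * f for rank, f in enumerate(freqs, start=1)]
```

On a mixed corpus assembled from many texts, a log-log plot of these products bends at a characteristic rank, which is the break in scaling the paper predicts from the decay of word introduction under text mixing.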
microRNA Expression Profiling: Technologies, Insights, and Prospects.
Roden, Christine; Mastriano, Stephen; Wang, Nayi; Lu, Jun
2015-01-01
Since the early days of microRNA (miRNA) research, miRNA expression profiling technologies have provided important tools toward both better understanding of the biological functions of miRNAs and using miRNA expression as potential diagnostics. Multiple technologies, such as microarrays, next-generation sequencing, bead-based detection system, single-molecule measurements, and quantitative RT-PCR, have enabled accurate quantification of miRNAs and the subsequent derivation of key insights into diverse biological processes. As a class of ~22 nt long small noncoding RNAs, miRNAs present unique challenges in expression profiling that require careful experimental design and data analyses. We will particularly discuss how normalization and the presence of miRNA isoforms can impact data interpretation. We will present one example in which the consideration in data normalization has provided insights that helped to establish the global miRNA expression as a tumor suppressor. Finally, we discuss two future prospects of using miRNA profiling technologies to understand single cell variability and derive new rules for the functions of miRNA isoforms.
Comprehensive two-dimensional gas chromatography for the analysis of Fischer-Tropsch oil products.
van der Westhuizen, Rina; Crous, Renier; de Villiers, André; Sandra, Pat
2010-12-24
The Fischer-Tropsch (FT) process involves a series of catalysed reactions of carbon monoxide and hydrogen, originating from coal, natural gas or biomass, leading to a variety of synthetic chemicals and fuels. The benefits of comprehensive two-dimensional gas chromatography (GC×GC) compared to one-dimensional GC (1D-GC) for the detailed investigation of the oil products of low and high temperature FT processes are presented. GC×GC provides more accurate quantitative data to construct Anderson-Schultz-Flory (ASF) selectivity models that correlate the FT product distribution with reaction variables. On the other hand, the high peak capacity and sensitivity of GC×GC allow the detailed study of components present at trace level. Analyses of the aromatic and oxygenated fractions of a high temperature FT (HT-FT) process are presented. GC×GC data have been used to optimise or tune the HT-FT process by using a lab-scale micro-FT-reactor. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
van Rensburg, L.; Claassens, S.; Bezuidenhout, J. J.; Jansen van Rensburg, P. J.
2009-03-01
The much publicised problem of major asbestos pollution and related health issues in South Africa has called for action to be taken to remedy the situation. The aim of this project was to establish a prioritisation index that would provide a scientifically based sequence in which polluted asbestos mines in Southern Africa ought to be rehabilitated. It was reasoned that a computerised database capable of calculating such a Rehabilitation Prioritisation Index (RPI) would be an improvement over the subjective, bias-prone selection used previously. The database was developed in Microsoft Access, and both quantitative and qualitative data were used for the calculation of the RPI value. The logical database structure consists of a number of mines, each comprising a number of dumps, for which a number of samples have been analysed to determine asbestos fibre content. For this system to remain accurate and relevant, the data in the database should be revalidated and updated on a regular basis.
Jager, N G L; Rosing, H; Linn, S C; Schellens, J H M; Beijnen, J H
2012-06-01
The antiestrogenic effect of tamoxifen is mainly attributable to the active metabolites endoxifen and 4-hydroxytamoxifen. This effect is assumed to be concentration-dependent, and quantitative analysis of tamoxifen and its metabolites for clinical studies and therapeutic drug monitoring is therefore increasing. We investigated the large discrepancies in reported mean endoxifen and 4-hydroxytamoxifen concentrations. Two published LC-MS/MS methods were used to analyse a set of 75 serum samples from patients treated with tamoxifen. The method of Teunissen et al. (J Chrom B, 879:1677-1685, 2011) separates endoxifen and 4-hydroxytamoxifen from other tamoxifen metabolites with similar masses and fragmentation patterns. The second method, published by Gjerde et al. (J Chrom A, 1082:6-14, 2005), however, lacks selectivity, resulting in 2-3-fold overestimation of the endoxifen and 4-hydroxytamoxifen levels, respectively. We emphasize the use of highly selective LC-MS/MS methods for the quantification of tamoxifen and its metabolites in biological samples.
Hwang Fu, Yu-Hsien; Huang, William Y C; Shen, Kuang; Groves, Jay T; Miller, Thomas; Shan, Shu-Ou
2017-07-28
The signal recognition particle (SRP) delivers ~30% of the proteome to the eukaryotic endoplasmic reticulum, or the bacterial plasma membrane. The precise mechanism by which the bacterial SRP receptor, FtsY, interacts with and is regulated at the target membrane remains unclear. Here, quantitative analysis of FtsY-lipid interactions at single-molecule resolution revealed a two-step mechanism in which FtsY initially contacts the membrane via a Dynamic mode, followed by an SRP-induced conformational transition to a Stable mode that activates FtsY for downstream steps. Importantly, mutational analyses revealed extensive auto-inhibitory mechanisms that prevent free FtsY from engaging the membrane in the Stable mode; an engineered FtsY pre-organized into the Stable mode led to indiscriminate targeting in vitro and disrupted FtsY function in vivo. Our results show that the two-step lipid-binding mechanism uncouples the membrane association of FtsY from its conformational activation, thus optimizing the balance between the efficiency and fidelity of co-translational protein targeting.
Keller, Andrew; Bader, Samuel L.; Shteynberg, David; Hood, Leroy; Moritz, Robert L.
2015-01-01
Proteomics by mass spectrometry technology is widely used for identifying and quantifying peptides and proteins. The breadth and sensitivity of peptide detection have been advanced by the advent of data-independent acquisition mass spectrometry. Analysis of such data, however, is challenging due to the complexity of fragment ion spectra that have contributions from multiple co-eluting precursor ions. We present SWATHProphet software that identifies and quantifies peptide fragment ion traces in data-independent acquisition data, provides accurate probabilities to ensure results are correct, and automatically detects and removes contributions to quantitation originating from interfering precursor ions. Integration in the widely used open source Trans-Proteomic Pipeline facilitates subsequent analyses such as combining results of multiple data sets together for improved discrimination using iProphet and inferring sample proteins using ProteinProphet. This novel development should greatly help make data-independent acquisition mass spectrometry accessible to large numbers of users. PMID:25713123
Small values in big data: The continuing need for appropriate metadata
Stow, Craig A.; Webster, Katherine E.; Wagner, Tyler; Lottig, Noah R.; Soranno, Patricia A.; Cha, YoonKyung
2018-01-01
Compiling data from disparate sources to address pressing ecological issues is increasingly common. Many ecological datasets contain left-censored data – observations below an analytical detection limit. Studies from single and typically small datasets show that common approaches for handling censored data — e.g., deletion or substituting fixed values — result in systematic biases. However, no studies have explored the degree to which the documentation and presence of censored data influence outcomes from large, multi-sourced datasets. We describe left-censored data in a lake water quality database assembled from 74 sources and illustrate the challenges of dealing with small values in big data, including detection limits that are absent, range widely, and show trends over time. We show that substitutions of censored data can also bias analyses using ‘big data’ datasets, that censored data can be effectively handled with modern quantitative approaches, but that such approaches rely on accurate metadata that describe treatment of censored data from each source.
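The substitution bias described above is easy to reproduce: simulate concentrations, censor observations below a detection limit, and compare summary statistics under the common fixed-value substitutions. A small simulation sketch (the lognormal parameters and detection limit are arbitrary illustrative choices, not values from the lake database):

```python
import random
import statistics

def substitute(values, detection_limit, fill):
    """Replace left-censored observations (below the detection limit)
    with a fixed fill value, as is commonly but problematically done."""
    return [v if v >= detection_limit else fill for v in values]

random.seed(42)
true_vals = [random.lognormvariate(0.0, 1.0) for _ in range(10_000)]
dl = 0.5  # arbitrary detection limit; roughly a quarter of values fall below

mean_true = statistics.mean(true_vals)
mean_zero = statistics.mean(substitute(true_vals, dl, 0.0))
mean_half = statistics.mean(substitute(true_vals, dl, dl / 2))
mean_dl   = statistics.mean(substitute(true_vals, dl, dl))
# Each substitution rule yields a systematically different mean, which is
# why the metadata must record how censored values were handled per source.
```

If different sources in a compiled dataset applied different rules (or none is documented), these offsets masquerade as real spatial or temporal differences, which is the core argument of the paper.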
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.
2011-08-01
Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
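The FFM linearization referred to above relies on the observation that if the precursor rate accelerates as rate ∝ (t_f − t)⁻¹, then the inverse rate falls linearly to zero at the failure time t_f, so a straight-line fit forecasts t_f as the x-intercept. A minimal sketch on noise-free synthetic data (the paper's point is precisely that real data violate the error assumptions of this fit, so a GLM should be used instead):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def ffm_forecast(times, rates):
    """Classic FFM: regress inverse rate on time and forecast failure
    at the x-intercept, where the fitted inverse rate reaches zero."""
    slope, intercept = fit_line(times, [1.0 / r for r in rates])
    return -intercept / slope

# Synthetic acceleration with an arbitrary true failure time t_f = 100:
# rate(t) = k / (t_f - t), so the inverse rate is exactly linear in t.
t_f, k = 100.0, 10.0
times = [float(t) for t in range(0, 100, 10)]
rates = [k / (t_f - t) for t in times]
forecast = ffm_forecast(times, rates)  # recovers 100.0 on clean data
```

With realistic count-type noise the inverse-rate transform distorts the error distribution, biasing this x-intercept; the GLM approach advocated in the paper fits the rate on its natural scale with an appropriate error family instead.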
Dias, Rafael Carlos Eloy; Valderrama, Patrícia; Março, Paulo Henrique; Dos Santos Scholz, Maria Brigida; Edelmann, Michael; Yeretzian, Chahan
2018-07-30
Chemical analyses and sensory evaluation are the most widely applied methods for quality control of roasted and ground (RG) coffee. However, faster alternatives would be highly valuable. Here, we applied Fourier-transform infrared photoacoustic spectroscopy (FTIR-PAS) to RG coffee powder. Mixtures of specific defective beans were blended with healthy (defect-free) Coffea arabica and Coffea canephora bases in specific ratios, forming different classes of blends. Principal Component Analysis allowed prediction of the amount/fraction and nature of the defects in blends, while Partial Least Squares Discriminant Analysis revealed similarities between blends (= samples). A successful predictive model was obtained using six classes of blends. The model could classify 100% of the samples into four classes, with specificities higher than 0.9. Application of FTIR-PAS to RG coffee to characterize and classify blends has proven to be an accurate, easy, quick and "green" alternative to current methods. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
Measurement of replication structures at the nanometer scale using super-resolution light microscopy
Baddeley, D.; Chagin, V. O.; Schermelleh, L.; Martin, S.; Pombo, A.; Carlton, P. M.; Gahl, A.; Domaing, P.; Birk, U.; Leonhardt, H.; Cremer, C.; Cardoso, M. C.
2010-01-01
DNA replication, similar to other cellular processes, occurs within dynamic macromolecular structures. Any comprehensive understanding ultimately requires quantitative data to establish and test models of genome duplication. We used two different super-resolution light microscopy techniques to directly measure and compare the size and numbers of replication foci in mammalian cells. This analysis showed that replication foci vary in size from 210 nm down to 40 nm. Remarkably, spatially modulated illumination (SMI) and 3D-structured illumination microscopy (3D-SIM) both showed an average size of 125 nm that was conserved throughout S-phase and independent of the labeling method, suggesting a basic unit of genome duplication. Interestingly, the improved optical 3D resolution identified 3- to 5-fold more distinct replication foci than previously reported. These results show that optical nanoscopy techniques enable accurate measurements of cellular structures at a level previously achieved only by electron microscopy and highlight the possibility of high-throughput, multispectral 3D analyses. PMID:19864256
Witte, K; Cameron, K A; Lapinski, M K; Nzyuko, S
1998-01-01
Print HIV/AIDS prevention campaign materials (e.g., posters, pamphlets, stickers) from 10 public health organizations in Kenya were evaluated according to the Extended Parallel Process Model (EPPM), a health behavior change theory based on the fear appeal literature, at various sites along the Trans-Africa Highway in Kenya. Three groups each of commercial sex workers (CSWs), truck drivers (TDs) and their assistants (ASSTs), and young men (YM) who live and work at the truck stops participated in focus group discussions in which reactions to the campaign materials were gathered within this theoretical framework. Reactions to the campaign materials varied substantially according to the poster or pamphlet viewed. Overall, most participants wanted more detailed information about (a) the proper way to use condoms, (b) ideas for how to negotiate condom use with reluctant partners, and (c) accurate information on the symptoms of AIDS and what to do once one has contracted HIV. Both quantitative and qualitative analyses of the campaign materials are reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manes, Nathan P.; Estep, Ryan D.; Mottaz, Heather M.
2008-03-07
Orthopoxviruses are the largest and most complex of the animal viruses. In response to the recent emergence of monkeypox in Africa and the threat of smallpox bioterrorism, virulent (monkeypox virus) and benign (vaccinia virus) orthopoxviruses were proteomically compared with the goal of identifying proteins required for pathogenesis. Orthopoxviruses were grown in HeLa cells to two different viral forms (intracellular mature virus and extracellular enveloped virus), purified by sucrose gradient ultracentrifugation, denatured using RapiGest™ surfactant, and digested with trypsin. Unfractionated samples and strong cation exchange HPLC fractions were analyzed by reversed-phase LC-MS/MS, and analyses of the MS/MS spectra using SEQUEST® and X! Tandem resulted in the identification of hundreds of monkeypox, vaccinia, and copurified host proteins. The unfractionated samples were additionally analyzed by LC-MS on an LTQ-Orbitrap™, and the accurate mass and elution time tag approach was used to perform quantitative comparisons. Possible pathophysiological roles of differentially expressed orthopoxvirus genes are discussed.
Poon, G K; Raynaud, F I; Mistry, P; Odell, D E; Kelland, L R; Harrap, K R; Barnard, C F; Murrer, B A
1995-09-29
Bis(acetato)amminedichloro(cyclohexylamine) platinum(IV) (JM216) is a new orally administered platinum complex with antitumor properties, and is currently undergoing phase II clinical trials. When JM216 was incubated with human plasma ultrafiltrate, 93% of the platinum species were protein-bound and 7% were unbound. The unbound platinum complexes in the ultrafiltrates of human plasma were analysed using a liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS) method. Apart from the parent drug, four metabolites were identified and characterised. These include JM118 [amminedichloro(cyclohexylamine) platinum(II)], JM383 [bis(acetato)ammine(cyclohexylamine)dihydroxo platinum(IV)] and the two isomers JM559 and JM518 [bis(acetato)amminechloro(cyclohexylamine) hydroxo platinum(IV)]. Their elemental compositions were determined by accurate mass measurement during the LC analysis, to confirm their identities. Quantitation of these metabolites by off-line LC atomic absorption spectroscopy demonstrated that JM118 is the major metabolite in plasma from patients receiving JM216 treatment.
The fundamental parameter method applied to X-ray fluorescence analysis with synchrotron radiation
NASA Astrophysics Data System (ADS)
Pantenburg, F. J.; Beier, T.; Hennrich, F.; Mommsen, H.
1992-05-01
Quantitative X-ray fluorescence analysis applying the fundamental parameter method is usually restricted to monochromatic excitation sources. It is shown here that such analyses can be performed as well with a white synchrotron radiation spectrum. To determine absolute elemental concentration values it is necessary to know the spectral distribution of this spectrum. A newly designed and tested experimental setup, which uses the synchrotron radiation emitted from electrons in a bending magnet of ELSA (the electron stretcher accelerator of the University of Bonn), is presented. The determination of the exciting spectrum, described by the given electron beam parameters, is limited by uncertainties in the vertical electron beam size and divergence. We describe a method which allows us to determine the relative and absolute spectral distributions needed for accurate analysis. First test measurements of different alloys and standards of known composition demonstrate that it is possible to determine accurate concentration values in bulk and trace element analysis.
SPH simulation of free surface flow over a sharp-crested weir
NASA Astrophysics Data System (ADS)
Ferrari, Angela
2010-03-01
In this paper the numerical simulation of a free surface flow over a sharp-crested weir is presented. Since in this case the usual shallow water assumptions are not satisfied, we propose to solve the problem using the full weakly compressible Navier-Stokes equations with the Tait equation of state for water. The numerical method used is the new meshless Smoothed Particle Hydrodynamics (SPH) formulation proposed by Ferrari et al. (2009) [8], which accurately tracks the free surface profile and provides monotone pressure fields. Thus, the unsteady evolution of the complex moving material interface (the free surface) can be properly resolved. The simulations, involving about half a million fluid particles, were run in parallel on two of the most powerful High Performance Computing (HPC) facilities in Europe. The results were validated by analysing the pressure field and comparing the free surface profiles obtained with the SPH scheme against experimental measurements available in the literature [18]. Very good quantitative agreement was obtained.
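The Tait equation of state that closes the weakly compressible system can be written down directly. In weakly compressible SPH the sound speed is an artificial value, typically set to roughly ten times the maximum expected flow speed so that density fluctuations stay near 1%; the value below is illustrative only.

```python
# Tait equation of state used in weakly compressible SPH for water:
#   p = B * ((rho/rho0)**gamma - 1), with gamma = 7 and stiffness
#   B = c0**2 * rho0 / gamma, where c0 is an artificial sound speed
# (illustrative value here) chosen to keep compressibility ~1%.
def tait_pressure(rho, rho0=1000.0, c0=40.0, gamma=7.0):
    B = c0 * c0 * rho0 / gamma
    return B * ((rho / rho0) ** gamma - 1.0)

print(tait_pressure(1000.0))  # reference density gives zero pressure
print(tait_pressure(1010.0))  # slight compression, positive pressure
print(tait_pressure(990.0))   # slight expansion, negative pressure
```

The stiff exponent gamma = 7 makes pressure extremely sensitive to density, which is what keeps the flow nearly incompressible while allowing an explicit time integration.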
Predicting the size of individual and group differences on speeded cognitive tasks.
Chen, Jing; Hale, Sandra; Myerson, Joel
2007-06-01
An a priori test of the difference engine model (Myerson, Hale, Zheng, Jenkins, & Widaman, 2003) was conducted using a large, diverse sample of individuals who performed three speeded verbal tasks and three speeded visuospatial tasks. Results demonstrated that, as predicted by the model, the group standard deviation (SD) on any task was proportional to the amount of processing required by that task. Both individual performances and those of fast and slow subgroups could be accurately predicted by the model using no free parameters, just an individual's or subgroup's mean z-score and the values of theoretical constructs estimated from fits to the group SDs. Taken together, these results are consistent with the post hoc analyses reported by Myerson et al. and provide even stronger supporting evidence. In particular, the ability to make quantitative predictions without using any free parameters provides the clearest demonstration to date of the power of an analytic approach based on the difference engine.
Physically-based in silico light sheet microscopy for visualizing fluorescent brain models
2015-01-01
Background We present a physically-based computational model of the light sheet fluorescence microscope (LSFM). Based on Monte Carlo ray tracing and geometric optics, our method simulates the operational aspects and image formation process of the LSFM. This simulated, in silico LSFM creates synthetic images of digital fluorescent specimens that can resemble those generated by a real LSFM, as opposed to established visualization methods that produce merely visually-plausible images. We also propose an accurate fluorescence rendering model which takes into account the intrinsic characteristics of fluorescent dyes to simulate the light interaction with fluorescent biological specimens. Results We demonstrate first results of our visualization pipeline applied to a simplified brain tissue model reconstructed from the somatosensory cortex of a young rat. The modeling aspects of the LSFM units are qualitatively analysed, and the results of the fluorescence model were quantitatively validated against the fluorescence brightness equation and the characteristic emission spectra of different fluorescent dyes. PMID:26329404
Borakati, Aditya; Razack, Abdul; Cawthorne, Chris; Roy, Rajarshi; Usmani, Sharjeel; Ahmed, Najeeb
2018-07-01
This study aims to assess the correlation between PET/CT and endoscopic ultrasound (EUS) parameters in patients with oesophageal cancer. All patients who had complete PET/CT and EUS staging performed for oesophageal cancer at our centre between 2010 and 2016 were included. Images were retrieved and analysed for a range of parameters, including tumour length, volume and position relative to the aortic arch. Seventy patients were included in the main analysis. A strong correlation was found between EUS and PET/CT in tumour length, volume and position relative to the aortic arch. Regression modelling showed a reasonable predictive value for PET/CT in calculating EUS parameters, with r higher than 0.585 in some cases. Given the strong correlation between EUS and PET parameters, fluorine-18 fluorodeoxyglucose ((18)F-FDG) PET can provide accurate information on the length and volume of the tumour in patients who either cannot tolerate EUS or have impassable strictures.
Text mixing shapes the anatomy of rank-frequency distributions.
Williams, Jake Ryland; Bagrow, James P; Danforth, Christopher M; Dodds, Peter Sheridan
2015-05-01
Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this "law" of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.
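Computing a rank-frequency distribution from raw text is the starting point of such analyses; a minimal sketch (toy corpus, illustration only):

```python
from collections import Counter

def rank_frequency(text):
    """Return (rank, frequency) pairs for the words of a text,
    rank 1 being the most frequent word."""
    counts = Counter(text.split())
    freqs = sorted(counts.values(), reverse=True)
    return list(enumerate(freqs, start=1))

# Under Zipf's law, f(r) ~ C/r, so the product r * f(r) is roughly
# constant across ranks in a sufficiently large corpus; a toy text
# only illustrates the bookkeeping, not the law itself.
text = "the cat sat on the mat the cat ran"
for rank, freq in rank_frequency(text):
    print(rank, freq)
```

On a real corpus one would plot log-frequency against log-rank; the two scaling regimes discussed in the abstract appear as a break in the slope of that plot.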
Analysing black phosphorus transistors using an analytic Schottky barrier MOSFET model.
Penumatcha, Ashish V; Salazar, Ramon B; Appenzeller, Joerg
2015-11-13
Owing to the difficulties associated with substitutional doping of low-dimensional nanomaterials, most field-effect transistors built from carbon nanotubes, two-dimensional crystals and other low-dimensional channels are Schottky barrier MOSFETs (metal-oxide-semiconductor field-effect transistors). The transmission through a Schottky barrier-MOSFET is dominated by the gate-dependent transmission through the Schottky barriers at the metal-to-channel interfaces. This makes the use of conventional transistor models highly inappropriate and has frequently led researchers in the past to extract incorrect intrinsic properties, for example, mobility, for many novel nanomaterials. Here we propose a simple modelling approach to quantitatively describe the transfer characteristics of Schottky barrier-MOSFETs from ultra-thin body materials accurately in the device off-state. In particular, after validating the model through the analysis of a set of ultra-thin silicon field-effect transistor data, we have successfully applied our approach to extract Schottky barrier heights for electrons and holes in black phosphorus devices for a large range of body thicknesses.
Analysing black phosphorus transistors using an analytic Schottky barrier MOSFET model
Penumatcha, Ashish V.; Salazar, Ramon B.; Appenzeller, Joerg
2015-01-01
Owing to the difficulties associated with substitutional doping of low-dimensional nanomaterials, most field-effect transistors built from carbon nanotubes, two-dimensional crystals and other low-dimensional channels are Schottky barrier MOSFETs (metal-oxide-semiconductor field-effect transistors). The transmission through a Schottky barrier-MOSFET is dominated by the gate-dependent transmission through the Schottky barriers at the metal-to-channel interfaces. This makes the use of conventional transistor models highly inappropriate and has frequently led researchers in the past to extract incorrect intrinsic properties, for example, mobility, for many novel nanomaterials. Here we propose a simple modelling approach to quantitatively describe the transfer characteristics of Schottky barrier-MOSFETs from ultra-thin body materials accurately in the device off-state. In particular, after validating the model through the analysis of a set of ultra-thin silicon field-effect transistor data, we have successfully applied our approach to extract Schottky barrier heights for electrons and holes in black phosphorus devices for a large range of body thicknesses. PMID:26563458
Evaluation of spatial filtering on the accuracy of wheat area estimate
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Moreira, M. A.; Chen, S. C.; Delima, A. M.
1982-01-01
A 3 x 3 pixel spatial filter was applied after wheat classification to evaluate the effect of postclassification filtering on the accuracy of area estimation using single-pass LANDSAT digital data. Quantitative analyses were carried out in five test sites (approx. 40 sq km each), and t tests showed that filtering with threshold values significantly decreased errors of commission and omission. In area estimation, filtering reduced the overestimate from 4.5% to 2.7%, and the root-mean-square error decreased from 126.18 ha to 107.02 ha. Extrapolating the same procedure of automatic classification with postclassification spatial filtering to the whole study area improved the area estimate from an overestimate of 10.9% to 9.7%. It is concluded that when single-pass LANDSAT data are used for crop identification and area estimation, postclassification with a spatial filter provides a more accurate area estimate by reducing classification errors.
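A 3 x 3 postclassification filter of this kind is typically a majority (modal) filter with a vote threshold; the abstract does not state the exact rule, so `min_votes` below is an assumption for illustration. Each interior pixel is relabelled to the modal class of its 3 x 3 neighbourhood when that class wins at least the threshold number of the nine votes.

```python
import numpy as np
from collections import Counter

def majority_filter(labels, min_votes=5):
    """Generic 3x3 post-classification majority filter: relabel a
    pixel to the modal class of its 3x3 neighbourhood when that class
    reaches min_votes (a hypothetical threshold), which smooths
    isolated misclassified pixels. Border pixels are left unchanged."""
    out = labels.copy()
    rows, cols = labels.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = labels[i - 1:i + 2, j - 1:j + 2].ravel()
            cls, votes = Counter(window.tolist()).most_common(1)[0]
            if votes >= min_votes:
                out[i, j] = cls
    return out

# A lone non-wheat pixel (class 1) inside a wheat field (class 0)
# is removed by the filter.
field = np.zeros((5, 5), dtype=int)
field[2, 2] = 1
print(majority_filter(field)[2, 2])  # -> 0
```

Reading the neighbourhood from the original `labels` array (rather than `out`) keeps the filter a single, order-independent pass.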
NASA Astrophysics Data System (ADS)
Veryovkin, I. V.; Calaway, W. F.; Tripa, C. E.; Pellin, M. J.; Burnett, D. S.
2005-12-01
A new secondary neutral mass spectrometry (SNMS) instrument implementing laser post-ionization (LPI) of ion-sputtered and laser-desorbed neutral species has been developed and constructed for the specific purpose of quantitative analysis of metallic elements at ultra-trace levels in solar wind collector samples returned to Earth by the Genesis Discovery mission. The first LPI SNMS measurements are focusing on determining Al, Ca, Cr, and Mg in these samples. These measurements provide the first concentration and isotopic abundance determinations for several key metallic elements and also elucidate possible fractionation effects between the photosphere and the solar wind compositions. It is now documented that Genesis samples suffered surface contamination both during flight and during the breach of the Sample Return Capsule when it crashed. Since accurate quantitative analysis is compromised by sample contamination, several features have been built into the new LPI SNMS instrument to mitigate this difficulty. A normally incident, low-energy (<500 eV) ion beam combined with a keV-energy ion beam and a desorbing laser beam (both microfocused) enables dual-beam analyses. The low-energy ion beam can be used to remove surface contaminants by sputtering with minimal ion beam mixing. This low-energy beam also will be used to perform ion beam milling, while either the microfocused ion or laser beam probes the solar wind elemental compositions as a function of sample depth. Because of the high depth resolution of dual-beam analyses, such depth profiles clearly distinguish between surface contaminants and solar wind implanted atoms. In addition, in-situ optical and electron beam imaging for observing and avoiding particulates and scratches on solar wind sample surfaces is incorporated in the new LPI SNMS instrument to further reduce quantification problems. The current status of instrument tests and analyses will be presented. This work is supported by the U.S. Department of Energy, BES-Materials Sciences, under Contract W-31-109-ENG-38, and by NASA under Work Orders W-19,895 and W-10,091.
Kumpel, Belinda; Hazell, Matthew; Guest, Alan; Dixey, Jonathan; Mushens, Rosey; Bishop, Debbie; Wreford-Bush, Tim; Lee, Edmond
2014-05-01
Quantitation of fetomaternal hemorrhage (FMH) is performed to determine the dose of prophylactic anti-D (RhIG) required to prevent D immunization of D- women. Flow cytometry (FC) is the most accurate method. However, maternal white blood cells (WBCs) can give high background by binding anti-D nonspecifically, compromising accuracy. Maternal blood samples (n = 69) were sent for FC quantitation of FMH after positive Kleihauer-Betke test (KBT) analysis and RhIG administration. The reagents used were BRAD-3-fluorescein isothiocyanate (FITC; anti-D), AEVZ5.3-FITC (anti-varicella zoster [anti-VZ], negative control), anti-fetal hemoglobin (HbF)-FITC, and the blended two-color reagents BRAD-3-FITC/anti-CD45-phycoerythrin (PE; anti-D/L) and BRAD-3-FITC/anti-CD66b-PE (anti-D/G). PE-positive WBCs were eliminated from analysis by gating. Full blood counts were performed on maternal samples and female donors. Elevated numbers of neutrophils were present in 80% of patients. Red blood cell (RBC) indices varied widely in maternal blood. D+ FMH values obtained with anti-D/L, anti-D/G, and anti-HbF-FITC were very similar (r = 0.99, p < 0.001). Correlation between KBT and anti-HbF-FITC FMH results was low (r = 0.716). Inaccurate FMH quantitation using the current method (anti-D minus anti-VZ) occurred in 71% of samples having less than 15 mL of D+ FMH (RBCs), and insufficient RhIG was calculated for 9%. Using the two-color reagents and anti-HbF-FITC, approximately 30% of patients had elevated F cells, 26% had no fetal cells, 6% had D- FMH, 26% had 4 to 15 mL of D+ FMH, and 12% had more than 15 mL of D+ FMH (RBCs), requiring more than 300 μg of RhIG. Without accurate quantitation of D+ FMH by FC, some women would receive inappropriate or inadequate anti-D prophylaxis. The latter may be at risk of immunization leading to hemolytic disease of the newborn. © 2013 American Association of Blood Banks.
Quantitative MAS NMR characterization of the LiMn(1/2)Ni(1/2)O(2) electrode/electrolyte interphase.
Cuisinier, M; Martin, J F; Moreau, P; Epicier, T; Kanno, R; Guyomard, D; Dupré, N
2012-04-01
The conditions in which degradation processes occur at the positive electrode/electrolyte interface are still incompletely understood, and traditional surface analytical techniques struggle to accurately characterize and depict interfacial films. In the present work, information on the growth and evolution of the interphases upon storage and cycling, as well as their electrochemical consequences, is gathered for LiNi(1/2)Mn(1/2)O(2) with the commonly used LiPF(6) (1 M in EC/DMC) electrolyte. The use of (7)Li, (19)F and (31)P MAS NMR, made quantitative through the implementation of an empirical calibration, is combined with transmission electron microscopy (TEM) and electron energy loss spectroscopy (EELS) to probe the elements involved in surface species and to unravel the inhomogeneous architecture of the interphase. At room temperature, contact with the electrolyte leads to a covering of the oxide surface first by LiF, and lithiated organic species are found on the outer part of the interphase. At 55°C, the interphase not only covers more of the surface but also thickens, resulting in a 240% increase in lithiated species and the presence of -POF(2) fluorophosphates. The composition gradient within the interphase depth is also strongly affected by temperature. In agreement with the electrochemical performance, quantitative NMR surface analyses show that the use of a LiBOB-modified electrolyte results in a Li-enriched interphase, intrinsically less resistive than the standard LiPF(6)-based interphase, which is composed of a mixture of resistive LiF and non-lithiated species. Copyright © 2011 Elsevier Inc. All rights reserved.
Two imaging techniques for 3D quantification of pre-cementation space for CAD/CAM crowns.
Rungruanganunt, Patchanee; Kelly, J Robert; Adams, Douglas J
2010-12-01
Internal three-dimensional (3D) "fit" of prostheses to prepared teeth is likely more important clinically than "fit" judged only at the level of the margin (i.e. marginal "opening"). This work evaluates two techniques for quantitatively defining 3D "fit", both using pre-cementation space impressions: X-ray microcomputed tomography (micro-CT) and quantitative optical analysis. Both techniques are of interest for comparison of CAD/CAM system capabilities and for documenting "fit" as part of clinical studies. Pre-cementation space impressions were taken of a single zirconia coping on its die using a low viscosity poly(vinyl siloxane) impression material. Calibration specimens of this material were fabricated between the measuring platens of a micrometre. Both calibration curves and pre-cementation space impression data sets were obtained by examination using micro-CT and quantitative optical analysis. Regression analysis was used to compare calibration curves with calibration sets. Micro-CT calibration data showed tighter 95% confidence intervals and was able to measure over a wider thickness range than for the optical technique. Regions of interest (e.g., lingual, cervical) were more easily analysed with optical image analysis and this technique was more suitable for extremely thin impression walls (<10-15μm). Specimen preparation is easier for micro-CT and segmentation parameters appeared to capture dimensions accurately. Both micro-CT and the optical method can be used to quantify the thickness of pre-cementation space impressions. Each has advantages and limitations but either technique has the potential for use as part of clinical studies or CAD/CAM protocol optimization. Copyright © 2010 Elsevier Ltd. All rights reserved.
Li, Xiao-jun; Yi, Eugene C; Kemp, Christopher J; Zhang, Hui; Aebersold, Ruedi
2005-09-01
There is an increasing interest in the quantitative proteomic measurement of the protein contents of substantially similar biological samples, e.g. for the analysis of cellular response to perturbations over time or for the discovery of protein biomarkers from clinical samples. Technical limitations of current proteomic platforms, such as limited reproducibility and low throughput, make this a challenging task. A new LC-MS-based platform is able to generate complex peptide patterns from the analysis of proteolyzed protein samples at high throughput and represents a promising approach for quantitative proteomics. A crucial component of the LC-MS approach is the accurate evaluation of the abundance of detected peptides over many samples and the identification of peptide features that can stratify samples with respect to their genetic, physiological, or environmental origins. We present here a new software suite, SpecArray, that generates a peptide-versus-sample array from a set of LC-MS data. A peptide array stores the relative abundance of thousands of peptide features in many samples and is in a format identical to that of a gene expression microarray. A peptide array can be subjected to an unsupervised clustering analysis to stratify samples or to a discriminant analysis to identify discriminatory peptide features. We applied SpecArray to analyze two sets of LC-MS data: one from four repeat LC-MS analyses of the same glycopeptide sample, and the other from LC-MS analyses of serum samples from five male and five female mice. We demonstrate through these two case studies that the SpecArray software suite can serve as an effective software platform in the LC-MS approach for quantitative proteomics.
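The SpecArray output is a peptide-by-sample abundance matrix in microarray format. A minimal sketch of unsupervised stratification on such a matrix (synthetic data and a nearest-neighbour correlation check, not the SpecArray algorithm itself) is:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical peptide array: rows = 500 peptide features, columns =
# 10 samples (cols 0-4 "group 1", cols 5-9 "group 2"), with a disjoint
# subset of discriminatory peptides shifted in each group.
X = rng.standard_normal((500, 10))
X[:50, 5:] += 3.0     # 50 peptides elevated in group 2
X[50:100, :5] += 3.0  # 50 different peptides elevated in group 1

# Unsupervised check: does each sample's most-correlated neighbour
# belong to its own group?
C = np.corrcoef(X.T)
np.fill_diagonal(C, -np.inf)   # ignore self-correlation
nearest = C.argmax(axis=1)
same_group = [(i < 5) == (j < 5) for i, j in enumerate(nearest)]
print(all(same_group))  # -> True with this effect size
```

With a real peptide array one would feed the same matrix to hierarchical clustering or a discriminant analysis, exactly as the abstract describes.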
Hunter, Margaret E; Dorazio, Robert M; Butterfield, John S S; Meigs-Friend, Gaia; Nico, Leo G; Ferrante, Jason A
2017-03-01
A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low-concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species' presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty, indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications, such as GMO food analysis and forensic and clinical diagnostics, could also benefit from a standardized LOD. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
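The paper proposes a formal statistical LOD model; a much simpler operational definition often used alongside such models (the lowest standard concentration detected in at least 95% of replicates) can be sketched as follows, with made-up replicate counts for illustration:

```python
# Operational qPCR limit of detection (LOD): the lowest standard
# concentration at which at least 95% of replicates amplify. The
# replicate data below are invented for illustration; the paper's
# own LOD model is a formal statistical treatment of the
# concentration plateau, not reproduced here.
def empirical_lod(detections, threshold=0.95):
    """detections: {concentration: (positive replicates, total replicates)}"""
    passing = [conc for conc, (pos, n) in detections.items()
               if pos / n >= threshold]
    return min(passing) if passing else None

replicates = {
    1000.0: (24, 24),   # copies per reaction: (positives, replicates)
    100.0: (24, 24),
    10.0: (23, 24),     # 95.8% detection, still above threshold
    1.0: (11, 24),      # below threshold, excluded
}
print(empirical_lod(replicates))  # -> 10.0
```

Measurements below the LOD would then be censored or down-weighted rather than taken at face value, which is the behaviour the abstract's correction formalizes.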
How important is aspirin adherence when evaluating effectiveness of low-dose aspirin?
Navaratnam, Kate; Alfirevic, Zarko; Pirmohamed, Munir; Alfirevic, Ana
2017-12-01
Low-dose aspirin (LDA) is advocated for women at high risk of pre-eclampsia, providing a modest, 10%, reduction in risk. Cardiology meta-analyses demonstrate an 18% reduction in serious vascular events with LDA. Non-responsiveness to aspirin (sometimes termed aspirin resistance) and variable clinical effectiveness are often attributed to suboptimal adherence. The aim of this review was to identify the scope of adherence assessments in RCTs evaluating aspirin effectiveness in cardiology and obstetrics, and to discuss the quality of information provided by current methods. We searched MEDLINE, EMBASE and the Cochrane Library, limited to humans and English language, for RCTs evaluating aspirin in cardiology (14/03/13-13/03/16) and in pregnancy (1957-13/03/16). The search terms 'aspirin' and 'acetylsalicylic acid' appearing adjacent to 'myocardial infarction' or to 'pregnancy', 'pregnant' or 'obstetric' were used. 38% (25/68) of obstetric and 32% (20/62) of cardiology RCTs assessed aspirin adherence, and 24% (6/25) and 29% (6/21) of obstetric and cardiology RCTs, respectively, defined acceptable adherence. Semi-quantitative methods (pill counts, medication weighing) prevailed in obstetric RCTs (93%), whereas qualitative methods (interviews, questionnaires) were more frequent in cardiology (67%). Two obstetric RCTs quantified serum thromboxane B2 and salicylic acid, but no quantitative methods were used in cardiology. Aspirin has proven efficacy, but suboptimal adherence is widespread and difficult to quantify accurately. Little is currently known about aspirin adherence in pregnancy. RCTs evaluating aspirin effectiveness show over-reliance on qualitative adherence assessments vulnerable to inherent inaccuracies. Reliable adherence data are important to assess and optimise the clinical effectiveness of LDA. We propose that adherence should be formally assessed in future trials and that the development of quantitative assessments may prove valuable for trial protocols. Copyright © 2017 Elsevier B.V. All rights reserved.
Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice
2011-09-01
Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of the data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals or under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following characterization by flow cytometric analysis of the cell populations obtained from blood through a density gradient procedure, three popular software packages were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby
2017-01-01
Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test-set root mean squared error of prediction (RMSEP) in almost all cases. The sub-model method, using partial least squares (PLS) regression, is part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.
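The blending idea can be sketched generically: train sub-models on restricted composition ranges, then mix their predictions using a full-range model's estimate as the reference value. In the sketch below, ordinary least squares stands in for PLS, the data are synthetic, and the range boundaries are hypothetical; it illustrates the structure of the method, not the ChemCam calibration itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for LIBS calibration targets: 200 "spectra" of 50
# channels, with "compositions" y clipped to 0-100 wt.% so that no
# single linear model fits the whole range well.
n, p = 200, 50
X = rng.standard_normal((n, p))
w = rng.standard_normal(p)
y = np.clip(X @ w * 5 + 50, 0, 100)

def fit(Xs, ys):
    coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(Xs)), Xs], ys, rcond=None)
    return coef

def predict(coef, Xs):
    return np.c_[np.ones(len(Xs)), Xs] @ coef

full = fit(X, y)                    # full-range reference model
low = fit(X[y < 60], y[y < 60])     # sub-model for low compositions
high = fit(X[y >= 40], y[y >= 40])  # sub-model for high compositions

def blended(Xs, lo=40.0, hi=60.0):
    """Blend sub-model outputs using the full model's estimate:
    below lo use the low model, above hi the high model, and ramp
    linearly between them in the overlap region."""
    ref = predict(full, Xs)
    t = np.clip((ref - lo) / (hi - lo), 0.0, 1.0)
    return (1 - t) * predict(low, Xs) + t * predict(high, Xs)

pred = blended(X)
rmsep = np.sqrt(np.mean((pred - y) ** 2))
print(f"blended RMSE on the synthetic set: {rmsep:.2f}")
```

Overlapping the sub-model training ranges (here 40-60 wt.%) avoids discontinuities at the hand-off between models, which is why the blend ramps rather than switches.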
Dolled-Filhart, Marisa P; Gustavson, Mark D
2012-11-01
Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.
2015-01-01
Changes in glycosylation have been shown to have a profound correlation with development/malignancy in many cancer types. Currently, two major enrichment techniques have been widely applied in glycoproteomics, namely, lectin affinity chromatography (LAC)-based and hydrazide chemistry (HC)-based enrichment. Here we report LC–MS/MS quantitative analyses of human blood serum glycoproteins and glycopeptides associated with esophageal diseases by LAC- and HC-based enrichment. Separate and complementary qualitative and quantitative analyses of protein glycosylation were performed using both enrichment techniques. Chemometric and statistical evaluations (PCA plots and ANOVA tests, respectively) were employed to determine and confirm candidate cancer-associated glycoprotein/glycopeptide biomarkers. Of 139 glycoproteins, 59 (a 42% overlap) were observed with both enrichment techniques. This overlap is very similar to previously published studies. The quantitation and evaluation of significantly changed glycoproteins/glycopeptides are complementary between LAC and HC enrichments. LC–ESI–MS/MS analyses indicated that 7 glycoproteins enriched by LAC and 11 glycoproteins enriched by HC showed significantly different abundances between disease-free and disease cohorts. Multiple reaction monitoring quantitation found 13 glycopeptides by LAC enrichment and 10 glycosylation sites by HC enrichment to be statistically different among disease cohorts. PMID:25134008
Lee, Wai-Yung; Nakamura, Shin-Ichi; Chung, Moon Ja; Chun, Young Ju; Fu, Meng; Liang, Shu-Chuan; Liu, Cui-Lian
2013-09-01
The purpose of this study was to explore variations in how contemporary couples from five different Asian regions negotiate disagreements. Video recordings of 50 couples (10 each from Japan, Korea, Mainland China, Taiwan, and Hong Kong) discussing unresolved disagreements provided raw data for quantitative and qualitative analyses. First, teams of coders from each region used a common protocol to make quantitative ratings of content themes and interaction patterns for couples from their own region. An interregional panel of investigators then performed in-depth qualitative reviews for half of these cases, noting cultural differences not only in observed patterns of couple behavior but also in their own perceptions of these patterns. Both quantitative and qualitative analyses revealed clear regional differences on dimensions such as overt negativity, demand-withdraw interaction, and collaboration. The qualitative results also provided a richer, more nuanced view of other (e.g., gender-linked) conflict management patterns that the quantitative analyses did not capture. Inconsistencies between qualitative and quantitative data and between the qualitative observations of investigators from different regions were most pronounced for couples from Korea and Japan, whose conflict styles were subtler and less direct than those of couples from the other regions. © FPI, Inc.
Quantitative EPMA of Nano-Phase Iron-Silicides in Apollo 16 Lunar Regolith
NASA Astrophysics Data System (ADS)
Gopon, P.; Fournelle, J.; Valley, J. W.; Pinard, P. T.; Sobol, P.; Horn, W.; Spicuzza, M.; Llovet, X.; Richter, S.
2013-12-01
Until recently, quantitative EPMA of phases under a few microns in size has been extremely difficult. In order to achieve analytical volumes small enough to analyze sub-micron features, accelerating voltages between 5 and 8 keV need to be used. At these voltages the normally used K X-ray transitions (of higher-Z elements) are no longer excited, and we must rely on outer-shell transitions (L and M). These outer-shell transitions are difficult to use for quantitative EPMA because they are strongly affected by different bonding environments, the error associated with their mass attenuation coefficients (MAC), and their proximity to absorption edges. These problems are especially prevalent for the transition metals because of the unfilled M5 electron shell where the Lα transition originates. Previous studies have tried to overcome these limitations by using standards that almost exactly matched their unknowns. This, however, is cumbersome and requires accurate prior knowledge of the composition of the sample, as well as an exorbitant number of well-characterized standards. Using a 5 keV electron beam and utilizing non-standard X-ray transitions (Ll) for the transition metals, we are able to conduct accurate quantitative analyses of phases down to ~300 nm. The Ll transition in the transition metals behaves more like a core-state transition and, unlike the Lα/β lines, is unaffected by bonding effects and does not lie near an absorption edge. This allows for quantitative analysis using standards that do not have to exactly match the unknown; in our case, pure metal standards were used for all elements except phosphorus. We present here data on iron-silicides in two Apollo 16 regolith grains. These plagioclase grains (A6-7 and A6-8) were collected between North and South Ray Craters, in the lunar highlands, and thus are associated with one or more large impact events. We report the presence of carbon, nickel, and phosphorus (in order of abundance) in these iron-silicide phases.
Although carbon is an especially difficult measurement (with contamination from the lab environment, sample, and vacuum system being a large problem), we found that the iron-silicide phases contain a few weight percent carbon. X-ray mapping shows carbon to be concentrated within the silicide blebs. We conducted reference (i.e., baseline) carbon measurements on standards mounted in the same block as the sample to establish a contamination baseline; any carbon measured above this baseline was then assumed to be real. This finding seems to indicate that while the iron-silicide phases formed under the reducing conditions of the lunar surface, those conditions alone were not sufficient to form the phases; the presence of carbon was needed to drive the system to the much more reducing conditions where native silicon is stable. The source of the carbon and nickel found in the iron-silicides is most likely from an impactor, rather than from the lunar surface.
NASA Technical Reports Server (NTRS)
Green, Sheldon; Boissoles, J.; Boulet, C.
1988-01-01
The first accurate theoretical values for off-diagonal (i.e., line-coupling) pressure-broadening cross sections are presented. Calculations were done for CO perturbed by He at thermal collision energies using an accurate ab initio potential energy surface. Converged close coupling, i.e., numerically exact values, were obtained for coupling to the R(0) and R(2) lines. These were used to test the coupled states (CS) and infinite order sudden (IOS) approximate scattering methods. CS was found to be of quantitative accuracy (a few percent) and has been used to obtain coupling values for lines to R(10). IOS values are less accurate, but, owing to their simplicity, may nonetheless prove useful as has been recently demonstrated.
An accurate method of extracting fat droplets in liver images for quantitative evaluation
NASA Astrophysics Data System (ADS)
Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie
2015-03-01
Steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using the feature values of colors, shapes, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.
Nielsen, Signe Smith; Hempler, Nana Folmann; Krasnik, Allan
2013-01-01
The relationship between migration and health is complex, yet immigrant-related inequalities in health are largely influenced by socioeconomic position. Drawing upon previous findings, this paper discusses issues to consider when measuring and applying socioeconomic position in quantitative immigrant health research. When measuring socioeconomic position, it is important to be aware of four aspects: (1) there is a lack of clarity about how socioeconomic position should be measured; (2) different types of socioeconomic position may be relevant to immigrants compared with the native-born population; (3) choices of measures of socioeconomic position in quantitative analyses often rely on data availability; and (4) different measures of socioeconomic position have different effects in population groups. Therefore, caution should be used in the collection, presentation, analyses, and interpretation of data, and researchers need to display their proposed conceptual models and data limitations as well as apply different approaches for analyses. PMID:24287857
Autelitano, François; Loyaux, Denis; Roudières, Sébastien; Déon, Catherine; Guette, Frédérique; Fabre, Philippe; Ping, Qinggong; Wang, Su; Auvergne, Romane; Badarinarayana, Vasudeo; Smith, Michael; Guillemot, Jean-Claude; Goldman, Steven A.; Natesan, Sridaran; Ferrara, Pascual; August, Paul
2014-01-01
Glioblastoma multiforme (GBM) remains a clinical indication with significant “unmet medical need”. Innovative new therapies to eliminate residual tumor cells and prevent tumor recurrence are critically needed for this deadly disease. A major challenge of GBM research has been the identification of novel molecular therapeutic targets and accurate diagnostic/prognostic biomarkers. Many of the current clinical therapeutic targets of immunotoxins and ligand-directed toxins for high-grade glioma (HGG) cells are surface sialylated glycoproteins. Therefore, methods that systematically and quantitatively analyze cell surface sialoglycoproteins in human clinical tumor samples would be useful for the identification of potential diagnostic markers and therapeutic targets for malignant gliomas. In this study, we used the bioorthogonal chemical reporter strategy (BOCR) in combination with label-free quantitative mass spectrometry (LFQ-MS) to characterize and accurately quantify the individual cell surface sialoproteome in human GBM tissues, in fetal and adult human astrocytes, and in human neural progenitor cells (NPCs). We identified and quantified a total of 843 proteins, including 801 glycoproteins. Among the 843 proteins, 606 (72%) are known cell surface or secreted glycoproteins, including 156 CD-antigens, all major classes of cell surface receptor proteins, transporters, and adhesion proteins. Our findings identified several known as well as new cell surface antigens whose expression is predominantly restricted to human GBM tumors, as confirmed by microarray transcription profiling, quantitative RT-PCR and immunohistochemical staining. This report presents the comprehensive identification of new biomarkers and therapeutic targets for the treatment of malignant gliomas using quantitative sialoglycoproteomics with clinically relevant, patient-derived primary glioma cells. PMID:25360666
In silico method for modelling metabolism and gene product expression at genome scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lerman, Joshua A.; Hyduke, Daniel R.; Latif, Haythem
2012-07-03
Transcription and translation use raw materials and energy generated metabolically to create the macromolecular machinery responsible for all cellular functions, including metabolism. A biochemically accurate model of molecular biology and metabolism will facilitate comprehensive and quantitative computations of an organism's molecular constitution as a function of genetic and environmental parameters. Here we formulate a model of metabolism and macromolecular expression. Prototyping it using the simple microorganism Thermotoga maritima, we show our model accurately simulates variations in cellular composition and gene expression. Moreover, through in silico comparative transcriptomics, the model allows the discovery of new regulons and improves the genome and transcription unit annotations. Our method presents a framework for investigating molecular biology and cellular physiology in silico and may allow quantitative interpretation of multi-omics data sets in the context of an integrated biochemical description of an organism.
Shen, Xia; Liu, Shengyun; Li, Ran; Ou, Yangming
2014-09-01
Water temperature not only affects the solubility of gas in water but can also be an important factor in the dissipation process of supersaturated total dissolved gas (TDG). The quantitative relationship between the dissipation process and temperature has not been previously described. This relationship affects the accurate evaluation of the dissipation process and the subsequent biological effects. This article experimentally investigates the impact of temperature on supersaturated TDG dissipation in static and turbulent conditions. The results show that the supersaturated TDG dissipation coefficient increases with the temperature and turbulence intensity. The quantitative relationship was verified by straight flume experiments. This study enhances our understanding of the dissipation of supersaturated TDG. Furthermore, it provides a scientific foundation for the accurate prediction of the dissipation process of supersaturated TDG in the downstream area and the negative impacts of high dam projects on aquatic ecosystems. Copyright © 2014. Published by Elsevier B.V.
Lunar mineral feedstocks from rocks and soils: X-ray digital imaging in resource evaluation
NASA Technical Reports Server (NTRS)
Chambers, John G.; Patchen, Allan; Taylor, Lawrence A.; Higgins, Stefan J.; Mckay, David S.
1994-01-01
The rocks and soils of the Moon provide raw materials essential to the successful establishment of a lunar base. Efficient exploitation of these resources requires accurate characterization of mineral abundances, sizes/shapes, and association of 'ore' and 'gangue' phases, as well as the technology to generate high-yield/high-grade feedstocks. Only recently have x-ray mapping and digital imaging techniques been applied to lunar resource evaluation. The topics covered include inherent differences between lunar basalts and soils and quantitative comparison of rock-derived and soil-derived ilmenite concentrates. It is concluded that x-ray digital-imaging characterization of lunar raw materials provides a quantitative comparison that is unattainable by traditional petrographic techniques. These data are necessary for accurately determining mineral distributions of soil and crushed rock material. Application of these techniques will provide an important link to choosing the best raw material for mineral beneficiation.
Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip
2010-01-01
The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251
A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades
Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd
2017-01-01
The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter. PMID:28813566
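The paper's premise (a conventional SG derivative filter smooths away the peak of a fast transient) can be illustrated with SciPy on a synthetic saccade-like trace; the sampling rate, step amplitude, and filter settings below are illustrative assumptions, and the generalized two-component filter itself is not reproduced here.

```python
# Conventional Savitzky-Golay smoothing/differentiation as the baseline:
# estimate eye velocity from a position trace and compare the peak
# against a plain finite-difference estimate.
import numpy as np
from scipy.signal import savgol_filter

fs = 1000.0                         # assumed 1 kHz sampling rate
t = np.arange(0.0, 0.2, 1.0 / fs)
# toy saccade: a smooth 10-degree sigmoidal step centered at t = 0.1 s
pos = 10.0 / (1.0 + np.exp(-(t - 0.1) * 200.0))

# velocity via the SG filter's built-in first derivative (deriv=1)
vel_sg = savgol_filter(pos, window_length=21, polyorder=3,
                       deriv=1, delta=1.0 / fs)
# reference velocity via central differences
vel_ref = np.gradient(pos, 1.0 / fs)

print(round(vel_ref.max(), 1), round(vel_sg.max(), 1))
```

With a 21-sample window the local cubic fit cannot follow the sharp velocity peak, so the SG estimate of peak velocity comes out lower than the finite-difference reference, which is the underestimation the generalized filter is designed to correct.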
Improving the geological interpretation of magnetic and gravity satellite anomalies
NASA Technical Reports Server (NTRS)
Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.
1987-01-01
Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
Karacan, C Özgen; Olea, Ricardo A
2018-03-01
Chemical properties of coal largely determine coal handling, processing, beneficiation methods, and design of coal-fired power plants. Furthermore, these properties impact coal strength, coal blending during mining, as well as coal's gas content, which is important for mining safety. In order for these processes and quantitative predictions to be successful, safer, and economically feasible, it is important to determine and map chemical properties of coals accurately in order to infer these properties prior to mining. Ultimate analysis quantifies principal chemical elements in coal. These elements are C, H, N, S, O, and, depending on the basis, ash, and/or moisture. The basis for the data is determined by the condition of the sample at the time of analysis, with an "as-received" basis being the closest to sampling conditions and thus to the in-situ conditions of the coal. The parts determined or calculated as the result of ultimate analyses are compositions, reported in weight percent, and pose the challenges of statistical analyses of compositional data. The treatment of parts using proper compositional methods may be even more important in mapping them, as most mapping methods carry uncertainty due to partial sampling as well. In this work, we map the ultimate analyses parts of the Springfield coal from an Indiana section of the Illinois basin, USA, using sequential Gaussian simulation of isometric log-ratio transformed compositions. We compare the results with those of direct simulations of compositional parts. We also compare the implications of these approaches in calculating other properties using correlations to identify the differences and consequences. Although the study here is for coal, the methods described in the paper are applicable to any situation involving compositional data and its mapping.
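The isometric log-ratio step that precedes geostatistical simulation can be sketched as follows. This is a minimal pivot-coordinate implementation on made-up C/H/N/S/O weight fractions; the particular sequential-binary-partition choice is an assumption for illustration, not necessarily the authors' basis.

```python
# Isometric log-ratio (ilr) transform: map a D-part composition from the
# simplex to D-1 unconstrained coordinates, on which Gaussian methods
# (e.g. sequential Gaussian simulation) can operate without violating
# the constant-sum constraint.
import numpy as np

def ilr(comp):
    """comp: (n_samples x D) positive parts; rows are closed to 1 first."""
    comp = np.asarray(comp, dtype=float)
    comp = comp / comp.sum(axis=1, keepdims=True)
    D = comp.shape[1]
    out = np.empty((comp.shape[0], D - 1))
    for i in range(D - 1):
        # balance of part i against the geometric mean of parts i+1..D-1
        gmean = np.exp(np.log(comp[:, i + 1:]).mean(axis=1))
        out[:, i] = np.sqrt((D - 1 - i) / (D - i)) * np.log(comp[:, i] / gmean)
    return out

# hypothetical C, H, N, S, O weight fractions for two coal samples
parts = np.array([[0.70, 0.05, 0.01, 0.02, 0.22],
                  [0.65, 0.06, 0.02, 0.03, 0.24]])
z = ilr(parts)
print(z.shape)  # (2, 4)
```

Simulated values in ilr space can be back-transformed to the simplex afterwards, which is what keeps every simulated composition non-negative and summing to 100%.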
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-26
... for public comment a second draft assessment document titled, Quantitative Health Risk Assessment for... quantitative analyses that are being conducted as part of the review of the national ambient air quality...-Focused Visibility Assessment--Second External Review Draft and Quantitative Health Risk Assessment for...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
..., Quantitative Health Risk Assessment for Particulate Matter and Particulate Matter Urban-Focused Visibility Assessment. These two documents describe the quantitative analyses that have been conducted as part of the..., Quantitative Health Risk Assessment for Particulate Matter, please contact Dr. Zachary Pekar, Office of Air...
Code of Federal Regulations, 2013 CFR
2013-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2012 CFR
2012-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research
ERIC Educational Resources Information Center
Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah
2013-01-01
Statistical analysis is one component that cannot be avoided in a quantitative research. Initial observations noted that students in higher education institution faced difficulty analysing quantitative data which were attributed to the confusions of various variable measurements. This paper aims to compare the outcomes of two approaches applied in…
Code of Federal Regulations, 2014 CFR
2014-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Lee, Ji-Won; Iimura, Tadahiro
2017-02-01
Digitalized fluorescence images contain numerical information such as color (wavelength), fluorescence intensity and spatial position. However, quantitative analyses of acquired data and their validation remain to be established. Our research group has applied quantitative fluorescence imaging to tissue sections and uncovered novel findings in skeletal biomedicine and biodentistry. This review paper includes a brief background of quantitative fluorescence imaging and discusses practical applications by introducing our previous research. Finally, the future perspectives of quantitative fluorescence imaging are discussed.
2017-05-10
...repertoire-wide properties. Finally, through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared and ... cell response to eVLP and quantitatively compare GC B-cell repertoires from immunization conditions. We partitioned the resulting clonotype... Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses. Ilja V. Khavrutskii; Sidhartha Chaudhury
NASA Astrophysics Data System (ADS)
Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin
2017-03-01
Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study has shown that total psoas area (TPA) measured on preoperative cross-section CT images might be a good image marker to predict the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of TPA from the CT image is difficult due to the fuzzy boundary or connection of TPA to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme aiming to segment TPA from abdominal CT images more accurately and to assess the feasibility of using this new quantitative image marker to predict the benefit of ovarian cancer patients receiving Bevacizumab-based chemotherapy. The ICAD scheme was applied to identify a CT image slice of interest, located at the level of the L3 vertebra. The cross-sections of the right and left TPA are segmented using a set of adaptively adjusted boundary conditions, and TPA is then quantitatively measured. In addition, recent studies have suggested that muscle radiation attenuation, which reflects fat deposition in the tissue, might be a good image feature for predicting the survival rate of cancer patients. The scheme and TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparing with manual segmentation results, we found that the ICAD scheme could yield higher accuracy and consistency for this task. The new ICAD scheme can provide clinical researchers a useful tool to more efficiently and accurately extract TPA as well as muscle radiation attenuation as new image markers, and allow them to investigate their discriminatory power to predict progression-free survival and/or overall survival of cancer patients before and after chemotherapy.
NASA Astrophysics Data System (ADS)
Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo
2018-04-01
For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of the vasculature in the iris are very important. Recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging the blood vasculature of the iris. However, advanced vascular quantification algorithms are still needed to enable accurate characterization of the underlying vasculature. We developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate the in vivo 3-D vascular structure of a rat iris with a label-free imaging method and accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
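Hessian-based vessel quantification of the kind described above can be sketched with an off-the-shelf 2-D Frangi (Hessian-eigenvalue) vesselness filter; the synthetic image, threshold, and density readout below are illustrative stand-ins, not the authors' 3-D algorithm:

```python
import numpy as np
from skimage.filters import frangi

# Synthetic 2-D slice with one bright "vessel" on a dark background,
# a stand-in for a photoacoustic maximum-amplitude projection
img = np.zeros((64, 64))
img[30:33, 5:60] = 1.0   # a 3-pixel-wide horizontal vessel

# Hessian-eigenvalue (Frangi) vesselness; bright structures, so black_ridges=False
vesselness = frangi(img, sigmas=(1, 2, 3), black_ridges=False)

# One simple quantitative readout: vascular density as the fraction of
# pixels whose vesselness exceeds a threshold
mask = vesselness > 0.5 * vesselness.max()
density = float(mask.mean())
print(f"vascular density = {density:.3f}")
```

Vessel diameter and tortuosity would additionally require skeletonizing the mask and measuring along centerlines.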
Guo, Canyong; Luo, Xuefang; Zhou, Xiaohua; Shi, Beijia; Wang, Juanjuan; Zhao, Jinqi; Zhang, Xiaoxia
2017-06-05
Vibrational spectroscopic techniques such as infrared, near-infrared, and Raman spectroscopy have become popular for detecting and quantifying polymorphism in pharmaceutics because they are fast and non-destructive. This study assessed the ability of three vibrational spectroscopic techniques combined with multivariate analysis to quantify a low-content undesired polymorph within a binary polymorphic mixture. Partial least squares (PLS) regression and support vector machine (SVM) regression were employed to build quantitative models. Fusidic acid, a steroidal antibiotic, was used as the model compound. PLS regression performed slightly better than SVM regression for all three spectroscopic techniques. Root mean square errors of prediction (RMSEP) ranged from 0.48% to 1.17% for diffuse reflectance FTIR spectroscopy, 1.60% to 1.93% for diffuse reflectance FT-NIR spectroscopy, and 1.62% to 2.31% for Raman spectroscopy. The results indicate that diffuse reflectance FTIR spectroscopy offers significant advantages in providing accurate measurement of polymorphic content in fusidic acid binary mixtures, while Raman spectroscopy is the least accurate technique for quantitative analysis of these polymorphs. Copyright © 2017 Elsevier B.V. All rights reserved.
Huan, Tao; Li, Liang
2015-07-21
Generating precise and accurate quantitative information on metabolomic changes in comparative samples is important for metabolomics research, where technical variations in the metabolomic data should be minimized in order to reveal biological changes. We report a method and software program, IsoMS-Quant, for extracting quantitative information from a metabolomic data set generated by chemical isotope labeling (CIL) liquid chromatography mass spectrometry (LC-MS). Unlike previous work, which relied on the mass spectral peak ratio of the highest-intensity peak pair to measure the relative quantity of a differentially labeled metabolite, this new program reconstructs the chromatographic peaks of the light- and heavy-labeled metabolite pair and then calculates the ratio of their peak areas to represent the relative concentration difference between two comparative samples. Using chromatographic peaks to perform relative quantification is shown to be more precise and accurate. IsoMS-Quant is integrated with IsoMS for picking peak pairs and Zero-fill for retrieving missing peak pairs in the initial peak-pair table generated by IsoMS, forming a complete tool for processing CIL LC-MS data. This program can be freely downloaded from the www.MyCompoundID.org web site for noncommercial use.
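The chromatographic peak-area ratio underlying this approach can be illustrated as follows; the Gaussian elution profiles and the helper function are hypothetical, not part of IsoMS-Quant:

```python
import numpy as np

def peak_area_ratio(rt, light, heavy):
    """Relative quantification from reconstructed chromatographic peaks:
    the ratio of light- to heavy-channel peak areas (trapezoidal rule),
    rather than a single highest-intensity spectral peak-pair ratio."""
    trapz = lambda y: float(np.sum((y[1:] + y[:-1]) * np.diff(rt)) / 2.0)
    return trapz(light) / trapz(heavy)

# Hypothetical Gaussian elution profiles for a 12C-/13C-labeled metabolite pair
rt = np.linspace(0.0, 60.0, 301)                   # retention time, s
light = 2.0 * np.exp(-((rt - 30.0) ** 2) / 18.0)   # light channel, 2x abundant
heavy = 1.0 * np.exp(-((rt - 30.0) ** 2) / 18.0)   # heavy channel
ratio = peak_area_ratio(rt, light, heavy)
print(round(ratio, 2))  # → 2.0
```

Integrating the whole elution profile averages out scan-to-scan intensity noise that a single-spectrum ratio is exposed to.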
UNiquant, a program for quantitative proteomics analysis using stable isotope labeling.
Huang, Xin; Tolmachev, Aleksey V; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A; Smith, Richard D; Chan, Wing C; Hinrichs, Steven H; Fu, Kai; Ding, Shi-Jian
2011-03-04
Stable isotope labeling (SIL) methods coupled with nanoscale liquid chromatography and high resolution tandem mass spectrometry are increasingly useful for elucidation of the proteome-wide differences between multiple biological samples. Development of more effective programs for the sensitive identification of peptide pairs and accurate measurement of the relative peptide/protein abundance are essential for quantitative proteomic analysis. We developed and evaluated the performance of a new program, termed UNiquant, for analyzing quantitative proteomics data using stable isotope labeling. UNiquant was compared with two other programs, MaxQuant and Mascot Distiller, using SILAC-labeled complex proteome mixtures having either known or unknown heavy/light ratios. For the SILAC-labeled Jeko-1 cell proteome digests with known heavy/light ratios (H/L = 1:1, 1:5, and 1:10), UNiquant quantified a similar number of peptide pairs as MaxQuant for the H/L = 1:1 and 1:5 mixtures. In addition, UNiquant quantified significantly more peptides than MaxQuant and Mascot Distiller in the H/L = 1:10 mixtures. UNiquant accurately measured relative peptide/protein abundance without the need for postmeasurement normalization of peptide ratios, which is required by the other programs.
Carroll, Dustin; Howard, Diana; Zhu, Haining; Paumi, Christian M; Vore, Mary; Bondada, Subbarao; Liang, Ying; Wang, Chi; St Clair, Daret K
2016-08-01
Cellular redox balance plays a significant role in the regulation of hematopoietic stem-progenitor cell (HSC/MPP) self-renewal and differentiation. Unregulated changes in cellular redox homeostasis are associated with the onset of most hematological disorders. However, accurate measurement of the redox state in stem cells is difficult because of the scarcity of HSC/MPPs. Glutathione (GSH) constitutes the most abundant pool of cellular antioxidants. Thus, GSH metabolism may play a critical role in hematological disease onset and progression. A major limitation to studying GSH metabolism in HSC/MPPs has been the inability to measure quantitatively GSH concentrations in small numbers of HSC/MPPs. Current methods used to measure GSH levels not only rely on large numbers of cells, but also rely on the chemical/structural modification or enzymatic recycling of GSH and therefore are likely to measure only total glutathione content accurately. Here, we describe the validation of a sensitive method used for the direct and simultaneous quantitation of both oxidized and reduced GSH via liquid chromatography followed by tandem mass spectrometry (LC-MS/MS) in HSC/MPPs isolated from bone marrow. The lower limit of quantitation (LLOQ) was determined to be 5.0ng/mL for GSH and 1.0ng/mL for GSSG with lower limits of detection at 0.5ng/mL for both glutathione species. Standard addition analysis utilizing mouse bone marrow shows that this method is both sensitive and accurate with reproducible analyte recovery. This method combines a simple extraction with a platform for the high-throughput analysis, allows for efficient determination of GSH/GSSG concentrations within the HSC/MPP populations in mouse, chemotherapeutic treatment conditions within cell culture, and human normal/leukemia patient samples. The data implicate the importance of the modulation of GSH/GSSG redox couple in stem cells related diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
Distinguishing ferritin from apoferritin using magnetic force microscopy
NASA Astrophysics Data System (ADS)
Nocera, Tanya M.; Zeng, Yuzhi; Agarwal, Gunjan
2014-11-01
Estimating the amount of iron-replete ferritin versus iron-deficient apoferritin proteins is important in biomedical and nanotechnology applications. This work introduces a simple and novel approach to quantify ferritin by using magnetic force microscopy (MFM). We demonstrate how high magnetic moment probes enhance the magnitude of MFM signal, thus enabling accurate quantitative estimation of ferritin content in ferritin/apoferritin mixtures in vitro. We envisage MFM could be adapted to accurately determine ferritin content in protein mixtures or in small aliquots of clinical samples.
Study on index system of GPS interference effect evaluation
NASA Astrophysics Data System (ADS)
Zhang, Kun; Zeng, Fangling; Zhao, Yuan; Zeng, Ruiqi
2018-05-01
Satellite navigation interference effect evaluation is a key technology for navigation countermeasure research. To accurately evaluate the degree of interference and the anti-jamming ability of GPS receivers, this paper builds on existing research in navigation interference effect evaluation to construct an index system for GPS receiver effectiveness evaluation across four levels (signal acquisition, tracking, demodulation, and positioning/timing) and establishes a model for each index. These indexes can accurately and quantitatively describe the interference effect at each level.
Genomic Quantitative Genetics to Study Evolution in the Wild.
Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin
2017-12-01
Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of the genome shared among individuals in natural populations of virtually any species, promising more accurate estimates of quantitative genetic parameters. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.
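The realized genome sharing mentioned above is commonly estimated from marker data as a genomic relationship matrix; a sketch of one common (VanRaden-style) construction on simulated genotypes, where all data and the coding convention are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated genotypes: n individuals x m SNPs, coded 0/1/2 copies of an allele
n, m = 20, 500
p = rng.uniform(0.1, 0.9, m)                       # allele frequencies
G = rng.binomial(2, p, size=(n, m)).astype(float)  # Hardy-Weinberg sampling

# VanRaden-style genomic relationship matrix:
# center each marker by 2p and scale by the sum of 2p(1-p)
W = G - 2.0 * p
A = W @ W.T / np.sum(2.0 * p * (1.0 - p))

# For unrelated individuals the diagonal averages about 1
print(A.shape, round(float(np.mean(np.diag(A))), 2))
```

This matrix replaces the pedigree-derived relationship matrix in the usual mixed-model ("animal model") machinery.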
Jackson, Kathy; Lim, Seng Gee; Sulaiman, Ali; Pakasi, Levina S; Gani, Rino A; Hasan, Irsan; Sulaiman, Andri Sanityoso; Lesmana, Laurentius A; Hammond, Rachel; Revill, Peter; Locarnini, Stephen; Bowden, Scott David
2014-01-01
Background Clinical use of hepatitis B viral (HBV) quantitative seromarkers remains questionable since it is not precisely known whether they represent intrahepatic viral replication. Covalently closed circular DNA (cccDNA), relaxed circular DNA (rcDNA), and pregenomic RNA (pgRNA) are more likely to represent active HBV replication and their measurement can be used to derive virion productivity (VP; rcDNA/cccDNA), subviral particle (SVP) productivity (quantitative HBsAg/cccDNA), and replicative activity (RA; pgRNA/cccDNA). These can be used to compare relative HBV replication between HBeAg-negative and -positive patients. Objective To study the clinical significance of intrahepatic HBV replication phenomena between HBeAg-negative and -positive patients and their correlation with quantitative HBV seromarkers. Method This was a prospective study between January 2010 and December 2011. Study subjects were naive chronic hepatitis B patients from Cipto Mangunkusumo and Medistra Hospitals. All patient samples underwent liver biochemistry and HBV seromarker testing (HBeAg, quantitative HBsAg, and HBV DNA levels), and patients underwent liver biopsy. Stored liver specimens were analysed for intrahepatic rcDNA, cccDNA, and pgRNA, with quantification performed by real-time PCR. Comparison of HBV markers between HBeAg-positive and -negative patients was carried out using the Mann–Whitney U-test. Pearson's correlation test was performed among HBV intrahepatic and seromarkers using their log-transformed values. Results A total of 104 patients were enrolled in this study; 54 (51.9%) were male. Patients' mean age was 41.9 ± 11.63 years (range 19–70 years). Sixty-one patients (58.7%) were HBeAg-negative. All HBV markers were significantly higher in HBeAg-positive than HBeAg-negative patients, except for SVP productivity and RA.
Serum HBV DNA was strongly correlated with intrahepatic total HBV DNA (r = 0.771), cccDNA (r = 0.774), and rcDNA (r = 0.780) while serum quantitative HBsAg showed only moderate correlation with intrahepatic total DNA (r = 0.671), cccDNA (r = 0.632), rcDNA (r = 0.675), and SVP productivity (r = 0.557). Conclusions Serum HBV DNA concentration and quantitative HBsAg might not accurately predict intrahepatic viral activity. Virion and SVP production do not occur in parallel with replicative activity. PMID:24918014
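The derived replication indices defined in this abstract (VP, SVP productivity, RA) are simple ratios of the measured markers; a sketch with hypothetical per-biopsy values:

```python
def hbv_replication_indices(rc_dna, ccc_dna, pg_rna, hbsag):
    """Derived intrahepatic replication markers as defined in the abstract.
    All input values below are illustrative per-cell quantities."""
    return {
        "virion_productivity": rc_dna / ccc_dna,    # VP = rcDNA/cccDNA
        "svp_productivity": hbsag / ccc_dna,        # SVP productivity = qHBsAg/cccDNA
        "replicative_activity": pg_rna / ccc_dna,   # RA = pgRNA/cccDNA
    }

# Hypothetical values for a single biopsy
print(hbv_replication_indices(rc_dna=50.0, ccc_dna=2.0, pg_rna=10.0, hbsag=400.0))
```

Normalizing each marker by cccDNA expresses production per template, which is what allows HBeAg-positive and -negative groups to be compared on replication activity rather than total viral load.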
Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.
Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei
2017-09-01
Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
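Digital quantitative PCR of this kind converts partition counts to absolute copy numbers through a Poisson correction for partitions that receive more than one template; a sketch with illustrative partition counts and volume (not Fluidigm BioMark specifics):

```python
import math

def dqpcr_copies(positive, total_partitions, partition_volume_nl=0.85):
    """Absolute quantification from digital PCR partition counts.
    Poisson correction for multiply occupied partitions:
    lambda = -ln(1 - p), total copies = lambda * n.
    (partition_volume_nl is an illustrative value, not an instrument spec.)"""
    p = positive / total_partitions
    lam = -math.log(1.0 - p)            # mean templates per partition
    copies = lam * total_partitions
    conc_per_nl = copies / (total_partitions * partition_volume_nl)
    return copies, conc_per_nl

copies, conc = dqpcr_copies(positive=300, total_partitions=765)
print(round(copies, 1))   # more than the 300 raw positives after correction
```

The correction matters most at high occupancy; at low positive fractions lambda approaches p and the corrected count approaches the raw positive count.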
GAS CHROMATOGRAPHIC TECHNIQUES FOR THE MEASUREMENT OF ISOPRENE IN AIR
The chapter discusses gas chromatographic techniques for measuring isoprene in air. Such measurement basically consists of three parts: (1) collection of sufficient sample volume for representative and accurate quantitation, (2) separation (if necessary) of isoprene from interfer...
The quantitative control and matching of an optical false color composite imaging system
NASA Astrophysics Data System (ADS)
Zhou, Chengxian; Dai, Zixin; Pan, Xizhe; Li, Yinxi
1993-10-01
Design of an imaging system for optical false color composite (OFCC) capable of high-precision density-exposure time control and color balance is presented. The system provides high-quality FCC image data that can be analyzed using a quantitative calculation method. The quality requirements for each part of the image-generation system are defined, and the distribution of satellite remote sensing image information is analyzed. The proposed technology makes it possible to present remote sensing image data more effectively and accurately.
Quantitative characterisation of sedimentary grains
NASA Astrophysics Data System (ADS)
Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.
2016-04-01
Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin-section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin-section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments in the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till in County Cork, Ireland and aeolian sediments from Rajasthan, India were also collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
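Several of the shape parameters listed above have simple closed forms; for example, circularity is conventionally defined as 4πA/P², which equals 1 for a perfect circle and decreases for irregular outlines (this sketch is illustrative, not the authors' Mathematica package):

```python
import math

def circularity(area, perimeter):
    """Circularity = 4*pi*A / P^2: 1.0 for a perfect circle,
    progressively lower for more irregular grain outlines."""
    return 4.0 * math.pi * area / perimeter ** 2

# A circle of radius r has circularity exactly 1; a square is noticeably lower
r = 5.0
print(round(circularity(math.pi * r * r, 2.0 * math.pi * r), 3))  # circle
s = 10.0
print(round(circularity(s * s, 4.0 * s), 3))                      # square
```

In practice area and perimeter come from the traced or automatically detected grain boundary, so perimeter estimates (and hence circularity) are sensitive to image resolution.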
Quantitative X-ray diffraction and fluorescence analysis of paint pigment systems : final report.
DOT National Transportation Integrated Search
1978-01-01
This study attempted to correlate measured X-ray intensities with concentrations of each member of paint pigment systems, thereby establishing calibration curves for the quantitative analyses of such systems.
USING MICROSOFT OFFICE EXCEL® 2007 TO CONDUCT GENERALIZED MATCHING ANALYSES
Reed, Derek D
2009-01-01
The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law. PMID:20514196
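The fit described in the task analysis, the log form of the generalized matching equation log(B1/B2) = a·log(R1/R2) + log b, can equally be performed outside a spreadsheet; a sketch with hypothetical session data:

```python
import numpy as np

# Hypothetical session data: responses (B1, B2) and reinforcers (R1, R2)
B1 = np.array([120.0, 80.0, 60.0, 150.0, 40.0])
B2 = np.array([40.0, 60.0, 90.0, 30.0, 100.0])
R1 = np.array([30.0, 20.0, 12.0, 45.0, 8.0])
R2 = np.array([10.0, 15.0, 20.0, 9.0, 22.0])

# Generalized matching equation (log form):
#   log(B1/B2) = a * log(R1/R2) + log b
x = np.log10(R1 / R2)
y = np.log10(B1 / B2)
a, log_b = np.polyfit(x, y, 1)   # slope a = sensitivity, 10**log_b = bias
print(f"sensitivity a = {a:.2f}, bias b = {10 ** log_b:.2f}")
```

A sensitivity near 1 and bias near 1 indicate strict matching; undermatching (a < 1) is the typical empirical result.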
Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M
2007-01-01
We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus.
Mapping of quantitative trait loci controlling adaptive traits in coastal Douglas-fir
Nicholas C. Wheeler; Kathleen D. Jermstad; Konstantin V. Krutovsky; Sally N. Aitken; Glenn T. Howe; Jodie Krakowski; David B. Neale
2005-01-01
Quantitative trait locus (QTL) analyses are used by geneticists to characterize the genetic architecture of quantitative traits, provide a foundation for marker-aided-selection (MAS), and provide a framework for positional selection of candidate genes. The most useful QTL for breeding applications are those that have been verified in time, space, and/or genetic...
Precocious quantitative cognition in monkeys.
Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F
2016-02-01
Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.
Assimilation of Stratospheric Meteorological and Constituent Observations: A Review
NASA Technical Reports Server (NTRS)
Rood, Richard B.; Pawson, Steven
2004-01-01
This talk reviews the assimilation of meteorological and constituent observations of the stratosphere. The first efforts to assimilate observations into stratospheric models were during the early 1980s, and a number of research studies followed during the next decade. Since the launch of the Upper Atmospheric Research Satellite (UARS) in 1991, model-assimilated data sets of the stratospheric meteorological state have been routinely available. These assimilated data sets were critical in bringing together observations from the different instruments on UARS as well as linking UARS observations to measurements from other platforms. Using trajectory-mapping techniques, meteorological assimilation analyses are now widely used in the analysis of constituent observations and have increased the level of quantitative study of stratospheric chemistry and transport. During the 1990s the use of winds and temperatures from assimilated data sets became standard for offline chemistry and transport modeling. Assimilated data sets provide accurate analyses of synoptic- and planetary-scale variability in middle latitudes. The transport experiments, however, reveal a set of shortcomings that become obvious as systematic errors are integrated over time. Generally, the tropics are not well represented, mixing between the tropics and middle latitudes is overestimated, and the residual circulation is not accurate. These shortcomings reveal underlying fundamental challenges related to bias and noise. Current studies using model simulation and data assimilation in controlled experimentation are highlighting the issues that must be addressed if assimilated data sets are to be convincingly used to study interannual variability and decadal change. At the same time, stratospheric assimilation is evolving to include constituent observations. The primary focus has been on stratospheric ozone, but there are efforts that investigate a suite of reactive chemical constituents. Recent progress in ozone assimilation shows the potential of assimilation to contribute to the validation of ozone observations and, ultimately, the retrieval of ozone profiles from space-based radiance measurements.
Global, long-term surface reflectance records from Landsat
USDA-ARS?s Scientific Manuscript database
Global, long-term monitoring of changes in Earth’s land surface requires quantitative comparisons of satellite images acquired under widely varying atmospheric conditions. Although physically based estimates of surface reflectance (SR) ultimately provide the most accurate representation of Earth’s s...
Advanced Technologies for Structural and Functional Optical Coherence Tomography
2015-01-07
OCT speckle noise can significantly affect polarimetry measurement and must be reduced for birefringence imaging (Figure 7). This technique enables more accurate polarimetry measurement and quantitative assessment of tissue birefringence.
A method for the measurement of physiologic evaporative water loss.
DOT National Transportation Integrated Search
1963-10-01
The precise measurement of evaporative water loss is essential to an accurate evaluation of this avenue of heat loss in acute and chronic exposures to heat. In psychological studies, the quantitative measurement of palmar sweating plays an equally im...
Revisiting soil carbon and nitrogen sampling: quantitative pits versus rotary cores
USDA-ARS?s Scientific Manuscript database
Increasing atmospheric carbon dioxide and its feedbacks with global climate have sparked renewed interest in quantifying ecosystem carbon (C) budgets, including quantifying belowground pools. Belowground nutrient budgets require accurate estimates of soil mass, coarse fragment content, and nutrient ...
Remote In-Situ Quantitative Mineralogical Analysis Using XRD/XRF
NASA Technical Reports Server (NTRS)
Blake, D. F.; Bish, D.; Vaniman, D.; Chipera, S.; Sarrazin, P.; Collins, S. A.; Elliott, S. T.
2001-01-01
X-Ray Diffraction (XRD) is the most direct and accurate method for determining mineralogy. The CHEMIN XRD/XRF instrument has shown promising results on a variety of mineral and rock samples. Additional information is contained in the original extended abstract.
Accurate phase measurements for thick spherical objects using optical quadrature microscopy
NASA Astrophysics Data System (ADS)
Warger, William C., II; DiMarzio, Charles A.
2009-02-01
In vitro fertilization (IVF) procedures have resulted in the birth of over three million babies since 1978. Yet the live birth rate in the United States was only 34% in 2005, with 32% of the successful pregnancies resulting in multiple births. These multiple pregnancies were directly attributed to the transfer of multiple embryos to increase the probability that a single, healthy embryo was included. Current viability markers used for IVF, such as the cell number, symmetry, size, and fragmentation, are analyzed qualitatively with differential interference contrast (DIC) microscopy. However, this method is not ideal for quantitative measures beyond the 8-cell stage of development because the cells overlap and obstruct the view within and below the cluster of cells. We have developed the phase-subtraction cell-counting method that uses the combination of DIC and optical quadrature microscopy (OQM) to count the number of cells accurately in live mouse embryos beyond the 8-cell stage. We have also created a preliminary analysis to measure the cell symmetry, size, and fragmentation quantitatively by analyzing the relative dry mass from the OQM image in conjunction with the phase-subtraction count. In this paper, we will discuss the characterization of OQM with respect to measuring the phase accurately for spherical samples that are much larger than the depth of field. Once fully characterized and verified with human embryos, this methodology could provide the means for a more accurate method to score embryo viability.
Computational Methods for Configurational Entropy Using Internal and Cartesian Coordinates.
Hikiri, Simon; Yoshidome, Takashi; Ikeguchi, Mitsunori
2016-12-13
The configurational entropy of solute molecules is a crucially important quantity to study various biophysical processes. Consequently, it is necessary to establish an efficient quantitative computational method to calculate configurational entropy as accurately as possible. In the present paper, we investigate the quantitative performance of the quasi-harmonic and related computational methods, including widely used methods implemented in popular molecular dynamics (MD) software packages, compared with the Clausius method, which is capable of accurately computing the change of the configurational entropy upon temperature change. Notably, we focused on the choice of the coordinate systems (i.e., internal or Cartesian coordinates). The Boltzmann-quasi-harmonic (BQH) method using internal coordinates outperformed all the six methods examined here. The introduction of improper torsions in the BQH method improves its performance, and anharmonicity of proper torsions in proteins is identified to be the origin of the superior performance of the BQH method. In contrast, widely used methods implemented in MD packages show rather poor performance. In addition, the enhanced sampling of replica-exchange MD simulations was found to be efficient for the convergent behavior of entropy calculations. Also in folding/unfolding transitions of a small protein, Chignolin, the BQH method was reasonably accurate. However, the independent term without the correlation term in the BQH method was most accurate for the folding entropy among the methods considered in this study, because the QH approximation of the correlation term in the BQH method was no longer valid for the divergent unfolded structures.
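A sketch of a quasi-harmonic entropy estimate from a mass-weighted covariance matrix (the Andricioaei and Karplus form, one of the QH variants compared in such studies); the toy 3x3 covariance values are chosen only so the effective mode frequencies are thermally relevant at 300 K:

```python
import numpy as np

KB = 1.380649e-23        # Boltzmann constant, J/K
HBAR = 1.054571817e-34   # reduced Planck constant, J*s

def qh_entropy(mass_weighted_cov, T=300.0):
    """Quasi-harmonic configurational entropy (Andricioaei-Karplus form).
    Eigenvalues lambda_i of the mass-weighted coordinate covariance matrix
    define effective mode frequencies omega_i = sqrt(kB*T / lambda_i)."""
    lam = np.linalg.eigvalsh(np.asarray(mass_weighted_cov, dtype=float))
    lam = lam[lam > 0]                        # drop non-positive modes
    x = HBAR * np.sqrt(KB * T / lam) / (KB * T)
    # S = kB * sum_i [ x_i/(e^{x_i} - 1) - ln(1 - e^{-x_i}) ]
    return KB * float(np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x))))

# Toy 3x3 diagonal mass-weighted covariance (kg*m^2)
cov = np.diag([2.7e-48, 5.0e-48, 1.0e-47])
s = qh_entropy(cov)
print(f"S = {s / KB:.2f} kB per molecule")
```

In a real calculation the covariance would be accumulated over MD snapshots after removing overall translation and rotation, and anharmonic modes (e.g. proper torsions) are where the Gaussian assumption breaks down.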
Niu, Xiaoping; Qi, Jianmin; Zhang, Gaoyang; Xu, Jiantang; Tao, Aifen; Fang, Pingping; Su, Jianguang
2015-01-01
To accurately measure gene expression using quantitative reverse transcription PCR (qRT-PCR), reliable reference gene(s) are required for data normalization. Corchorus capsularis, an annual herbaceous fiber crop notable for its biodegradability and renewability, has not been investigated for the stability of reference genes with qRT-PCR. In this study, 11 candidate reference genes were selected and their expression levels were assessed using qRT-PCR. To account for the influence of experimental approach and tissue type, 22 different jute samples were selected from abiotic and biotic stress conditions as well as three different tissue types. The stability of the candidate reference genes was evaluated using the geNorm, NormFinder, and BestKeeper programs, and comprehensive rankings of gene stability were generated by aggregate analysis. For the biotic stress and NaCl stress subsets, ACT7 and RAN were suitable as stable reference genes for gene expression normalization. For the PEG stress subset, UBC and DnaJ were sufficient for accurate normalization. For the tissue subset, the four reference genes TUBβ, UBI, EF1α, and RAN were sufficient for accurate normalization. The selected genes were further validated by comparing expression profiles of WRKY15 in various samples, and two stable reference genes were recommended for accurate normalization of qRT-PCR data. Our results provide researchers with appropriate reference genes for qRT-PCR in C. capsularis, and will facilitate gene expression studies under these conditions. PMID:26528312
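geNorm ranks candidate reference genes by a stability measure M: the average standard deviation of a gene's log expression ratio against every other candidate, taken across samples, with lower M meaning more stable. A minimal numpy sketch of that measure (illustrative, not the actual geNorm implementation; the synthetic data below are not from the study):

```python
import numpy as np

def genorm_stability(expr, gene_names):
    """geNorm-style stability measure M for each candidate reference gene:
    the mean standard deviation of its log2 expression ratio against every
    other candidate across samples. Lower M = more stable.
    expr: (n_samples, n_genes) array of relative expression values."""
    logs = np.log2(expr)
    n = logs.shape[1]
    M = []
    for j in range(n):
        sds = [np.std(logs[:, j] - logs[:, k], ddof=1)
               for k in range(n) if k != j]
        M.append(np.mean(sds))
    return dict(zip(gene_names, M))
```

Two genes whose expression co-varies across samples keep a near-constant ratio and score a low M, while a gene that fluctuates independently scores high and is rejected as a reference.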
Quantitative and Sensitive Detection of Chloramphenicol by Surface-Enhanced Raman Scattering
Ding, Yufeng; Yin, Hongjun; Meng, Qingyun; Zhao, Yongmei; Liu, Luo; Wu, Zhenglong; Xu, Haijun
2017-01-01
We used surface-enhanced Raman scattering (SERS) for the quantitative and sensitive detection of chloramphenicol (CAP). Using 30 nm colloidal Au nanoparticles (NPs), a low detection limit for CAP of 10⁻⁸ M was obtained. The characteristic Raman peak of CAP centered at 1344 cm⁻¹ was used for the rapid quantitative detection of CAP in three different types of CAP eye drops, and the accuracy of the measurement result was verified by high-performance liquid chromatography (HPLC). The experimental results reveal that the SERS technique based on colloidal Au NPs is accurate and sensitive, and can be used for the rapid detection of various antibiotics. PMID:29261161
NASA Astrophysics Data System (ADS)
Ito, Reika; Yoshidome, Takashi
2018-01-01
Markov state models (MSMs) are a powerful approach for analyzing the long-time behavior of protein motion from molecular dynamics simulation data. However, their quantitative performance with respect to physical quantities is poor. We believe that this poor performance is caused by the failure to classify protein conformations into states appropriately when constructing MSMs. Herein, we show that the quantitative performance for an order parameter is improved when a manifold-learning technique is employed for the classification in the MSM, whereas MSM construction using the K-center method, which has previously been used for classification, shows poor quantitative performance.
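Whichever classification method assigns conformations to states, the downstream MSM step is the same: count transitions between states at a fixed lag time and normalize rows to obtain a transition probability matrix. A minimal sketch of that estimation step:

```python
import numpy as np

def transition_matrix(dtraj, n_states, lag=1):
    """Row-stochastic MSM transition matrix from a discretized trajectory:
    count transitions i -> j separated by `lag` frames, then normalize rows.
    dtraj: sequence of integer state labels."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0  # leave never-visited states as zero rows
    return counts / rows
```

The paper's point is that the quality of the state labels in `dtraj` (K-center vs. manifold learning) dominates the quantitative accuracy of everything built on this matrix.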
End-to-end deep neural network for optical inversion in quantitative photoacoustic imaging.
Cai, Chuangjian; Deng, Kexin; Ma, Cheng; Luo, Jianwen
2018-06-15
An end-to-end deep neural network, ResU-net, is developed for quantitative photoacoustic imaging. A residual learning framework is used to facilitate optimization and to gain better accuracy from considerably increased network depth. The contracting and expanding paths enable ResU-net to extract comprehensive context information from multispectral initial pressure images and, subsequently, to infer a quantitative image of chromophore concentration or oxygen saturation (sO₂). According to our numerical experiments, the estimations of sO₂ and indocyanine green concentration are accurate and robust against variations in both optical property and object geometry. An extremely short reconstruction time of 22 ms is achieved.
Facile and quantitative electrochemical detection of yeast cell apoptosis
NASA Astrophysics Data System (ADS)
Yue, Qiulin; Xiong, Shiquan; Cai, Dongqing; Wu, Zhengyan; Zhang, Xin
2014-03-01
An electrochemical method based on square wave anodic stripping voltammetry (SWASV) was developed to detect the apoptosis of yeast cells conveniently and quantitatively through the high affinity between Cu2+ and phosphatidylserine (PS) translocated from the inner to the outer plasma membrane of the apoptotic cells. The combination of negatively charged PS and Cu2+ could decrease the electrochemical response of Cu2+ on the electrode. The results showed that the apoptotic rates of cells could be detected quantitatively through the variations of peak currents of Cu2+ by SWASV, and agreed well with those obtained through traditional flow cytometry detection. This work thus may provide a novel, simple, immediate and accurate detection method for cell apoptosis.
Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao
2009-01-01
A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve isolated from the electric field serves as the sample injector. An accurate amount of sample is introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities, in terms of RSD, of less than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples such as tea leaves, roasted coffee, Coca-Cola, and theophylline tablets.
Impacts of Water Stress on Forest Recovery and Its Interaction with Canopy Height.
Xu, Peipei; Zhou, Tao; Yi, Chuixiang; Luo, Hui; Zhao, Xiang; Fang, Wei; Gao, Shan; Liu, Xia
2018-06-13
Global climate change is leading to an increase in the frequency, intensity, and duration of drought events, which can affect the functioning of forest ecosystems. Because human activities such as afforestation and forest attributes such as canopy height may exhibit considerable spatial differences, such differences may alter the recovery paths of drought-impacted forests. To accurately assess how climate affects forest recovery, a quantitative evaluation on the effects of forest attributes and their possible interaction with the intensity of water stress is required. Here, forest recovery following extreme drought events was analyzed for Yunnan Province, southwest China. The variation in the recovery of forests with different water availability and canopy heights was quantitatively assessed at the regional scale by using canopy height data based on light detection and ranging (LiDAR) measurements, enhanced vegetation index data, and standardized precipitation evapotranspiration index (SPEI) data. Our results indicated that forest recovery was affected by water availability and canopy height. Based on the enhanced vegetation index measures, shorter trees were more likely to recover than taller ones after drought. Further analyses demonstrated that the effect of canopy height on recovery rates after drought also depends on water availability—the effect of canopy height on recovery diminished as water availability increased after drought. Additional analyses revealed that when the water availability exceeded a threshold (SPEI > 0.85), no significant difference in the recovery was found between short and tall trees ( p > 0.05). In the context of global climate change, future climate scenarios of RCP2.6 and RCP8.5 showed more frequent water stress in Yunnan by the end of the 21st century. 
In summary, our results indicate that canopy height exerts an important influence on forest recovery, and that tall trees are more vulnerable to drought-induced dieback and mortality. These results may have broad implications for forest management policies and practices.
Ristov, Strahil; Brajkovic, Vladimir; Cubric-Curik, Vlatka; Michieli, Ivan; Curik, Ino
2016-09-10
Identification of genes or even nucleotides that are responsible for quantitative and adaptive trait variation is a difficult task due to the complex interdependence between a large number of genetic and environmental factors. The polymorphism of the mitogenome is one of the factors that can contribute to quantitative trait variation. However, the effects of the mitogenome have not been comprehensively studied, since large numbers of mitogenome sequences and recorded phenotypes are required to reach the adequate power of analysis. Current research in our group focuses on acquiring the necessary mitochondria sequence information and analysing its influence on the phenotype of a quantitative trait. To facilitate these tasks we have produced software for processing pedigrees that is optimised for maternal lineage analysis. We present MaGelLAn 1.0 (maternal genealogy lineage analyser), a suite of four Python scripts (modules) that is designed to facilitate the analysis of the impact of mitogenome polymorphism on quantitative trait variation by combining molecular and pedigree information. MaGelLAn 1.0 is primarily used to: (1) optimise the sampling strategy for molecular analyses; (2) identify and correct pedigree inconsistencies; and (3) identify maternal lineages and assign the corresponding mitogenome sequences to all individuals in the pedigree, this information being used as input to any of the standard software for quantitative genetic (association) analysis. In addition, MaGelLAn 1.0 allows computing the mitogenome (maternal) effective population sizes and probability of mitogenome (maternal) identity that are useful for conservation management of small populations. MaGelLAn is the first tool for pedigree analysis that focuses on quantitative genetic analyses of mitogenome data. It is conceived with the purpose to significantly reduce the effort in handling and preparing large pedigrees for processing the information linked to maternal lines. 
The software source code, along with the manual and example files, can be downloaded at http://lissp.irb.hr/software/magellan-1-0/ and https://github.com/sristov/magellan.
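MaGelLAn's central pedigree operation, assigning every individual to the founder of its maternal line so that one sequenced mitogenome per lineage can be propagated to the whole group, can be sketched in a few lines. The function names and the dam-map pedigree encoding below are illustrative, not MaGelLAn's actual API:

```python
def maternal_lineage(pedigree, individual):
    """Trace an individual's maternal line back to its founder dam.
    pedigree maps each individual to its dam (None for founders)."""
    line = [individual]
    while pedigree.get(individual) is not None:
        individual = pedigree[individual]
        line.append(individual)
    return line  # [individual, dam, granddam, ..., founder]

def maternal_lineages(pedigree):
    """Group all individuals by the founder of their maternal line, so a
    mitogenome sequenced for any lineage member can be assigned to all."""
    groups = {}
    for ind in pedigree:
        founder = maternal_lineage(pedigree, ind)[-1]
        groups.setdefault(founder, []).append(ind)
    return groups
```

This grouping also underlies the sampling-strategy optimisation the abstract mentions: one molecular sample per founder group suffices to cover the pedigree.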
Conceptual development and retention within the learning cycle
NASA Astrophysics Data System (ADS)
McWhirter, Lisa Jo
1998-12-01
This research was designed to achieve two goals: (1) examine concept development and retention within the learning cycle and (2) examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning groups. Forty-eight sixth-grade students and one teacher at an urban middle school participated in the study. The research utilized both quantitative and qualitative analyses. Quantitative assessments included a concept-mapping technique as well as teacher-generated multiple-choice tests. Preliminary quantitative analysis found that students' reading levels had an effect on their pretest scores in both the concept-mapping and the multiple-choice assessments; therefore, a covariate design was implemented for the quantitative analyses. When quantitative analysis techniques were used to examine concept development and retention, it was discovered that the students' concept knowledge increased significantly from the conclusion of the term-introduction phase to the conclusion of the expansion phase. These findings would indicate that all three phases of the learning cycle are necessary for conceptual development. However, quantitative analyses of concept maps indicated that this is not true for all students: individual students showed evidence of concept development and integration at each phase. Therefore, concept development is individualized, and all phases of the learning cycle are not necessary for all students. As a result, an individual's assimilation, disequilibration, accommodation, and organization may not correlate with the phases of the learning cycle. Quantitative analysis also indicated a significant decrease in the retention of concepts over time. Qualitative analyses were used to examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning groups.
It was discovered that there was a correlation between teacher-student interaction, small-group interaction, and concept mediation. Students who engaged in a high level of teacher-student dialogue, in teacher-led discussions that integrated scaffolding techniques, were the same students who mediated ideas within the small-group discussions. Those students whose teacher-student interactions consisted of dialogue with little positive teacher feedback made no contributions within the small group, regardless of their level of concept development.
Lee, Der-Yen; Huang, Wei-Chieh; Gu, Ting-Jia; Chang, Geen-Dong
2018-06-01
Hydrogen sulfide (H₂S), previously known as a toxic gas, is now recognized as a gasotransmitter along with nitric oxide and carbon monoxide. However, only a few methods are available for the quantitative determination of H₂S in biological samples. 2-Iodoacetanilide (2-IAN), a thiol-reacting agent, has been used to tag the reduced cysteine residues of proteins for quantitative proteomics and for the detection of cysteine oxidation modifications. In this article, we propose a new method for quantitative analyses of H₂S and thiol metabolites using pre-column 2-IAN derivatization coupled with liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS). ¹³C₆-labeled and label-free 2-IAN react efficiently with H₂S and thiol compounds at pH 9.5 and 65 °C. The derivatives exhibit excellent stability under alkaline conditions, high resolution on reversed-phase liquid chromatography, and great sensitivity for ESI-MS detection. The measurement of H₂S, L-cysteine, glutathione, and DL-homocysteine derivatives was validated using ¹³C₆-labeled standards in LC-ESI-MS analyses, with linear ranges of 10 nM–1 μM for DL-homocysteine and glutathione and 1 nM–1 μM for L-cysteine and H₂S. In addition, the order of derivatization and extraction of metabolites is important in the quantification of thiol metabolites, suggesting the presence of matrix effects. Most importantly, labeling with 2-IAN and ¹³C₆-2-IAN isotopologues could achieve quantitative, matched-sample comparative analyses with minimal bias using our extraction and labeling procedures before LC-MS analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
Prostate cancer-associated gene expression alterations determined from needle biopsies.
Qian, David Z; Huang, Chung-Ying; O'Brien, Catherine A; Coleman, Ilsa M; Garzotto, Mark; True, Lawrence D; Higano, Celestia S; Vessella, Robert; Lange, Paul H; Nelson, Peter S; Beer, Tomasz M
2009-05-01
To accurately identify gene expression alterations that differentiate neoplastic from normal prostate epithelium using an approach that avoids contamination by unwanted cellular components and is not compromised by acute gene expression changes associated with tumor devascularization and resulting ischemia. Approximately 3,000 neoplastic and benign prostate epithelial cells were isolated using laser capture microdissection from snap-frozen prostate biopsy specimens provided by 31 patients who subsequently participated in a clinical trial of preoperative chemotherapy. cDNA synthesized from amplified total RNA was hybridized to custom-made microarrays composed of 6,200 clones derived from the Prostate Expression Database. Expression differences for selected genes were verified using quantitative reverse transcription-PCR. Comparative analyses identified 954 transcript alterations associated with cancer (q < 0.01%), including 149 differentially expressed genes with no known functional roles. Gene expression changes associated with ischemia and surgical removal of the prostate gland were absent. Genes up-regulated in prostate cancer were statistically enriched in categories related to cellular metabolism, energy use, signal transduction, and molecular transport. Genes down-regulated in prostate cancers were enriched in categories related to immune response, cellular responses to pathogens, and apoptosis. A heterogeneous pattern of androgen receptor expression changes was noted. In exploratory analyses, androgen receptor down-regulation was associated with a lower probability of cancer relapse after neoadjuvant chemotherapy followed by radical prostatectomy. Assessments of tumor phenotypes based on gene expression for treatment stratification and drug targeting of oncogenic alterations may best be ascertained using biopsy-based analyses where the effects of ischemia do not complicate interpretation.
Prostate Cancer-Associated Gene Expression Alterations Determined from Needle Biopsies
Qian, David Z.; Huang, Chung-Ying; O'Brien, Catherine A.; Coleman, Ilsa M.; Garzotto, Mark; True, Lawrence D.; Higano, Celestia S.; Vessella, Robert; Lange, Paul H.; Nelson, Peter S.; Beer, Tomasz M.
2010-01-01
Purpose To accurately identify gene expression alterations that differentiate neoplastic from normal prostate epithelium using an approach that avoids contamination by unwanted cellular components and is not compromised by acute gene expression changes associated with tumor devascularization and resulting ischemia. Experimental Design Approximately 3,000 neoplastic and benign prostate epithelial cells were isolated using laser capture microdissection from snap-frozen prostate biopsy specimens provided by 31 patients who subsequently participated in a clinical trial of preoperative chemotherapy. cDNA synthesized from amplified total RNA was hybridized to custom-made microarrays comprised of 6200 clones derived from the Prostate Expression Database. Expression differences for selected genes were verified using quantitative RT-PCR. Results Comparative analyses identified 954 transcript alterations associated with cancer (q value <0.01%) including 149 differentially expressed genes with no known functional roles. Gene expression changes associated with ischemia and surgical removal of the prostate gland were absent. Genes up-regulated in prostate cancer were statistically enriched in categories related to cellular metabolism, energy utilization, signal transduction, and molecular transport. Genes down-regulated in prostate cancers were enriched in categories related to immune response, cellular responses to pathogens, and apoptosis. A heterogeneous pattern of AR expression changes was noted. In exploratory analyses, AR down regulation was associated with a lower probability of cancer relapse after neoadjuvant chemotherapy followed by radical prostatectomy. Conclusions Assessments of tumor phenotypes based on gene expression for treatment stratification and drug targeting of oncogenic alterations may best be ascertained using biopsy-based analyses where the effects of ischemia do not complicate interpretation. PMID:19366833
Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)
NASA Astrophysics Data System (ADS)
Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.
2017-02-01
In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens
We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these sub-models into a single final result. Tests of the sub-model method show improvement in test-set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration; the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
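The blending step can be pictured as follows: a full-range model picks a provisional composition, which selects the restricted-range sub-model (or, near a range boundary, a linear mix of the two). This is a minimal sketch of the idea; the team's actual blending scheme may differ in its details:

```python
def blend_submodels(y_full, y_low, y_high, low_top, high_bottom):
    """Blend restricted-range sub-model predictions using the full-range
    model's estimate y_full. The low sub-model covers compositions up to
    low_top, the high sub-model down to high_bottom, and the two training
    ranges overlap (high_bottom < low_top)."""
    if y_full <= high_bottom:
        return y_low               # clearly in the low sub-model's range
    if y_full >= low_top:
        return y_high              # clearly in the high sub-model's range
    # inside the overlap: interpolate linearly between the two predictions
    w = (y_full - high_bottom) / (low_top - high_bottom)
    return (1.0 - w) * y_low + w * y_high
```

Because each sub-model is trained on a narrow composition range, inter-element matrix effects within that range are smaller, which is the source of the RMSEP improvement the abstract reports.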
Comparison of salivary collection and processing methods for quantitative HHV-8 detection.
Speicher, D J; Johnson, N W
2014-10-01
Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation compared with collection onto ice followed by freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole-mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with wide dynamic range and excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μL⁻¹ of DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
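Viral load quantitation of this kind rests on a dilution-series standard curve: Ct is regressed on log10 copy number, the slope gives the amplification efficiency, and sample Ct values are inverted through the fit. A minimal sketch with illustrative numbers (not the paper's data):

```python
import numpy as np

# Hypothetical standard curve from a 10-fold dilution series of an
# HHV-8 cloned construct (values illustrative, not from the study).
log_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])   # log10 input copies
ct = np.array([16.1, 19.4, 22.8, 26.2, 29.5])      # observed Ct values

slope, intercept = np.polyfit(log_copies, ct, 1)   # Ct = slope*log10(N) + b
efficiency = 10.0 ** (-1.0 / slope) - 1.0          # ~1.0 means 100% efficient

def copies_from_ct(sample_ct):
    """Invert the standard curve to estimate input copy number."""
    return 10.0 ** ((sample_ct - intercept) / slope)
```

A slope near -3.32 corresponds to perfect doubling per cycle; curved (non-linear) standard curves, as seen with the ethanol-precipitated samples, make this inversion invalid.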
Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography
Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila
2016-01-01
Objective: Benzodiazepines are among the most frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can be biased by bleeding, postmortem changes, and redistribution, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251
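The validation figures above (linearity, LOD, LOQ) come from a calibration fit. A sketch of one common (ICH-style) way to derive detection and quantitation limits from the residual scatter of the fit; the peak areas below are illustrative, and the paper may have computed its limits differently:

```python
import numpy as np

# Hypothetical HPLC-DAD calibration for one benzodiazepine over the
# 30-3000 ng/mL range reported in the abstract (areas illustrative).
conc = np.array([30.0, 100.0, 300.0, 1000.0, 3000.0])  # ng/mL
area = np.array([1.6, 5.1, 15.3, 50.8, 152.4])         # peak area, a.u.

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]                      # linearity check

# ICH-style limits from the residual standard deviation of the fit:
resid = area - (slope * conc + intercept)
sigma = np.std(resid, ddof=2)      # ddof=2: two fitted parameters
lod = 3.3 * sigma / slope          # limit of detection, ng/mL
loq = 10.0 * sigma / slope         # limit of quantitation, ng/mL
```

Unknown concentrations are then read back through the same line, `(area - intercept) / slope`, provided they fall within the validated range.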
Technical and financial evaluation of assays for progesterone in canine practice in the UK.
Moxon, R; Copley, D; England, G C W
2010-10-02
The concentration of progesterone was measured in 60 plasma samples from bitches at various stages of the oestrous cycle, using commercially available quantitative and semi-quantitative ELISA test kits, as well as by two commercial laboratories undertaking radioimmunoassay (RIA). The RIA, which was assumed to be the 'gold standard' in terms of reliability and accuracy, was the most expensive method when analysing more than one sample per week, and had the longest delay in obtaining results, but had minimal requirements for practice staff time. When compared with the RIA, the quantitative ELISA had a strong positive correlation (r=0.97, P<0.05) and a sensitivity and specificity of 70.6 per cent and 100.0 per cent, respectively, and positive and negative predictive values of 100.0 per cent and 71.0 per cent, respectively, with an overall accuracy of 90.0 per cent. This method was the least expensive when analysing five or more samples per week, but had longer turnaround times than that of the semi-quantitative ELISA and required more staff time. When compared with the RIA, the semi-quantitative ELISA had a sensitivity and specificity of 100.0 per cent and 95.5 per cent, respectively, and positive and negative predictive values of 73.9 per cent and 77.8 per cent, respectively, with an overall accuracy of 89.2 per cent. This method was more expensive than the quantitative ELISA when analysing five or more samples per week, but had the shortest turnaround time and low requirements in terms of staff time.
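The sensitivity, specificity, predictive values, and overall accuracy quoted above all follow from the 2 × 2 confusion table against the RIA gold standard. A minimal sketch (the counts in the usage example are hypothetical, chosen only to sum to the study's 60 samples, not reconstructed from the paper):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy metrics of a test against a gold
    standard, from the 2x2 confusion counts (true/false pos/neg)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

Note that PPV and NPV depend on the prevalence of positives in the sample set, which is why a kit can combine 100 per cent specificity with a modest NPV.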
NASA Astrophysics Data System (ADS)
Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia
2017-04-01
Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to underlie their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high-resolution accurate-mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy involves the diagnostic neutral loss of a hydroxyl radical triggering MS³ fragmentation, which is observed only in the positive ionization mode of DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy using deuterated d₃-DNPH was implemented to assess variations in the levels of carbonyl compounds before and after exposure. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits at the high-attomole level. To the best of our knowledge, this is the first report of a method using HRAM neutral-loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.
Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M
2010-11-01
Quantitative ECG Analysis. Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found that 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity; these were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but were correctly diagnosed by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost-savings of $1,303 and a gain of 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy, lowers healthcare costs, and improves patient outcomes. © 2010 Wiley Periodicals, Inc.
Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B
2010-02-01
Quantitative microscopy and digital image analysis are underutilized in microbial ecology, largely because of the laborious task of segmenting foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm, evaluated on 26 complex micrographs at single-pixel resolution, showed an overall pixel classification accuracy of over 99%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation and produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/ . This improved computing technology opens new opportunities for imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.
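A minimal sketch of the general idea, not the CMEIAS implementation itself: classify each pixel by its distance to user-selected foreground sample colors, then apply a simple 3x3 spatial vote to resolve pixels whose color range overlaps the background. The threshold, vote rule, and toy image are all illustrative assumptions.

```python
# Minimal sketch: color-plus-spatial foreground segmentation.
# A pixel is a foreground candidate if its RGB distance to any user-selected
# sample color is below a tolerance; a 3x3 neighborhood vote then suppresses
# isolated misclassified pixels.
import numpy as np

def segment(image, fg_samples, color_tol=30.0):
    """image: (H, W, 3) array; fg_samples: (N, 3) user-picked colors."""
    px = image.reshape(-1, 1, 3).astype(float)
    ref = np.asarray(fg_samples, float).reshape(1, -1, 3)
    dist = np.sqrt(((px - ref) ** 2).sum(axis=2)).min(axis=1)
    mask = (dist < color_tol).reshape(image.shape[:2])
    # spatial pass: keep a pixel only if most of its 3x3 neighborhood agrees
    padded = np.pad(mask.astype(int), 1)
    votes = sum(padded[i:i + mask.shape[0], j:j + mask.shape[1]]
                for i in range(3) for j in range(3))
    return votes >= 5

img = np.zeros((8, 8, 3))
img[2:6, 2:6] = [200, 40, 40]              # red "cells" on black background
out = segment(img, fg_samples=[[205, 35, 45]])
print(out.sum())
```

Object counts and areas for abundance analysis can then be read off the resulting binary mask.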
Bruzzone, Bianca; Bisio, Francesca; Caligiuri, Patrizia; Mboungou, Franc A Mayinda; Nigro, Nicola; Sticchi, Laura; Ventura, Agostina; Saladini, Francesco; Zazzi, Maurizio; Icardi, Giancarlo; Viscoli, Claudio
2014-07-01
Accurate HIV-1 RNA quantitation is required to support the scale-up of antiretroviral therapy in African countries. Extreme HIV-1 genetic variability in Africa may affect the ability of commercially available assays to detect and quantify HIV-1 RNA accurately. The aim of this study was to compare three real-time PCR assays for quantitation of plasma HIV-1 RNA levels in patients from the Republic of Congo, an area with highly diversified HIV-1 subtypes and recombinants. The Abbott RealTime HIV-1, BioMérieux HIV-1 EasyQ test 1.2 and Cobas AmpliPrep/Cobas TaqMan HIV-1 1.0 were compared for quantitation of HIV-1 RNA in 37 HIV-1 seropositive pregnant women enrolled in the Kento-Mwana project for prevention of mother-to-child transmission in Pointe-Noire, Republic of Congo. The sample panel included a variety of HIV-1 subtypes, with as many as 21 (56.8%) putative unique recombinant forms. Qualitative detection of HIV-1 RNA was concordant by all three assays in 33/37 (89.2%) samples. Of the remaining 4 (10.8%) samples, all were positive by Roche, three by Abbott and none by BioMérieux. Differences exceeding 1 log10 in positive samples were found in 4/31 (12.9%), 10/31 (32.3%) and 5/31 (16.1%) cases between Abbott and BioMérieux, Roche and BioMérieux, and Abbott and Roche, respectively. In this sample panel, representative of highly polymorphic HIV-1 in Congo, the agreement among the three assays was moderate in terms of HIV-1 RNA detectability and rather inconsistent in terms of quantitation. Copyright © 2014. Published by Elsevier B.V.
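The greater-than-1-log10 discordance criterion used above can be expressed directly; the viral-load values below are illustrative, not the study's data.

```python
# Sketch: flag assay pairs whose viral-load estimates differ by more than
# 1 log10 (values in copies/mL; illustrative numbers only).
import math

def log_diff(vl_a, vl_b):
    return abs(math.log10(vl_a) - math.log10(vl_b))

def discordant(vl_a, vl_b, threshold=1.0):
    return log_diff(vl_a, vl_b) > threshold

print(discordant(50_000, 3_000))   # about 1.22 log10 apart
print(discordant(50_000, 20_000))  # about 0.40 log10 apart
```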
NASA Astrophysics Data System (ADS)
Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.
2016-03-01
Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, which is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of disease and to explore the underlying pathophysiological processes.
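The dependence of the quantitative phase signal on thickness and refractive index mentioned above is commonly written as (a standard relation; the symbols here are assumptions, not the authors' notation):

```latex
\varphi = \frac{2\pi}{\lambda}\,\bigl(\bar{n}_c - n_m\bigr)\,h
```

where φ is the measured phase retardation, λ the illumination wavelength, n̄_c the mean intracellular refractive index, n_m the refractive index of the surrounding medium, and h the cell thickness. The coupling of n̄_c and h in a single measurement is why decoupling procedures are needed to extract either parameter separately.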
An autoanalyzer test for the quantitation of platelet-associated IgG
NASA Technical Reports Server (NTRS)
Levitan, Nathan; Teno, Richard A.; Szymanski, Irma O.
1986-01-01
A new quantitative antiglobulin consumption (QAC) test for the measurement of platelet-associated IgG is described. In this test, washed platelets are incubated with anti-IgG at a final dilution of 1:2 million. The unneutralized fraction of anti-IgG remaining in solution is then measured with an Autoanalyzer, with soluble IgG used for calibration. The dose-response curves depicting the percent neutralization of anti-IgG by platelets and by soluble IgG were compared in detail and found to be nearly identical, indicating that platelet-associated IgG can be accurately quantitated by this method. The mean IgG values were 2,287 molecules/platelet for normal adults and 38,112 molecules/platelet for ITP patients. The Autoanalyzer QAC test is a sensitive and reproducible assay for the quantitation of platelet-associated IgG.
75 FR 79370 - Official Release of the MOVES2010a and EMFAC2007 Motor Vehicle Emissions Models for...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-20
...: This Notice announces the availability of two new EPA guidance documents for: completing quantitative... of the MOVES model (MOVES2010a) for official use for quantitative CO, PM 2.5, and PM 10 hot-spot... emissions model is required to be used in quantitative CO and PM hot-spot analyses for project-level...
Gould, Francois D. H.
2014-01-01
Improvements in three-dimensional imaging technologies have renewed interest in the study of functional and ecological morphology. Quantitative approaches to shape analysis are used increasingly to study form-function relationships. These methods are computationally intensive, technically demanding, and time-consuming, which may limit sampling potential. There have been few side-by-side comparisons of the effectiveness of such approaches relative to more traditional analyses using linear measurements and ratios. Morphological variation in the distal femur of mammals has been shown to reflect differences in locomotor modes across clades. Thus, I tested whether a geometric morphometric analysis of surface shape was superior to a multivariate analysis of ratios for describing ecomorphological patterns in distal femoral variation. A sample of 164 mammalian specimens from 44 genera was assembled. Each genus was assigned to one of six locomotor categories. The same hypotheses were tested using two methods. Six linear measurements of the distal femur were taken with calipers, from which four ratios were calculated. A 3D model was generated with a laser scanner and analyzed using three-dimensional geometric morphometrics. Locomotor category significantly predicted variation in distal femoral morphology in both analyses. Effect size was larger in the geometric morphometric analysis than in the analysis of ratios. Ordination reveals a similar pattern, with arboreal and cursorial taxa as extremes on a continuum of morphologies in both analyses. Discriminant functions calculated from the geometric morphometric analysis were more accurate than those calculated from ratios. Both the analysis of ratios and geometric morphometric surface analysis reveal similar, biologically meaningful relationships between distal femoral shape and locomotor mode. The functional signal from the morphology is slightly higher in the geometric morphometric analysis. The practical costs of conducting these sorts of analyses should be weighed against potentially slight increases in power when designing protocols for ecomorphological studies. PMID:24633081
Gu, Y R; Li, M Z; Zhang, K; Chen, L; Jiang, A A; Wang, J Y; Li, X W
2011-08-01
To normalize a set of quantitative real-time PCR (q-PCR) data, it is essential to determine an optimal number/set of housekeeping genes, as the abundance of housekeeping genes can vary across tissues or cells during different developmental stages, or even under certain environmental conditions. In this study, of the 20 commonly used endogenous control genes, 13, 18 and 17 genes exhibited credible stability in 56 different tissues, 10 types of adipose tissue and five types of muscle tissue, respectively. Our analysis clearly showed that three optimal housekeeping genes are adequate for an accurate normalization, which correlated well with the theoretical optimal number (r ≥ 0.94). In terms of economic and experimental feasibility, we recommend the use of the three most stable housekeeping genes for calculating the normalization factor. Based on our results, the three most stable housekeeping genes in all analysed samples (TOP2B, HSPCB and YWHAZ) are recommended for accurate normalization of q-PCR data. We also suggest that two different sets of housekeeping genes are appropriate for the 10 types of adipose tissue (the HSPCB, ALDOA and GAPDH genes) and the five types of muscle tissue (the TOP2B, HSPCB and YWHAZ genes), respectively. Our report will serve as a valuable reference for other studies aimed at measuring tissue-specific mRNA abundance in porcine samples. © 2011 Blackwell Verlag GmbH.
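The normalization-factor step can be sketched as the geometric mean of the relative quantities of the recommended housekeeping genes. The Cq values below are hypothetical, and 100% amplification efficiency is assumed for simplicity (so relative quantity = 2^-ΔCq); this is a generic illustration of the calculation, not the authors' software.

```python
# Sketch: normalization factor as the geometric mean of housekeeping-gene
# relative quantities (efficiencies assumed 100%; Cq values are hypothetical).
from statistics import geometric_mean

def relative_quantity(cq, cq_calibrator):
    return 2.0 ** (cq_calibrator - cq)

def normalization_factor(cq_by_gene, calibrator_cq_by_gene):
    rqs = [relative_quantity(cq_by_gene[g], calibrator_cq_by_gene[g])
           for g in cq_by_gene]
    return geometric_mean(rqs)

sample = {"TOP2B": 24.1, "HSPCB": 22.8, "YWHAZ": 23.5}
calibrator = {"TOP2B": 24.0, "HSPCB": 23.0, "YWHAZ": 23.0}
nf = normalization_factor(sample, calibrator)
target_rq = relative_quantity(26.0, 26.5) / nf   # normalized target expression
print(round(nf, 3), round(target_rq, 3))
```

Dividing each target gene's relative quantity by this factor removes sample-to-sample variation in input and reverse-transcription efficiency.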
NASA Astrophysics Data System (ADS)
Liu, Q.; Jing, L.; Li, Y.; Tang, Y.; Li, H.; Lin, Q.
2016-04-01
For the purpose of forest management, high-resolution LiDAR and optical remote sensing imagery are used for treetop detection, tree crown delineation, and classification. The purpose of this study is to develop a self-adjusting method for calculating dominant scales and a new crown horizontal cutting method for the tree canopy height model (CHM), to detect and delineate tree crowns from LiDAR, under the hypothesis that a treetop is a radiometric or altitudinal maximum and that tree crowns consist of multi-scale branches. The core of the method is an automatic strategy for selecting feature scales on the CHM, combined with a multi-scale morphological reconstruction-open crown decomposition (MRCD) that derives morphological multi-scale features of the CHM by: cutting the CHM from treetop to ground; analysing and refining the dominant multiple scales with differential horizontal profiles to obtain treetops; and segmenting the LiDAR CHM using a watershed segmentation approach marked with the MRCD treetops. This method solves the problem of false detections on CHM side surfaces produced by the traditional morphological opening canopy segment (MOCS) method. The novel MRCD delineates more accurate and quantitative multi-scale features of the CHM, and enables more accurate detection and segmentation of treetops and crowns. Moreover, the MRCD method can also be extended to tree crown extraction from high-resolution optical remote sensing imagery. In an experiment on an aerial LiDAR CHM of a forest with multi-scale tree crowns, the proposed method yielded high-quality tree crown maps.
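Only the treetop-detection step is sketched here, under the hypothesis stated above that a treetop is an altitudinal maximum: a generic local-maxima detector on a smoothed, synthetic CHM, not the MRCD pipeline itself. The window size, smoothing width, and minimum height are illustrative assumptions.

```python
# Sketch: treetops as local maxima of a smoothed canopy height model (CHM),
# subject to a minimum height. Synthetic two-crown CHM for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_treetops(chm, window=5, min_height=2.0, sigma=1.0):
    smooth = gaussian_filter(chm, sigma)                 # suppress branch noise
    local_max = maximum_filter(smooth, size=window) == smooth
    return np.argwhere(local_max & (smooth > min_height))

# two synthetic Gaussian crowns on flat ground
y, x = np.mgrid[0:40, 0:40]
chm = (10 * np.exp(-((y - 10) ** 2 + (x - 10) ** 2) / 20.0)
       + 14 * np.exp(-((y - 28) ** 2 + (x - 30) ** 2) / 30.0))
tops = detect_treetops(chm)
print(len(tops), tops.tolist())
```

In a marker-based watershed scheme, these detected maxima would then seed the crown segmentation.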
Ferrario, J; Byrne, C; Dupuy, A E
1997-06-01
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.
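The background-correction logic described above can be sketched as follows. The mean-plus-3-SD detection-limit rule and the blank values are illustrative assumptions, not the study's exact procedure.

```python
# Sketch: estimate the blank background as mean + 3*SD of method blanks,
# subtract the mean blank from sample results, and report values below the
# detection limit as not detected (None). Blank levels are illustrative, ppt.
from statistics import mean, stdev

def background_stats(blanks):
    m, s = mean(blanks), stdev(blanks)
    return m, m + 3 * s          # mean background, detection limit

def correct(sample_ppt, blanks):
    bg, dl = background_stats(blanks)
    return sample_ppt - bg if sample_ppt > dl else None   # None -> "ND"

blanks = [0.8, 1.1, 0.9, 1.2, 1.0]   # hypothetical coplanar-PCB blank levels
print(correct(5.0, blanks))          # well above background
print(correct(1.3, blanks))          # within blank variability
```

Tracking the blanks over the whole study, as the authors did, is what lets the variability term be estimated credibly.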
Study on color difference estimation method of medicine biochemical analysis
NASA Astrophysics Data System (ADS)
Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Sun, Jiashi; Zhou, Fengkun
2006-01-01
Biochemical analysis in medicine is an important inspection and diagnosis method in hospital clinics, and the biochemical analysis of urine is one important item. Urine test paper shows a corresponding color for each detection item or degree of illness. The color difference between the standard threshold and the color of the urine test paper can be used to judge the degree of illness, enabling further analysis and diagnosis of the urine. Color is a three-dimensional psychophysical variable, while reflectance is one-dimensional; therefore, a color-difference estimation method for urine testing can achieve better precision and convenience than the conventional one-dimensional reflectance method, and can support an accurate diagnosis. A digital camera can easily capture an image of the urine test paper and thus carry out the urine biochemical analysis conveniently. In the experiment, the color image of the urine test paper was taken with a popular color digital camera and saved to a computer running simple color space conversion (RGB -> XYZ -> L*a*b*) and calculation software. Test samples are graded according to intelligent detection of quantitative color. The images from each test are saved in the computer, so the whole course of the illness can be monitored. This method can also be used in other medical biochemical analyses related to color. Experimental results show that this test method is quick and accurate; it can be used in hospitals, calibration organizations, and homes, so its application prospects are extensive.
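The color pipeline described above (RGB -> XYZ -> L*a*b*, followed by a color difference) can be sketched with the standard sRGB/CIELAB formulas under a D65 white point; the strip and threshold colors below are hypothetical.

```python
# Sketch: sRGB -> XYZ -> L*a*b* conversion and a CIE76 color difference
# (Delta E) between a test-strip color and a standard threshold color.
# Constants are the usual sRGB matrix and D65 reference white.
import math

def srgb_to_lab(r, g, b):
    def lin(u):                      # undo sRGB gamma
        u /= 255.0
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4
    R, G, B = lin(r), lin(g), lin(b)
    X = 0.4124 * R + 0.3576 * G + 0.1805 * B
    Y = 0.2126 * R + 0.7152 * G + 0.0722 * B
    Z = 0.0193 * R + 0.1192 * G + 0.9505 * B
    def f(t):                        # CIELAB companding
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    xn, yn, zn = 0.95047, 1.0, 1.08883   # D65 reference white
    fx, fy, fz = f(X / xn), f(Y / yn), f(Z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e(lab1, lab2):             # CIE76 color difference
    return math.dist(lab1, lab2)

standard = srgb_to_lab(220, 200, 90)     # hypothetical threshold color
strip = srgb_to_lab(200, 170, 60)        # hypothetical strip color
print(round(delta_e(standard, strip), 1))
```

Grading a sample then reduces to finding the threshold color with the smallest Delta E to the photographed strip.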
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christien, F., E-mail: frederic.christien@univ-nantes.fr; Telling, M.T.F.; Department of Materials, University of Oxford, Parks Road, Oxford
2013-08-15
Phase transformations in the 17-4PH martensitic stainless steel have been studied using different in-situ techniques, including dilatometry and high resolution neutron diffraction. Neutron diffraction patterns were quantitatively processed using the Rietveld refinement method, allowing the determination of the temperature dependence of the martensite (α′, bcc) and austenite (γ, fcc) phase fractions and lattice parameters on heating to 1000 °C and then cooling to room temperature. It is demonstrated in this work that dilatometry does not permit an accurate determination of the end temperature (Ac3) of the α′ → γ transformation which occurs upon heating to high temperature. The analysis of neutron diffraction data has shown that the respective volumes of the two phases become very close to each other at high temperature, thus making the dilatometric technique almost insensitive in that temperature range. However, there is very good agreement between neutron diffraction and dilatometry at lower temperature. The martensitic transformation occurring upon cooling has been analysed using the Koistinen–Marburger equation. In addition, the thermal expansion coefficients of the two phases have been determined. A comparison of the results obtained in this work with data from the literature is presented. - Highlights: • Martensite is still present at very high temperature (> 930 °C) upon heating. • The end of austenitisation cannot be accurately monitored by dilatometry. • The martensite and austenite volumes become similar at high temperature (> ∼ 850 °C).
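The Koistinen–Marburger equation referred to above is conventionally written as:

```latex
f_{\alpha'}(T) = 1 - \exp\!\bigl[-b\,(M_s - T)\bigr], \qquad T \le M_s
```

where f_α′ is the volume fraction of martensite formed on cooling to temperature T, M_s is the martensite start temperature, and b is a rate constant (a value of about 0.011 K⁻¹ is often quoted for carbon steels; the fitted value for this alloy is given in the paper itself).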
Accurate and fast multiple-testing correction in eQTL studies.
Sul, Jae Hoon; Raj, Towfique; de Jong, Simone; de Bakker, Paul I W; Raychaudhuri, Soumya; Ophoff, Roel A; Stranger, Barbara E; Eskin, Eleazar; Han, Buhm
2015-06-04
In studies of expression quantitative trait loci (eQTLs), it is of increasing interest to identify eGenes, the genes whose expression levels are associated with variation at a particular genetic variant. Detecting eGenes is important for follow-up analyses and prioritization because genes are the main entities in biological processes. To detect eGenes, one typically focuses on the genetic variant with the minimum p value among all variants in cis with a gene and corrects for multiple testing to obtain a gene-level p value. For performing multiple-testing correction, a permutation test is widely used. Because of the growing sample sizes of eQTL studies, however, the permutation test has become a computational bottleneck. In this paper, we propose an efficient approach for correcting for multiple testing and assessing eGene p values by utilizing a multivariate normal distribution. Our approach properly takes into account the linkage-disequilibrium structure among variants, and its time complexity is independent of sample size. By applying our small-sample correction techniques, our method achieves high accuracy in both small and large studies. We have shown that our method consistently produces extremely accurate p values (accuracy > 98%) for three human eQTL datasets with different sample sizes and SNP densities: the Genotype-Tissue Expression pilot dataset, the multi-region brain dataset, and the HapMap 3 dataset. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
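A toy sketch of the multivariate-normal idea (not the authors' implementation, which uses analytic machinery and small-sample corrections rather than sampling): approximate the null distribution of the minimum p value across correlated cis-variants by drawing z-scores from a multivariate normal whose covariance is the local LD matrix. The LD matrix and observed p value below are hypothetical.

```python
# Sketch: gene-level p value for the minimum cis-variant p value via
# Monte Carlo sampling from a multivariate normal with LD as covariance.
import numpy as np
from scipy.stats import norm

def egene_pvalue(min_p_observed, ld, n_draws=20_000, seed=1):
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(ld.shape[0]), ld, size=n_draws)
    min_p_null = 2 * norm.sf(np.abs(z).max(axis=1))   # min p per null draw
    return (min_p_null <= min_p_observed).mean()

# toy LD: 5 variants with uniform pairwise correlation 0.6
ld = np.full((5, 5), 0.6) + 0.4 * np.eye(5)
p = egene_pvalue(1e-3, ld)
print(0 < p < 0.05)
```

Because the LD structure is respected, the corrected p value lands between the single-test value (1e-3) and the Bonferroni bound (5e-3), unlike a naive correction that ignores correlation.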
Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...
EVALUATION OF METHODS FOR SAMPLING, RECOVERY, AND ENUMERATION OF BACTERIA APPLIED TO THE PHYLLOPLANE
Determining the fate and survival of genetically engineered microorganisms released into the environment requires the development and application of accurate and practical methods of detection and enumeration. everal experiments were performed to examine quantitative recovery met...
Qualitative and quantitative studies of chemical composition of sandarac resin by GC-MS.
Kononenko, I; de Viguerie, L; Rochut, S; Walter, Ph
2017-01-01
The chemical composition of sandarac resin was investigated qualitatively and quantitatively by gas chromatography-mass spectrometry (GC-MS). Six compounds with labdane and pimarane skeletons were identified in the resin. The obtained mass spectra were interpreted, and the mass spectrometric behaviour of these diterpenoids under EI conditions was described. Quantitative analysis by the internal standard method revealed that the identified diterpenoids represent only 10-30% of the analysed sample. Sandarac resin from different suppliers (Kremer, Okhra, Color Rare, La Marchande de Couleurs, L'Atelier Montessori, Hevea) was analysed. The analysis of different lumps of resin showed that the chemical composition differs from one lump to another, varying mainly in the relative distributions of the components.
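The internal-standard calculation behind such quantitation can be sketched as below; the peak areas, internal-standard mass, response factor, and compound labels are all hypothetical illustrations, not the paper's data.

```python
# Sketch: internal-standard GC-MS quantitation.
# mass of analyte = (A_analyte / A_IS) * m_IS / RRF, where RRF is the
# relative response factor from a calibration mixture (assumed 1.0 here).
def analyte_mass(area_analyte, area_is, mass_is_ug, rrf=1.0):
    return (area_analyte / area_is) * mass_is_ug / rrf

areas = {"diterpenoid A": 4.2e6, "diterpenoid B": 2.1e6}   # hypothetical
a_is, m_is = 8.0e6, 10.0        # internal standard: peak area and mass (ug)
total = sum(analyte_mass(a, a_is, m_is) for a in areas.values())
sample_mass_ug = 40.0
print(round(100 * total / sample_mass_ug, 1))   # identified fraction, %
```

Summing the individually quantified diterpenoids and dividing by the injected sample mass is how a "10-30% identified" figure of this kind is obtained.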
Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings
Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.
2008-01-01
The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often present in addition to the results of bivariate and multivariable analyses. Qualitative metasummary, which includes the extraction, grouping, and formatting of findings, and the calculation of frequency and intensity effect sizes, can be used to produce mixed research syntheses and to conduct a posteriori analyses of the relationship between reports and findings. PMID:17243111
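The two effect sizes named above can be computed directly: the frequency effect size of a finding is the share of reports containing it, and the intensity effect size of a report is the share of all distinct findings it contains. The reports and findings below are hypothetical.

```python
# Sketch: frequency and intensity effect sizes for qualitative metasummary.
def frequency_effect_sizes(findings_by_report):
    all_findings = set().union(*findings_by_report.values())
    n = len(findings_by_report)
    return {f: sum(f in fs for fs in findings_by_report.values()) / n
            for f in all_findings}

def intensity_effect_sizes(findings_by_report):
    all_findings = set().union(*findings_by_report.values())
    return {r: len(fs) / len(all_findings)
            for r, fs in findings_by_report.items()}

reports = {                          # hypothetical extracted findings
    "study1": {"stigma", "disclosure"},
    "study2": {"stigma"},
    "study3": {"stigma", "disclosure", "adherence"},
    "study4": {"adherence"},
}
print(frequency_effect_sizes(reports)["stigma"])   # in 3 of 4 reports
print(intensity_effect_sizes(reports)["study3"])   # has 3 of 3 findings
```

High-frequency findings anchor the synthesis, while intensity values flag the reports that contributed most of the extracted findings.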