Science.gov

Sample records for accurate quantitative analysis

  1. Quantitation and accurate mass analysis of pesticides in vegetables by LC/TOF-MS.

    PubMed

    Ferrer, Imma; Thurman, E Michael; Fernández-Alba, Amadeo R

    2005-05-01

    A quantitative method consisting of solvent extraction followed by liquid chromatography/time-of-flight mass spectrometry (LC/TOF-MS) analysis was developed for the identification and quantitation of three chloronicotinyl pesticides (imidacloprid, acetamiprid, thiacloprid) commonly used on salad vegetables. Accurate mass measurements within 3 ppm error were obtained for all the pesticides studied in various vegetable matrices (cucumber, tomato, lettuce, pepper), which allowed an unequivocal identification of the target pesticides. Calibration curves covering 2 orders of magnitude were linear over the concentration range studied, thus showing the quantitative ability of TOF-MS as a monitoring tool for pesticides in vegetables. Matrix effects were also evaluated using matrix-matched standards, showing no significant interferences between matrices and clean extracts. Intraday reproducibility was 2-3% relative standard deviation (RSD) and interday values were 5% RSD. The precision (standard deviation) of the mass measurements was evaluated and was less than 0.23 mDa between days. Detection limits of the chloronicotinyl insecticides in salad vegetables ranged from 0.002 to 0.01 mg/kg. These limits are at or below the levels set by EU directives for controlled pesticides in vegetables, showing that LC/TOF-MS analysis is a powerful tool for the identification of pesticides in vegetables. The robustness and applicability of the method were validated by the analysis of market vegetable samples. Concentrations found in these samples were in the range of 0.02-0.17 mg/kg of vegetable. PMID:15859598
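
    The quantitation step above rests on a linear calibration curve read back to concentration. A minimal sketch of that workflow; the concentrations and peak areas below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards spanning two orders of magnitude
# (mg/kg) and their measured peak areas (arbitrary units).
conc = np.array([0.002, 0.005, 0.01, 0.05, 0.1, 0.2])
area = np.array([410, 1020, 2050, 10100, 20300, 40600])

# Fit the calibration line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Quantify an unknown sample from its measured peak area
sample_area = 15000.0
sample_conc = (sample_area - intercept) / slope
print(f"estimated concentration: {sample_conc:.4f} mg/kg")
```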

  2. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  3. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples

    PubMed Central

    Mackie, David M.; Jahnke, Justin P.; Benyamin, Marcus S.; Sumner, James J.

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users’ purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users.
    • The algorithm is straightforward and intuitive, yet fast, accurate, and robust.
    • It relies on component spectra, minimization of errors, and local adaptive mesh refinement.
    • It was tested successfully on real mixtures of up to nine components.
    We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells. PMID:26977411
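
    The heart of such an FTIR QA method, expressing a measured mixture spectrum as a weighted sum of pure-component reference spectra and solving for the weights by error minimization, can be sketched as below. The spectra are synthetic stand-ins, and the paper's local adaptive mesh refinement step is omitted:

```python
import numpy as np

# Synthetic stand-ins for pure-component reference spectra:
# rows are spectral points, columns are components.
rng = np.random.default_rng(0)
pure = np.abs(rng.normal(size=(200, 3)))

# A mixture spectrum modeled as a weighted sum of the pure spectra
# plus a little measurement noise.
true_frac = np.array([0.6, 0.3, 0.1])
mixture = pure @ true_frac + rng.normal(scale=0.01, size=200)

# Recover the component weights by linear least squares
# (minimizing the squared spectral residual).
est, *_ = np.linalg.lstsq(pure, mixture, rcond=None)
print(np.round(est, 2))
```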

  4. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  5. Wavelet prism decomposition analysis applied to CARS spectroscopy: a tool for accurate and quantitative extraction of resonant vibrational responses.

    PubMed

    Kan, Yelena; Lensu, Lasse; Hehl, Gregor; Volkmer, Andreas; Vartiainen, Erik M

    2016-05-30

    We propose an approach, based on wavelet prism decomposition analysis, for correcting experimental artefacts in a coherent anti-Stokes Raman scattering (CARS) spectrum. This method allows estimating and eliminating a slowly varying modulation error function in the measured normalized CARS spectrum and yields a corrected CARS line-shape. The main advantage of the approach is that the spectral phase and amplitude corrections are avoided in the retrieved Raman line-shape spectrum, thus significantly simplifying the quantitative reconstruction of the sample's Raman response from a normalized CARS spectrum in the presence of experimental artefacts. Moreover, the approach obviates the need for assumptions about the modulation error distribution and the chemical composition of the specimens under study. The method is quantitatively validated on normalized CARS spectra recorded for equimolar aqueous solutions of D-fructose, D-glucose, and their disaccharide combination sucrose. PMID:27410113

  6. Quantitative Proteome Analysis of Human Plasma Following in vivo Lipopolysaccharide Administration using 16O/18O Labeling and the Accurate Mass and Time Tag Approach

    PubMed Central

    Qian, Wei-Jun; Monroe, Matthew E.; Liu, Tao; Jacobs, Jon M.; Anderson, Gordon A.; Shen, Yufeng; Moore, Ronald J.; Anderson, David J.; Zhang, Rui; Calvano, Steve E.; Lowry, Stephen F.; Xiao, Wenzhong; Moldawer, Lyle L.; Davis, Ronald W.; Tompkins, Ronald G.; Camp, David G.; Smith, Richard D.

    2007-01-01

    Identification of novel diagnostic or therapeutic biomarkers from human blood plasma would benefit significantly from quantitative measurements of the proteome constituents over a range of physiological conditions. Herein we describe an initial demonstration of proteome-wide quantitative analysis of human plasma. The approach utilizes post-digestion trypsin-catalyzed 16O/18O peptide labeling, two-dimensional liquid chromatography (LC)-Fourier transform ion cyclotron resonance (FTICR) mass spectrometry, and the accurate mass and time (AMT) tag strategy to identify and quantify peptides/proteins from complex samples. A peptide accurate mass and LC-elution time AMT tag database was initially generated using tandem mass spectrometry (MS/MS) following extensive multidimensional LC separations to provide the basis for subsequent peptide identifications. The AMT tag database contains >8,000 putatively identified peptides, providing 938 confident plasma protein identifications. The quantitative approach was applied, without depletion of highly abundant proteins, for comparative analyses of plasma samples from an individual prior to and 9 h after lipopolysaccharide (LPS) administration. Accurate quantification of changes in protein abundance was demonstrated by both 1:1 labeling of control plasma and the comparison between the plasma samples following LPS administration. A total of 429 distinct plasma proteins were quantified from the comparative analyses and the protein abundances for 25 proteins, including several known inflammatory response mediators, were observed to change significantly following LPS administration. PMID:15753121

  7. Groundtruth approach to accurate quantitation of fluorescence microarrays

    SciTech Connect

    Mascio-Kegelmeyer, L; Tomascik-Cheeseman, L; Burnett, M S; van Hummelen, P; Wyrobek, A J

    2000-12-01

    To more accurately measure fluorescent signals from microarrays, we calibrated our acquisition and analysis systems by using groundtruth samples comprised of known quantities of red and green gene-specific DNA probes hybridized to cDNA targets. We imaged the slides with a full-field, white light CCD imager and analyzed them with our custom analysis software. Here we compare, for multiple genes, results obtained with and without preprocessing (alignment, color crosstalk compensation, dark field subtraction, and integration time). We also evaluate the accuracy of various image processing and analysis techniques (background subtraction, segmentation, quantitation and normalization). This methodology calibrates and validates our system for accurate quantitative measurement of microarrays. Specifically, we show that preprocessing the images produces results significantly closer to the known ground-truth for these samples.

  8. Quantitative Proteome Analysis of Human Plasma Following in vivo Lipopolysaccharide Administration using O-16/O-18 Labeling and the Accurate Mass and Time Tag Approach

    SciTech Connect

    Qian, Weijun; Monroe, Matthew E.; Liu, Tao; Jacobs, Jon M.; Anderson, Gordon A.; Shen, Yufeng; Moore, Ronald J.; Anderson, David J.; Zhang, Rui; Calvano, Steven E.; Lowry, Stephen F.; Xiao, Wenzhong; Moldawer, Lyle L.; Davis, Ronald W.; Tompkins, Ronald G.; Camp, David G.; Smith, Richard D.

    2005-05-01

    Identification of novel diagnostic or therapeutic biomarkers from human blood plasma would benefit significantly from quantitative measurements of the proteome constituents over a range of physiological conditions. We describe here an initial demonstration of proteome-wide quantitative analysis of human plasma. The approach utilizes post-digestion trypsin-catalyzed 16O/18O labeling, two-dimensional liquid chromatography (LC)-Fourier transform ion cyclotron resonance (FTICR) mass spectrometry, and the accurate mass and time (AMT) tag strategy for identification and quantification of peptides/proteins from complex samples. A peptide mass and time tag database was initially generated using tandem mass spectrometry (MS/MS) following extensive multidimensional LC separations, and this database serves as a ‘look-up’ table for peptide identification. The mass and time tag database contains >8,000 putatively identified peptides, which yielded 938 confident plasma protein identifications. The quantitative approach was applied to the comparative analyses of plasma samples from an individual prior to and 9 hours after lipopolysaccharide (LPS) administration, without depletion of highly abundant proteins. Accurate quantification of changes in protein abundance was demonstrated with both 1:1 labeling of control plasma and the comparison between the plasma samples following LPS administration. A total of 429 distinct plasma proteins were quantified from the comparative analyses and the protein abundances for 28 proteins were observed to be significantly changed following LPS administration, including several known inflammatory response mediators.

  9. Fast and Accurate Detection of Multiple Quantitative Trait Loci

    PubMed Central

    Nettelblad, Carl; Holmgren, Sverker

    2013-01-01

    We present a new computational scheme that enables efficient and reliable quantitative trait loci (QTL) scans for experimental populations. A standard brute-force exhaustive search effectively prohibits accurate QTL scans involving more than two loci from being performed in practice, at least if permutation testing is used to determine significance. More elaborate global optimization approaches, for example DIRECT, have previously been applied to QTL search problems, and dramatic speedups have been reported for high-dimensional scans. However, since a heuristic termination criterion must be used in these types of algorithms, the accuracy of the optimization process cannot be guaranteed. Indeed, earlier results show that a small bias in the significance thresholds is sometimes introduced. Our new optimization scheme, PruneDIRECT, is based on an analysis leading to a computable (Lipschitz) bound on the slope of a transformed objective function. The bound is derived for both infinite- and finite-size populations. Introducing a Lipschitz bound in DIRECT leads to an algorithm related to classical Lipschitz optimization. Regions in the search space can be permanently excluded (pruned) during the optimization process, so heuristic termination criteria can be avoided. Hence, PruneDIRECT has a well-defined error bound and can in practice be guaranteed to be equivalent to a corresponding exhaustive search. We present simulation results showing that for simultaneous mapping of three QTLs using permutation testing, PruneDIRECT is typically more than 50 times faster than exhaustive search. The speedup is higher for stronger QTLs, which could be used to quickly detect strong candidate eQTL networks. PMID:23919387
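
    The pruning step at the core of PruneDIRECT, permanently excluding any region whose Lipschitz upper bound cannot beat the best value found so far, can be illustrated with a toy one-dimensional maximization (the objective function and constants are invented for illustration):

```python
def can_prune(f_center, half_width, lipschitz, best_so_far):
    # With |f(x) - f(c)| <= lipschitz * |x - c|, no point of the
    # interval can exceed f(c) + lipschitz * half_width; if even that
    # optimistic bound falls below the incumbent, discard the region.
    return f_center + lipschitz * half_width < best_so_far

f = lambda x: -(x - 0.3) ** 2   # toy objective to maximize on [0, 1]
L = 2.0                         # a valid Lipschitz constant for f on [0, 1]
best = f(0.3)                   # suppose the optimum has been found

# The interval [0.8, 1.0] (center 0.9, half-width 0.1) can be excluded,
# while an interval near the optimum cannot:
print(can_prune(f(0.9), 0.1, L, best), can_prune(f(0.35), 0.1, L, best))
```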

  10. Differential Label-free Quantitative Proteomic Analysis of Shewanella oneidensis Cultured under Aerobic and Suboxic Conditions by Accurate Mass and Time Tag Approach

    SciTech Connect

    Fang, Ruihua; Elias, Dwayne A.; Monroe, Matthew E.; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D.; Callister, Stephen J.; Moore, Ronald J.; Gorby, Yuri A.; Adkins, Joshua N.; Fredrickson, Jim K.; Lipton, Mary S.; Smith, Richard D.

    2006-04-01

    We describe the application of liquid chromatography coupled to mass spectrometry (LC/MS), without the use of stable isotope labeling, for differential quantitative proteomic analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and sub-oxic conditions. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) was used to initially identify peptide sequences, and LC coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR) was used to confirm these identifications as well as measure relative peptide abundances. A total of 2343 peptides, covering 668 proteins, were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly according to statistical approaches such as SAM, while another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis was transitioned from aerobic to sub-oxic conditions.

  11. DEVELOPMENT OF AN IN SITU THERMAL EXTRACTION DETECTION SYSTEM (TEDS) FOR RAPID, ACCURATE, QUANTITATIVE ANALYSIS OF ENVIRONMENTAL POLLUTANTS IN THE SUBSURFACE - PHASE I

    EPA Science Inventory

    Ion Signature Technology, Inc. (IST) will develop and market a collection and analysis system that will retrieve soil-bound pollutants, as well as soluble and non-soluble contaminants from groundwater, as the probe is pushed by cone penetrometry or Geoprobe into the subsurface. ...

  12. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  13. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to help the company meet rules and regulations and to assess and describe environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life, and in addition it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  14. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate the effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.

  15. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-01

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC-MS/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: standard addition, background subtraction, surrogate matrix, and surrogate analyte. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines needed to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. In the surrogate matrix approach, by contrast, various matrices such as artificial, stripped, and neat matrices are used as surrogates for the actual matrix of the study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods are indirect approaches to quantifying endogenous compounds, and regardless of which approach is followed, it has to be shown that none of the validation criteria have been compromised by the indirect analysis. PMID
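
    Of the four approaches, standard addition is the easiest to show concretely: aliquots of the sample are spiked with known analyte amounts, the response line is fitted, and the endogenous concentration is read from the x-intercept. A minimal sketch with invented numbers:

```python
import numpy as np

# Hypothetical standard-addition data: analyte spiked into sample
# aliquots (ng/mL) and the resulting instrument responses.
added = np.array([0.0, 1.0, 2.0, 4.0])
signal = np.array([5.2, 10.1, 15.3, 25.0])

slope, intercept = np.polyfit(added, signal, 1)

# The magnitude of the x-intercept estimates the endogenous level.
endogenous = intercept / slope
print(f"endogenous concentration: {endogenous:.2f} ng/mL")
```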

  16. Designer cantilevers for even more accurate quantitative measurements of biological systems with multifrequency AFM

    NASA Astrophysics Data System (ADS)

    Contera, S.

    2016-04-01

    Multifrequency excitation/monitoring of cantilevers has made it possible both to achieve fast, relatively simple, nanometre-resolution quantitative mapping of the mechanical properties of biological systems in solution using atomic force microscopy (AFM), and to achieve single-molecule-resolution detection with nanomechanical biosensors. A recent paper by Penedo et al [2015 Nanotechnology 26 485706] has made a significant contribution by developing simple methods to improve the signal-to-noise ratio in liquid environments by selectively enhancing cantilever modes, which will lead to even more accurate quantitative measurements.

  17. Pyrosequencing for Accurate Imprinted Allele Expression Analysis

    PubMed Central

    Yang, Bing; Damaschke, Nathan; Yao, Tianyu; McCormick, Johnathon; Wagner, Jennifer; Jarrard, David

    2016-01-01

    Genomic imprinting is an epigenetic mechanism that restricts gene expression to one inherited allele. Improper maintenance of imprinting has been implicated in a number of human diseases and developmental syndromes. Assays are needed that can quantify the contribution of each parental allele to a gene expression profile. We have developed a rapid, sensitive quantitative assay for the measurement of individual allelic ratios termed Pyrosequencing for Imprinted Expression (PIE). Advantages of PIE over other approaches include shorter experimental time, decreased labor, no need for restriction endonuclease enzymes at polymorphic sites, and prevention of the heteroduplex formation that is problematic in quantitative PCR-based methods. We demonstrate the improved sensitivity of PIE, including the ability to detect differences in allelic expression down to 1%. The assay is capable of measuring genomic heterozygosity as well as imprinting in a single run. PIE is applied to determine the status of Insulin-like Growth Factor-2 (IGF2) imprinting in human and mouse tissues. PMID:25581900
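
    The allelic-ratio readout such an assay produces ultimately reduces to a proportion of allele-specific signal; a toy computation with hypothetical peak heights (not data from the paper):

```python
# Hypothetical pyrosequencing peak heights for the two alleles at a
# heterozygous site (arbitrary units).
a_peak, b_peak = 96.0, 4.0

allele_a_pct = 100.0 * a_peak / (a_peak + b_peak)
print(f"allele A: {allele_a_pct:.1f}%")  # a strong skew suggests imprinting
```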

  18. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale converting protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, confirming that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics. PMID:25420202
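
    A plausible sketch of the grayscale conversion and darkness-ratio step described above. The luminosity weights are the common ITU-R BT.601 ones; the ODR normalization shown is one illustrative form, not necessarily the paper's exact definition:

```python
def luminosity(r, g, b):
    # Weighted-average ("luminosity") grayscale conversion.
    return 0.299 * r + 0.587 * g + 0.114 * b

def optical_darkness_ratio(spot_rgb, background_rgb):
    # 0 = spot as light as the background, 1 = completely black spot.
    g_spot = luminosity(*spot_rgb)
    g_bg = luminosity(*background_rgb)
    return (g_bg - g_spot) / g_bg

odr = optical_darkness_ratio((40, 40, 40), (240, 240, 240))
print(round(odr, 3))
```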

  19. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered the modalities using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15 %). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30 % can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants. PMID:27153828

  20. Quantitative intracerebral brain hemorrhage analysis

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Dhawan, Atam P.; Cosic, Dubravko; Kovacevic, Domagoj; Broderick, Joseph; Brott, Thomas

    1999-05-01

    In this paper a system for 3-D quantitative analysis of human spontaneous intracerebral brain hemorrhage (ICH) is described. The purpose of the developed system is to perform quantitative 3-D measurements of the parameters of the ICH region from computed tomography (CT) images. The parameter measured in this phase of the system development is the volume of the hemorrhage region. The goal of the project is to measure these parameters for a large number of patients having ICH and to correlate the measured parameters with patient morbidity and mortality.
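
    Volume measurement from segmented CT data reduces to counting segmented voxels and scaling by the voxel size. A minimal sketch; the mask and spacings are made up for illustration:

```python
import numpy as np

# Toy segmented CT stack: slices x rows x cols, True = hemorrhage voxel.
mask = np.zeros((10, 64, 64), dtype=bool)
mask[3:7, 20:40, 20:40] = True

# Voxel volume = in-plane pixel spacing squared x slice thickness (mm).
voxel_mm3 = 0.45 * 0.45 * 5.0
volume_ml = mask.sum() * voxel_mm3 / 1000.0   # 1 mL = 1000 mm^3
print(f"ICH volume: {volume_ml:.2f} mL")
```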

  1. Software for quantitative trait analysis

    PubMed Central

    2005-01-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed. PMID:16197737

  2. Quantitative spectroscopy of hot stars: accurate atomic data applied on a large scale as driver of recent breakthroughs

    NASA Astrophysics Data System (ADS)

    Przybilla, Norbert; Schaffenroth, Veronika; Nieva, Maria-Fernanda

    2015-08-01

    OB-type stars present hotbeds for non-LTE physics because of their strong radiation fields, which drive the atmospheric plasma out of local thermodynamic equilibrium. We report on recent breakthroughs in the quantitative analysis of the optical and UV spectra of OB-type stars that were facilitated by the application of accurate and precise atomic data on a large scale. An astrophysicist's dream has come true: observed and model spectra can be brought into close match over wide parts of the observed wavelength ranges. This allows tight observational constraints to be derived from OB-type stars for wide applications in astrophysics. However, despite the progress made, many details of the modelling may be improved further. We discuss atomic data needs in terms of laboratory measurements and ab-initio calculations. Particular emphasis is given to quantitative spectroscopy in the near-IR, which will be in focus in the era of the upcoming extremely large telescopes.

  3. Accurate and molecular-size-tolerant NMR quantitation of diverse components in solution

    PubMed Central

    Okamura, Hideyasu; Nishimura, Hiroshi; Nagata, Takashi; Kigawa, Takanori; Watanabe, Takashi; Katahira, Masato

    2016-01-01

    Determining the amount of each component of interest in a mixture is a fundamental first step in characterizing the nature of a solution and in developing possible means of utilizing its components. Similarly, determining the composition of units in complex polymers, or polymer mixtures, is crucial. Although NMR is recognized as one of the most powerful methods to achieve this and is widely used in many fields, variation in the molecular sizes or the relative mobilities of components skews quantitation owing to the size-dependent decay of magnetization. Here, a method to accurately determine the amount of each component by NMR was developed. This method was validated using a solution that contains biomass-related components in which the molecular sizes greatly differ. The method is also tolerant of other factors that skew quantitation, such as variation in the one-bond C–H coupling constant. The developed method is the first and only way to reliably overcome the skewed quantitation caused by several different factors to provide basic information on the correct amount of each component in a solution. PMID:26883279

  4. Image analysis and quantitative morphology.

    PubMed

    Mandarim-de-Lacerda, Carlos Alberto; Fernandes-Santos, Caroline; Aguila, Marcia Barbosa

    2010-01-01

    Quantitative studies are increasingly found in the literature, particularly in the fields of development/evolution, pathology, and neurosciences. Image digitalization converts tissue images into a numeric form by dividing them into very small regions termed picture elements or pixels. Image analysis allows automatic morphometry of digitalized images, and stereology aims to understand the structural inner three-dimensional arrangement based on the analysis of slices showing two-dimensional information. To quantify morphological structures in an unbiased and reproducible manner, appropriate isotropic and uniform random sampling of sections and updated stereological tools are needed. Through the correct use of stereology, a quantitative study can be performed with little effort; efficiency in stereology means as little counting as possible (less work) and low cost (section preparation) while still achieving good accuracy. This short text provides a background guide for non-expert morphologists. PMID:19960334

  5. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization-sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images introduces a noise-bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a-posteriori (MAP) estimator and demonstrated quantitative birefringence imaging [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved MAP estimator that takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and
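
    A MAP estimator of this table-lookup kind can be sketched as follows; the discretized PDF values and bin name are invented for illustration (in the paper, the PDF is pre-computed by Monte-Carlo simulation of the JM-OCT model).

```python
def map_estimate(pdf, measurement):
    """Return the candidate true retardation that maximizes the
    precomputed probability of the observed measurement (a minimal
    MAP sketch with a flat prior; table values are hypothetical)."""
    return max(pdf, key=lambda candidate: pdf[candidate][measurement])

# pdf[candidate][measurement_bin]: probability of observing that bin
# given the candidate true retardation (would be MC pre-computed).
pdf = {
    0.1: {"low_snr_bin": 0.2},
    0.3: {"low_snr_bin": 0.7},
    0.5: {"low_snr_bin": 0.1},
}
print(map_estimate(pdf, "low_snr_bin"))  # 0.3
```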

  6. Mass Spectrometry Provides Accurate and Sensitive Quantitation of A2E

    PubMed Central

    Gutierrez, Danielle B.; Blakeley, Lorie; Goletz, Patrice W.; Schey, Kevin L.; Hanneken, Anne; Koutalos, Yiannis; Crouch, Rosalie K.; Ablonczy, Zsolt

    2010-01-01

    Orange autofluorescence from lipofuscin in the lysosomes of the retinal pigment epithelium (RPE) is a hallmark of aging in the eye. One of the major components of lipofuscin is A2E, the levels of which increase with age and in pathologic conditions, such as Stargardt disease or age-related macular degeneration. In vitro studies have suggested that A2E is highly phototoxic and, more specifically, that A2E and its oxidized derivatives contribute to RPE damage and subsequent photoreceptor cell death. To date, absorption spectroscopy has been the primary method to identify and quantitate A2E. Here, a new mass spectrometric method was developed for the specific detection of low levels of A2E and compared to a traditional method of analysis. The new mass spectrometry method allows the detection and quantitation of approximately 10,000-fold less A2E than absorption spectroscopy and the detection and quantitation of low levels of oxidized A2E, with localization of the oxidation sites. This study suggests that identification and quantitation of A2E from tissue extracts by chromatographic absorption spectroscopy overestimates the amount of A2E. This mass spectrometry approach makes it possible to detect low levels of A2E and its oxidized metabolites with greater accuracy than traditional methods, thereby facilitating a more exact analysis of bis-retinoids in animal models of inherited retinal degeneration as well as in normal and diseased human eyes. PMID:20931136

  7. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Applications range from quantification of cellular, tissue, organismal, and behavioral phenotypes of model organisms to human facial phenotypes. Bio-imaging approaches that automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses. PMID:26850283

  8. Optimization of quantitative infrared analysis

    NASA Astrophysics Data System (ADS)

    Duerst, Richard W.; Breneman, W. E.; Dittmar, Rebecca M.; Drugge, Richard E.; Gagnon, Jim E.; Pranis, Robert A.; Spicer, Colleen K.; Stebbings, William L.; Westberg, J. W.; Duerst, Marilyn D.

    1994-01-01

    A number of industrial processes, especially quality assurance procedures, accept information on relative quantities of components in mixtures whenever absolute values for the quantitative analysis are unavailable. These relative quantities may be determined from infrared intensity ratios even when known standards are unavailable. Repeatability (vs. precision) in quantitative analysis is a critical parameter for meaningful results. In any given analysis, multiple runs provide "answers" with a certain standard deviation. Obviously, the lower the standard deviation, the better the precision. In attempting to minimize the standard deviation and thus improve precision, we need to delineate which contributing factors we have control over (such as sample preparation techniques and data analysis methodology) and which factors we have little control over (environmental and instrument noise, for example). For a given set of conditions, the best instrumental precision achievable on an IR instrument should be determinable. Traditionally, the term "signal-to-noise" (S/N) has been used for a single spectrum, recognizing that S/N improves with an increase in the number of scans coadded to generate that single spectrum. However, the S/N ratio does not directly reflect the precision achievable for an absorbing band. We prefer the phrase "maximum achievable instrument precision" (MAIP), which is equivalent to the minimum relative standard deviation for a given peak (either height or area) in the spectra. For a specific analysis, the analyst should have in mind the desired precision. Only if the desired precision is less than the MAIP will the analysis be feasible. Once the MAIP is established, other experimental procedures may be modified to improve the analytical precision, if it is below that which is expected (the MAIP).

  9. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  10. Bright-field quantitative phase microscopy (BFQPM) for accurate phase imaging using conventional microscopy hardware

    NASA Astrophysics Data System (ADS)

    Jenkins, Micah; Gaylord, Thomas K.

    2015-03-01

    Most quantitative phase microscopy methods require the use of custom-built or modified microscopic configurations which are not typically available to most bio/pathologists. There are, however, phase retrieval algorithms which utilize defocused bright-field images as input data and are therefore implementable in existing laboratory environments. Among these, deterministic methods such as those based on inverting the transport-of-intensity equation (TIE) or a phase contrast transfer function (PCTF) are particularly attractive due to their compatibility with Köhler illuminated systems and numerical simplicity. Recently, a new method has been proposed, called multi-filter phase imaging with partially coherent light (MFPI-PC), which alleviates the inherent noise/resolution trade-off in solving the TIE by utilizing a large number of defocused bright-field images spaced equally about the focal plane. Despite greatly improving the state-of-the-art, the method has many shortcomings including the impracticality of high-speed acquisition, inefficient sampling, and attenuated response at high frequencies due to aperture effects. In this report, we present a new method, called bright-field quantitative phase microscopy (BFQPM), which efficiently utilizes a small number of defocused bright-field images and recovers frequencies out to the partially coherent diffraction limit. The method is based on a noise-minimized inversion of a PCTF derived for each finite defocus distance. We present simulation results which indicate nanoscale optical path length sensitivity and improved performance over MFPI-PC. We also provide experimental results imaging live bovine mesenchymal stem cells at sub-second temporal resolution. In all, BFQPM enables fast and accurate phase imaging with unprecedented spatial resolution using widely available bright-field microscopy hardware.

  11. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired parts, such as the images of blood vessels and ribs, in the chest x-ray photograph. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
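
    The size classification by equivalent radius mentioned above reduces to the radius of the circle with the same area as the detected opacity; the pixel spacing below is a made-up value for illustration.

```python
import math

def equivalent_radius(area, pixel_size=1.0):
    """Radius of the circle whose area equals the detected opacity's
    area; `pixel_size` converts pixel units to physical units."""
    return math.sqrt(area / math.pi) * pixel_size

# An opacity covering 100*pi square pixels has equivalent radius 10:
print(round(equivalent_radius(math.pi * 100), 6))  # 10.0
```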

  12. A Quantitative Fitness Analysis Workflow

    PubMed Central

    Lydall, D.A.

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel1,2,3,4. QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods5,6. However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases3. For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously1. Here we present a specific QFA protocol applied to thousands of S. 
cerevisiae cultures which are automatically handled by robots during inoculation, incubation and imaging
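
    The doubling-time estimation that QFA performs on its growth curves can be sketched with a log-linear fit over an exponential-phase window; the time series below is simulated (real QFA fits full growth curves including saturation).

```python
import math

def doubling_time(times, densities):
    """Estimate culture doubling time (same units as `times`) by
    least-squares regression of log(density) on time, over a window
    assumed to lie in the exponential growth phase."""
    logs = [math.log(d) for d in densities]
    n = len(times)
    mt = sum(times) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (y - ml) for t, y in zip(times, logs))
             / sum((t - mt) ** 2 for t in times))
    return math.log(2) / slope

# Simulated culture doubling every 2 h:
times = [0, 1, 2, 3, 4]
densities = [1000 * 2 ** (t / 2) for t in times]
print(round(doubling_time(times, densities), 6))  # 2.0
```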

  13. A quantitative fitness analysis workflow.

    PubMed

    Banks, A P; Lawless, C; Lydall, D A

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. 
cerevisiae cultures which are automatically handled by robots during inoculation, incubation and

  14. Multiobjective optimization in quantitative structure-activity relationships: deriving accurate and interpretable QSARs.

    PubMed

    Nicolotti, Orazio; Gillet, Valerie J; Fleming, Peter J; Green, Darren V S

    2002-11-01

    Deriving quantitative structure-activity relationship (QSAR) models that are accurate, reliable, and easily interpretable is a difficult task. In this study, two new methods have been developed that aim to find useful QSAR models that represent an appropriate balance between model accuracy and complexity. Both methods are based on genetic programming (GP). The first method, referred to as genetic QSAR (or GPQSAR), uses a penalty function to control model complexity. GPQSAR is designed to derive a single linear model that represents an appropriate balance between the variance and the number of descriptors selected for the model. The second method, referred to as multiobjective genetic QSAR (MoQSAR), is based on multiobjective GP and represents a new way of thinking of QSAR. Specifically, QSAR is considered as a multiobjective optimization problem that comprises a number of competitive objectives. Typical objectives include model fitting, the total number of terms, and the occurrence of nonlinear terms. MoQSAR results in a family of equivalent QSAR models where each QSAR represents a different tradeoff in the objectives. A practical consideration often overlooked in QSAR studies is the need for the model to promote an understanding of the biochemical response under investigation. To accomplish this, chemically intuitive descriptors are needed but do not always give rise to statistically robust models. This problem is addressed by the addition of a further objective, called chemical desirability, that aims to reward models that consist of descriptors that are easily interpretable by chemists. GPQSAR and MoQSAR have been tested on various data sets including the Selwood data set and two different solubility data sets. The study demonstrates that the MoQSAR method is able to find models that are at least as good as models derived using standard statistical approaches and also yields models that allow a medicinal chemist to trade statistical robustness for chemical
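
    The multiobjective selection behind MoQSAR rests on Pareto dominance between candidate models; a minimal dominance check is sketched below, with objective vectors oriented so that higher is better (the example objectives are hypothetical).

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b`: at least as
    good in every objective and strictly better in at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

# (model fit, -number of terms): fewer terms encoded as a higher value.
print(dominates((0.90, -3), (0.85, -3)))  # True
print(dominates((0.90, -6), (0.85, -3)))  # False: fits better, but more complex
```

    Models that no other model dominates form the Pareto family of equivalent QSARs the abstract describes.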

  15. Quantitative analysis of sandstone porosity

    SciTech Connect

    Ferrell, R.E. Jr.; Carpenter, P.K.

    1988-01-01

    A quantitative analysis of changes in porosity associated with sandstone diagenesis was accomplished with digital back-scattered electron image analysis techniques. The volume percent (vol. %) of macroporosity, quartz, clay minerals, feldspar, and other constituents combined with stereological parameters, such as the size and shape of the analyzed features, permitted the determination of cement volumes, the ratio of primary to secondary porosity, and the relative abundance of detrital and authigenic clay minerals. The analyses were produced with a JEOL 733 Superprobe and a TRACOR/NORTHERN 5700 Image Analyzer System. The results provided a numerical evaluation of sedimentological facies controls and diagenetic effects on the permeabilities of potential reservoirs. In a typical application, subtle differences in the diagenetic development of porosity were detected in Wilcox sandstones from central Louisiana. Mechanical compaction of these shoreface sandstones has reduced the porosity to approximately 20%. In most samples with permeabilities greater than 10 md, the measured ratio of macroporosity to microporosity associated with pore-filling kaolinite was 3:1. In other sandstones with lower permeabilities, the measured ratio was higher, but the volume of pore-filling clay was essentially the same. An analysis of the frequency distribution of pore diameters and shapes revealed that the latter samples contained 2-3 vol% of grain-dissolution or moldic porosity. Fluid entry to these large pores was restricted and the clays produced from the grain dissolution products reduced the observed permeability. The image analysis technique provided valuable data for the distinction of productive and nonproductive intervals in this reservoir.

  16. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

    We propose an inter-Ascan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
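
    The core quantity in such an analysis is the decorrelation between successive A-scans, which grows with transverse flow speed. The abstract does not give the exact estimator, so the Pearson-based version below is an illustrative stand-in.

```python
import math

def decorrelation(a, b):
    """1 minus the Pearson correlation of two A-scan intensity
    profiles: 0 for identical speckle patterns, larger as the
    patterns decorrelate (up to 2 for perfectly anti-correlated)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return 1 - cov / math.sqrt(va * vb)

profile = [1.0, 2.0, 4.0, 2.0, 1.0]
print(decorrelation(profile, profile))  # 0.0: no decorrelation
```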

  17. Quantitative Analysis of Glaciated Landscapes

    NASA Astrophysics Data System (ADS)

    Huerta, A. D.

    2005-12-01

    The evolution of glaciated mountains is at the heart of the debate over Late Cenozoic linkages between climate and tectonics. Traditionally, the development of high summit elevations is attributed to tectonic processes. However, much of the high elevation of the Transantarctic Mountains can be attributed solely to uplift in response to glacial erosion (Stern et al., 2005). The Transantarctic Mountains (TAM) provide an unparalleled opportunity to study glacial erosion. The mountain range has experienced glacial conditions since Oligocene time. In the higher and drier regions of the TAM there is only a thin veneer of ice and snow draping the topography. In these regions landforms that were shaped during earlier climatic conditions are preserved. In fact, both glacial and fluvial landforms dating as far back as 18 Ma are preserved locally. In addition, the TAM are ideal for studying glacial erosion since the range has experienced minimal tectonic uplift since late Oligocene time, thus isolating the erosion signal from any tectonic signal. With the advent of digital data sets and GIS methodologies, quantitative analysis can identify key aspects of glaciated landscape morphology, and thus develop powerful analytical techniques for objective study of glaciation. Inspection of USGS topographic maps of the TAM reveals that mountain tops display an extreme range of glacial modification. For example, in the Mt. Rabot region (83°-84° S), mountain peaks are strongly affected by glaciation; cirque development is advanced with cirque diameters on the range of several kilometers, and cirque confluence has resulted in the formation of ``knife-edge'' arêtes up to 10 km long. In contrast, in the Mt. Murchison area (73°-74° S) cirque development is youthful, and there is minimal development of arêtes. Preliminary work indicates that analysis of DEM's and contour lines can be used to distinguish degree of glaciation. In particular, slope, curvature, and power spectrum analysis
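
    Slope, one of the DEM-derived measures mentioned, can be sketched with central differences on an elevation grid; the grid values and cell size below are hypothetical.

```python
import math

def slope_deg(dem, i, j, cell_size):
    """Slope in degrees at interior grid cell (i, j) of an elevation
    grid `dem`, via central differences in x and y."""
    dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell_size)
    dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell_size)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

# A plane rising 1 m per metre in x gives a 45-degree slope:
dem = [[float(j * 10) for j in range(3)] for _ in range(3)]
print(round(slope_deg(dem, 1, 1, 10.0), 6))  # 45.0
```

    Curvature follows the same finite-difference pattern using second derivatives.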

  18. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature, retinal lesions as well as facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma. PMID:27503080
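
    Once layer surfaces are segmented (e.g. by LOGISMOS), regional thickness indices reduce to simple per-A-scan arithmetic; the surface positions and axial resolution below are made-up values for illustration.

```python
def layer_thickness_um(top, bottom, axial_res_um):
    """Per-A-scan layer thickness in micrometres, from segmented top
    and bottom surface positions given in pixels."""
    return [(b - t) * axial_res_um for t, b in zip(top, bottom)]

# Hypothetical surface positions for three A-scans, 4 um per pixel:
thick = layer_thickness_um([10, 11, 12], [30, 32, 31], 4.0)
print(thick)                     # [80.0, 84.0, 76.0]
print(sum(thick) / len(thick))   # mean regional thickness: 80.0
```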

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a mean to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321

  20. Accurate analysis of EBSD data for phase identification

    NASA Astrophysics Data System (ADS)

    Palizdar, Y.; Cochrane, R. C.; Brydson, R.; Leary, R.; Scott, A. J.

    2010-07-01

    This paper aims to investigate the reliability of software default settings in the analysis of EBSD results. To study the effect of software settings on EBSD results, the presence of different phases in high-Al steel has been investigated by EBSD. The results show the importance of appropriate automated analysis parameters for valid and reliable phase discrimination. Specifically, the importance of the minimum number of indexed bands and the maximum solution error has been investigated, with values of 7-9 and 1.0-1.5°, respectively, found to be needed for accurate analysis.
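
    Thresholding on a minimum number of indexed bands and a maximum solution error amounts to a simple filter over indexed points; the field names and default values below are illustrative choices within the ranges the study reports.

```python
def reliable_points(points, min_bands=8, max_error_deg=1.2):
    """Keep only EBSD solutions meeting the band-count and
    solution-error thresholds (defaults within the 7-9 band and
    1.0-1.5 degree ranges found necessary in the study)."""
    return [p for p in points
            if p["bands"] >= min_bands and p["error_deg"] <= max_error_deg]

points = [
    {"phase": "ferrite",   "bands": 9,  "error_deg": 0.8},  # kept
    {"phase": "austenite", "bands": 6,  "error_deg": 0.5},  # too few bands
    {"phase": "ferrite",   "bands": 10, "error_deg": 2.0},  # error too high
]
print(len(reliable_points(points)))  # 1
```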

  1. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When a number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution using a rotating polarization radar to enhance the wingbeat frequency method of identification are presented.

  2. Development and evaluation of a liquid chromatography-mass spectrometry method for rapid, accurate quantitation of malondialdehyde in human plasma.

    PubMed

    Sobsey, Constance A; Han, Jun; Lin, Karen; Swardfager, Walter; Levitt, Anthony; Borchers, Christoph H

    2016-09-01

    Malondialdehyde (MDA) is a commonly used marker of lipid peroxidation in oxidative stress. To provide a sensitive analytical method that is compatible with high throughput, we developed a multiple reaction monitoring-mass spectrometry (MRM-MS) approach using 3-nitrophenylhydrazine chemical derivatization, isotope-labeling, and liquid chromatography (LC) with an electrospray ionization (ESI)-tandem mass spectrometry assay to accurately quantify MDA in human plasma. A stable isotope-labeled internal standard was used to compensate for ESI matrix effects. The assay is linear (R(2)=0.9999) over a 20,000-fold concentration range with a lower limit of quantitation of 30 fmol (on-column). Intra- and inter-run coefficients of variation (CVs) were <2% and ∼10%, respectively. The derivative was stable for >36 h at 5°C. Standards spiked into plasma had recoveries of 92-98%. When compared to a common LC-UV method, the LC-MS method found near-identical MDA concentrations. A pilot project to quantify MDA in patient plasma samples (n=26) in a study of major depressive disorder with winter-type seasonal pattern (MDD-s) confirmed known associations between MDA concentrations and obesity (p<0.02). The LC-MS method provides high sensitivity and high reproducibility for quantifying MDA in human plasma. The simple sample preparation and rapid analysis time (5× faster than LC-UV) offer high throughput for large-scale clinical applications. PMID:27437618
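
    The validation figures quoted (CVs, spike recoveries) come from standard formulas; the replicate values below are invented for illustration.

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate measurements:
    sample standard deviation relative to the mean."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

def recovery_percent(measured, spiked):
    """Spike recovery (%): measured concentration vs. amount added."""
    return 100 * measured / spiked

print(cv_percent([100.0, 102.0, 98.0]))  # 2.0
print(recovery_percent(9.5, 10.0))       # 95.0
```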

  3. Importance of housekeeping gene selection for accurate reverse transcription-quantitative polymerase chain reaction in a wound healing model.

    PubMed

    Turabelidze, Anna; Guo, Shujuan; DiPietro, Luisa A

    2010-01-01

    Studies in the field of wound healing have utilized a variety of different housekeeping genes for reverse transcription-quantitative polymerase chain reaction (RT-qPCR) analysis. However, nearly all of these studies assume that the selected normalization gene is stably expressed throughout the course of the repair process. The purpose of our current investigation was to identify the most stable housekeeping genes for studying gene expression in mouse wound healing using RT-qPCR. To identify which housekeeping genes are optimal for studying gene expression in wound healing, we examined all articles published in Wound Repair and Regeneration that cited RT-qPCR during the period of January/February 2008 until July/August 2009. We determined that ACTβ, GAPDH, 18S, and β2M were the most frequently used housekeeping genes in human, mouse, and pig studies. We also investigated nine commonly used housekeeping genes that are not generally used in wound healing models: GUS, TBP, RPLP2, ATP5B, SDHA, UBC, CANX, CYC1, and YWHAZ. We observed that wounded and unwounded tissues have contrasting housekeeping gene expression stability. The results demonstrate that commonly used housekeeping genes must be validated as accurate normalizing genes for each individual experimental condition. PMID:20731795
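
    The stability comparison can be sketched by ranking candidates on the dispersion of their quantification-cycle (Cq) values across conditions. This simple SD criterion is a stand-in for dedicated tools such as geNorm or NormFinder, and the Cq values below are invented.

```python
import statistics

def rank_by_stability(cq):
    """Rank candidate housekeeping genes by the standard deviation of
    their Cq values across samples (lower SD = more stable).
    `cq` maps gene name -> list of Cq measurements."""
    return sorted(cq, key=lambda gene: statistics.stdev(cq[gene]))

cq = {
    "GAPDH": [20.0, 24.0, 22.0],  # shifts between conditions
    "TBP":   [28.0, 28.2, 27.9],  # nearly constant
}
print(rank_by_stability(cq))  # ['TBP', 'GAPDH']
```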

  4. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using the feature values of colors, shapes, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  5. Accurate interlaminar stress recovery from finite element analysis

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

    The accuracy and robustness of a two-dimensional smoothing methodology is examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing element discretization, producing strains and first strain gradients of superior accuracy. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are obtained directly from the solution and are subsequently employed in the integration of equilibrium equations to obtain accurate interlaminar shear stresses. The test problem is a simply supported rectangular plate under a doubly sinusoidal load, which has an exact analytic solution that serves as a measure of the quality of the recovered interlaminar shear stresses. The method has the versatility of being applicable to the analysis of rather general and complex structures built of distinct components and materials, such as found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.

  6. Quantitative Analysis of Face Symmetry.

    PubMed

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to quantify the degree of human face symmetry for images taken from the Internet. From the original image of a person, which appears in the center of each triplet, 2 symmetric combinations were constructed: one based on the left part of the image and its mirror image (left-left) and one based on the right part of the image and its mirror image (right-right). Using computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter, and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the right-right and left-left combinations, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry; as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait. PMID:26080172
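
    The abstract does not give the exact formula for C, so the sketch below assumes one plausible definition: the mean relative deviation of a measurement (e.g., mouth length) taken on the two mirrored composites from the same measurement taken on the original image. All values are hypothetical.

```python
def symmetry_index(original, left_left, right_right):
    """Hypothetical symmetry index C: mean relative deviation of the
    two mirrored-composite measurements from the original image's
    measurement. Lower C = higher symmetry."""
    dev_ll = abs(left_left - original) / original
    dev_rr = abs(right_right - original) / original
    return (dev_ll + dev_rr) / 2

# Mouth length (hypothetical units) measured on the three images
print(symmetry_index(original=50.0, left_left=48.0, right_right=53.0))
```

    With these numbers the index is 0.05; a perfectly symmetric face would give 0 because both composites would reproduce the original measurement.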

  7. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises that illustrate the concepts. It covers the basic concepts of quantitative analysis for imaging, which rest on a well-established foundation of signal theory and quantitative data analysis. Several examples develop an understanding of the imaging process as a transformation from sample to image, along with the limits and considerations of quantitative analysis. The chapter introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some frequently encountered issues in quantization. Image processing is a form of data processing, comparable to, for example, fitting data to a theoretical curve; in all such cases, care must be taken during every step of transformation, processing, and quantization. PMID:23931513

  8. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  9. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    PubMed

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

    An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl and other volatile compounds from wine and beer has been developed and validated. Method accuracy is based on the nearly quantitative transference of volatile compounds from the sample to the ITEX trap. For achieving that goal, most methodological aspects and parameters were carefully examined. The vial and sample sizes and the trapping materials were found to be critical due to the pernicious saturation effects of ethanol. Small 2 mL vials containing very small amounts of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resins could guarantee a complete trapping of sample vapors. The complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are further desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. The validation of the method showed satisfactory figures of merit. Determination coefficients were better than 0.995 and repeatability was better than 7% in all cases. Reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of these target compounds in wine and beer and well below the normal ranges of occurrence. Recoveries were not significantly different from 100%, except in the case of acetaldehyde. In that case, it was determined that the method cannot break some of the adducts that acetaldehyde forms with sulfites; this problem was avoided by incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of very volatile compounds in other difficult matrixes. PMID:22340891

  10. An Analysis of Critical Factors for Quantitative Immunoblotting

    PubMed Central

    Janes, Kevin A.

    2015-01-01

    Immunoblotting (also known as Western blotting) combined with digital image analysis can be a reliable method for analyzing the abundance of proteins and protein modifications, but not every immunoblot-analysis combination produces an accurate result. Here, I illustrate how sample preparation, protocol implementation, detection scheme, and normalization approach profoundly affect the quantitative performance of immunoblotting. This study implemented diagnostic experiments that assess an immunoblot-analysis workflow for accuracy and precision. The results showed that ignoring such diagnostics can lead to pseudoquantitative immunoblot data that dramatically overestimate or underestimate true differences in protein abundance. PMID:25852189
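
    One of the normalization steps discussed above can be sketched as follows. This is a generic loading-control normalization with invented densitometry values, not the author's specific workflow; the paper's point is that such arithmetic is only trustworthy after the diagnostics it describes.

```python
def fold_change(target, control, ref_index=0):
    """Normalize target-band intensities to a loading control, then
    express each lane relative to a reference lane."""
    norm = [t / c for t, c in zip(target, control)]
    return [n / norm[ref_index] for n in norm]

# Hypothetical densitometry values (arbitrary units) for three lanes
target  = [1200.0, 2600.0, 600.0]
control = [1000.0, 1000.0, 500.0]
print(fold_change(target, control))  # lane 2 is ~2.2-fold over lane 1
```

    Note that lane 3, with half the loading, correctly returns to a fold change of 1.0 after normalization; without the control it would look like a 2-fold decrease.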

  11. There's plenty of gloom at the bottom: the many challenges of accurate quantitation in size-based oligomeric separations.

    PubMed

    Striegel, André M

    2013-11-01

    There is a variety of small-molecule species (e.g., tackifiers, plasticizers, oligosaccharides) the size-based characterization of which is of considerable scientific and industrial importance. Likewise, quantitation of the amount of oligomers in a polymer sample is crucial for the import and export of substances in the USA and the European Union (EU). While the characterization of ultra-high molar mass macromolecules by size-based separation techniques is generally considered a challenge, it is this author's contention that a greater challenge is encountered when trying to perform, for quantitation purposes, separations in and of the oligomeric region. The latter thesis is expounded herein, by detailing the various obstacles encountered en route to accurate, quantitative oligomeric separations by entropically dominated techniques such as size-exclusion chromatography, hydrodynamic chromatography, and asymmetric flow field-flow fractionation, as well as by methods which are, principally, enthalpically driven such as liquid adsorption and temperature gradient interaction chromatography. These obstacles include, among others, the diminished sensitivity of static light scattering (SLS) detection at low molar masses, the non-constancy of the response of SLS and of commonly employed concentration-sensitive detectors across the oligomeric region, and the loss of oligomers through the accumulation wall membrane in asymmetric flow field-flow fractionation. The battle is not lost, however, because, with some care and given a sufficient supply of sample, the quantitation of both individual oligomeric species and of the total oligomeric region is often possible. PMID:23887277

  12. Qualitative and Quantitative Analysis: Interpretation of Electropherograms

    NASA Astrophysics Data System (ADS)

    Szumski, Michał; Buszewski, Bogusław

    In this chapter the basic information on qualitative and quantitative analysis in CE is provided. Migration time and spectral data are described as the most important parameters used for identification of compounds. The parameters that negatively influence qualitative analysis are briefly mentioned. In the quantitative analysis section the external standard and internal standard calibration methods are described. Variables influencing peak height and peak area in capillary electrophoresis are briefly summarized. A discussion of electrodispersion and its influence on the observed peak shape is also provided.
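
    The internal standard calibration mentioned above can be sketched in a few lines; the peak areas, concentrations, and relative response factor below are illustrative only.

```python
def internal_standard_conc(area_analyte, area_is, conc_is, rrf):
    """Quantify an analyte against an internal standard (IS):
    C_analyte = (A_analyte / A_IS) * C_IS / RRF, where RRF is the
    relative response factor from a calibration standard."""
    return (area_analyte / area_is) * conc_is / rrf

# RRF from a calibration standard containing both species at 10.0 units
rrf = (5000 / 10.0) / (4000 / 10.0)   # (A/C)_analyte over (A/C)_IS = 1.25
print(internal_standard_conc(3000, 4000, 10.0, rrf))  # 6.0
```

    Because both the analyte and the IS experience the same injection and detection variability, their area ratio is much more reproducible than either area alone, which is why internal standards are preferred in CE.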

  13. Highly accurate thermal flow microsensor for continuous and quantitative measurement of cerebral blood flow.

    PubMed

    Li, Chunyan; Wu, Pei-ming; Wu, Zhizhen; Limnuson, Kanokwan; Mehan, Neal; Mozayan, Cameron; Golanov, Eugene V; Ahn, Chong H; Hartings, Jed A; Narayan, Raj K

    2015-10-01

    Cerebral blood flow (CBF) plays a critical role in the exchange of nutrients and metabolites at the capillary level and is tightly regulated to meet the metabolic demands of the brain. CBF typically decreases after major brain injuries, and supporting the injured brain with adequate CBF is a mainstay of therapy after traumatic brain injury. Quantitative and localized measurement of CBF is therefore critically important for evaluating treatment efficacy and for understanding cerebral pathophysiology. We present here an improved thermal flow microsensor and its operation, which provides higher accuracy than existing devices. The flow microsensor consists of three components: two stacked thin-film resistive elements serving as a composite heater/temperature sensor, and one remote resistive element for environmental temperature compensation. It operates in constant-temperature mode (~2 °C above the medium temperature), providing 20 ms temporal resolution. Compared to previous thermal flow microsensors based on a self-heating and self-sensing design, the present sensor provides at least a two-fold improvement in accuracy in the range from 0 to 200 ml/100 g/min. This is mainly achieved by the stacked structure, in which heating and sensing are separated to improve temperature measurement accuracy by minimizing errors introduced by self-heating. PMID:26256480

  14. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement.

    PubMed

    Reese, Matthew O; Dameron, Arrelaine A; Kempe, Michael D

    2011-08-01

    The development of flexible organic light-emitting diode displays and flexible thin-film photovoltaic devices depends on the use of flexible, low-cost, optically transparent, and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10^-4 and 10^-6 g/m^2/day. Thus there is a need for a relatively fast, low-cost, quantitative method to evaluate such low permeation rates. Here, we demonstrate a method in which the resistance changes of patterned Ca films upon reaction with moisture enable one to calculate a WVTR between 10 and 10^-6 g/m^2/day or better. Samples are configured with variable aperture size so that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by individual signal cables, permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge-card connector connects samples to the measurement wires, enabling easy switching of samples in and out of test. This measurement method requires as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high-volume method for measuring high moisture barriers. PMID:21895269
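
    The conversion from Ca-film resistance change to WVTR can be sketched from the commonly used stoichiometric relation (Ca + 2H2O -> Ca(OH)2 + H2): each consumed Ca atom accounts for two water molecules. The constants and the conductance-loss rate below are illustrative, and the paper's exact geometry and calibration may differ.

```python
def wvtr_from_resistance(dG_dt, n=2):
    """Sketch of the Ca-test relation: WVTR (g/m^2/day) from the
    magnitude of the sheet-conductance loss rate dG_dt (S/s) of a
    square Ca film. Assumes n water molecules consumed per Ca atom
    (n=2 for Ca + 2H2O -> Ca(OH)2 + H2)."""
    RHO_CA = 3.4e-8        # Ca resistivity (ohm*m), approximate
    DENSITY_CA = 1.55e6    # Ca density (g/m^3)
    M_H2O, M_CA = 18.02, 40.08
    # Sheet conductance G = thickness/rho, so the Ca thickness-loss
    # rate is d(thickness)/dt = rho * dG/dt; times density -> g/m^2/s.
    ca_loss_rate = RHO_CA * abs(dG_dt) * DENSITY_CA
    # n waters per Ca atom, converted from per-second to per-day
    return n * (M_H2O / M_CA) * ca_loss_rate * 86400.0

# A conductance-loss rate of 1e-9 S/s puts the film in the
# ultra-barrier range discussed above
print(wvtr_from_resistance(1e-9))  # ~4e-6 g/m^2/day
```

    Because the conductance change integrates all the water reaching the Ca, slower conductance loss directly means a better barrier, which is what makes the electrical readout quantitative.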

  15. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
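
    The histogram descriptors listed above (mean, variance, skewness, kurtosis) are standard statistical moments and can be sketched directly; the pixel values below are invented.

```python
from statistics import mean

def histogram_stats(pixels):
    """Mean, variance, skewness, and excess kurtosis of an image's
    intensity values -- the descriptors computed by histogram-analysis
    tools like the one described above."""
    m = mean(pixels)
    n = len(pixels)
    var = sum((p - m) ** 2 for p in pixels) / n
    sd = var ** 0.5
    skew = sum((p - m) ** 3 for p in pixels) / (n * sd ** 3)
    kurt = sum((p - m) ** 4 for p in pixels) / (n * sd ** 4) - 3
    return m, var, skew, kurt

px = [10, 12, 12, 14, 200]  # a bright outlier skews the distribution
m, var, skew, kurt = histogram_stats(px)
print(round(skew, 2))  # strongly positive: right-skewed histogram
```

    A large positive skew like this is the histogram signature of a mostly dark image with a few bright features, exactly the situation where thresholding an ROI (as the program allows) changes the statistics substantially.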

  16. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    Customizable endonucleases such as transcription activator-like effector nucleases (TALENs) and clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9 (CRISPR/Cas9) enable rapid generation of mutant strains at genomic loci of interest in animal models and cell lines. With the accelerated pace of generating mutant alleles, genotyping has become a rate-limiting step to understanding the effects of genetic perturbation. Unless mutated alleles result in distinct morphological phenotypes, mutant strains need to be genotyped using standard methods in molecular biology. Classic restriction fragment length polymorphism (RFLP) or sequencing is labor-intensive and expensive. Although simpler than RFLP, current versions of allele-specific PCR may still require post-polymerase chain reaction (PCR) handling such as sequencing, or they are more expensive if allele-specific fluorescent probes are used. Commercial genotyping solutions can take weeks from assay design to result, and are often more expensive than assembling reactions in-house. Key components of commercial assay systems are often proprietary, which limits further customization. Therefore, we developed a one-step open-source genotyping method based on quantitative PCR. The allele-specific qPCR (ASQ) does not require post-PCR processing and can genotype germline mutants through either threshold cycle (Ct) or end-point fluorescence reading. ASQ utilizes allele-specific primers, a locus-specific reverse primer, universal fluorescent probes and quenchers, and hot start DNA polymerase. Individual laboratories can further optimize this open-source system as we completely disclose the sequences, reagents, and thermal cycling protocol. We have tested the ASQ protocol to genotype alleles in five different genes. ASQ showed a 98-100% concordance in genotype scoring with RFLP or Sanger sequencing outcomes. ASQ is time-saving because a single qPCR without post-PCR handling suffices to score
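
    The genotype call from paired allele-specific reactions can be sketched as a simple delta-Ct rule: each allele's primer amplifies efficiently only on its matching template, so which reaction crosses threshold first (and by how much) indicates the genotype. The thresholds below are illustrative and not taken from the paper.

```python
def call_genotype(ct_wt, ct_mut, delta=5.0, max_ct=35.0):
    """Call a genotype from allele-specific qPCR Ct values. A reaction
    is 'positive' if it amplifies before max_ct; if both alleles
    amplify within `delta` cycles of each other, call a heterozygote.
    Thresholds here are illustrative, not from the paper."""
    wt_pos = ct_wt < max_ct
    mut_pos = ct_mut < max_ct
    if wt_pos and mut_pos and abs(ct_wt - ct_mut) <= delta:
        return "heterozygous"
    if wt_pos and (not mut_pos or ct_mut - ct_wt > delta):
        return "wild-type"
    if mut_pos and (not wt_pos or ct_wt - ct_mut > delta):
        return "mutant"
    return "no-call"

print(call_genotype(22.1, 23.0))  # heterozygous
print(call_genotype(21.5, 36.0))  # wild-type
```

    End-point fluorescence reading, the paper's alternative readout, replaces the Ct comparison with a two-channel intensity comparison but follows the same three-way logic.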

  18. Accurate analysis of multicomponent fuel spray evaporation in turbulent flow

    NASA Astrophysics Data System (ADS)

    Rauch, Bastian; Calabria, Raffaela; Chiariello, Fabio; Le Clercq, Patrick; Massoli, Patrizio; Rachner, Michael

    2012-04-01

    The aim of this paper is an accurate analysis of the evaporation of single-component and binary-mixture fuel sprays in a hot, weakly turbulent pipe flow by means of experimental measurement and numerical simulation, giving deeper insight into the relationship between fuel composition and spray evaporation. The turbulence intensity in the test section is 10%, and the integral length scale is three orders of magnitude larger than the droplet size, while the turbulence microscale (Kolmogorov scale) is of the same order as the droplet diameter. The spray, produced by a calibrated droplet generator, was injected into an electrically preheated gas flow. N-nonane, isopropanol, and their mixtures were used in the tests. The generalized scattering imaging technique was applied to simultaneously determine the size, velocity, and spatial location of the droplets carried by the turbulent flow in the quartz tube. The spray evaporation was computed using a Lagrangian particle solver coupled to a gas-phase solver. Computations of spray mean diameter and droplet size distributions at different locations along the pipe compare very favorably with the measurements. This combined research tool enabled further investigation of the parameters influencing the evaporation process, such as turbulence, droplet internal mixing, and liquid-phase thermophysical properties.

  19. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis.

    PubMed

    Hilbert, David W; Smith, William L; Chadwick, Sean G; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E; Aguin, Tina J; Sobel, Jack D; Gygax, Scott E

    2016-04-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We analyzed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV. PMID:26818677
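
    The reported performance figures are standard confusion-matrix quantities; the sketch below shows how they are computed, using invented counts rather than the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix
    counts (true/false positives and negatives)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only (not the study's data)
m = diagnostic_metrics(tp=92, fp=5, fn=8, tn=95)
print({k: round(v, 2) for k, v in m.items()})
```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the tested population, so they would shift if the same assay were applied to a lower-prevalence clinic.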

  20. PLIF: A rapid, accurate method to detect and quantitatively assess protein-lipid interactions.

    PubMed

    Ceccato, Laurie; Chicanne, Gaëtan; Nahoum, Virginie; Pons, Véronique; Payrastre, Bernard; Gaits-Iacovoni, Frédérique; Viaud, Julien

    2016-01-01

    Phosphoinositides are a type of cellular phospholipid that regulate signaling in a wide range of cellular and physiological processes through the interaction between their phosphorylated inositol head group and specific domains in various cytosolic proteins. These lipids also influence the activity of transmembrane proteins. Aberrant phosphoinositide signaling is associated with numerous diseases, including cancer, obesity, and diabetes. Thus, identifying phosphoinositide-binding partners and the aspects that define their specificity can direct drug development. However, current methods are costly, time-consuming, or technically challenging and inaccessible to many laboratories. We developed a method called PLIF (for "protein-lipid interaction by fluorescence") that uses fluorescently labeled liposomes and tethered, tagged proteins or peptides to enable fast and reliable determination of protein domain specificity for given phosphoinositides in a membrane environment. We validated PLIF against previously known phosphoinositide-binding partners for various proteins and obtained relative affinity profiles. Moreover, PLIF analysis of the sorting nexin (SNX) family revealed not only that SNXs bound most strongly to phosphatidylinositol 3-phosphate (PtdIns3P or PI3P), which is known from analysis with other methods, but also that they interacted with other phosphoinositides, which had not previously been detected using other techniques. Different phosphoinositide partners, even those with relatively weak binding affinity, could account for the diverse functions of SNXs in vesicular trafficking and protein sorting. Because PLIF is sensitive, semiquantitative, and performed in a high-throughput manner, it may be used to screen for highly specific protein-lipid interaction inhibitors. PMID:27025878

  1. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar . E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single-crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
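
    The ZAF quantitation step can be sketched as the measured k-ratio multiplied by the three correction factors; in practice the factors are recomputed iteratively from the estimated composition, and the numbers below are invented for illustration.

```python
def zaf_concentration(k_ratio, z, a, f):
    """First-pass ZAF quantitation: the measured k-ratio (sample
    intensity over standard intensity) multiplied by atomic-number (Z),
    absorption (A), and fluorescence (F) correction factors yields an
    estimated mass concentration."""
    return k_ratio * z * a * f

# Hypothetical k-ratio and correction factors for one element
print(round(zaf_concentration(0.18, z=1.05, a=1.10, f=0.98), 4))
```

    A full analysis loops: the first-pass concentrations for all elements are used to recompute Z, A, and F, and the cycle repeats until the composition converges and sums close to 100%.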

  2. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  3. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, utilizing a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  4. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their roles in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., cell culture models. In this chapter, we describe in detail immunoprecipitation and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  5. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to being leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue in cyber security. However, existing approaches can only qualitatively analyze the privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). To compare privacy leak behavior among different software, we further propose a comprehensive metric, the overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
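
    The abstract does not specify how the four metrics combine into the overall leak degree, so the sketch below assumes an equal-weight mean purely for illustration; the paper defines its own combination over the Privacy Petri Net.

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Aggregate the four PPN-based metrics into one score. The
    combination rule here (a weighted mean over inputs in [0, 1]) is an
    assumption for illustration, not the paper's definition."""
    metrics = (possibility, severity, crypticity, manipulability)
    return sum(w * m for w, m in zip(weights, metrics))

print(overall_leak_degree(0.8, 0.6, 0.4, 0.2))  # ~0.5 with equal weights
```

    Exposing the weights makes the trade-offs explicit: an analyst worried mainly about stealthy exfiltration could up-weight crypticity without changing the per-metric analysis.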

  7. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  8. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  9. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
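
    The prioritization framework above can be sketched as a simple scoring rule over the three metrics. The scenario names, ordinal scales, and the specific scoring formula below are illustrative assumptions, not values or formulas from the paper:

```python
# Sketch of severity/likelihood/modeling-difficulty prioritization.
# All scales (1-5) and the scoring rule are assumptions for illustration.

def priority(scenario):
    # Higher severity and likelihood raise the priority for quantitative
    # analysis; higher modeling difficulty lowers it, since hard-to-model
    # scenarios are poor candidates for quantitative treatment.
    return scenario["severity"] * scenario["likelihood"] / scenario["difficulty"]

scenarios = [
    {"name": "wake encounter",    "severity": 4, "likelihood": 3, "difficulty": 2},
    {"name": "runway incursion",  "severity": 5, "likelihood": 1, "difficulty": 4},
    {"name": "blunder",           "severity": 3, "likelihood": 2, "difficulty": 1},
]

ranked = sorted(scenarios, key=priority, reverse=True)
for s in ranked:
    print(s["name"], round(priority(s), 2))
```

    Sorting on the combined score surfaces the scenarios where quantitative modeling is both worthwhile and feasible.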

  10. Global analysis of the Deinococcus radiodurans proteome by using accurate mass tags

    PubMed Central

    Lipton, Mary S.; Paša-Tolić, Ljiljana; Anderson, Gordon A.; Anderson, David J.; Auberry, Deanna L.; Battista, John R.; Daly, Michael J.; Fredrickson, Jim; Hixson, Kim K.; Kostandarithes, Heather; Masselon, Christophe; Markillie, Lye Meng; Moore, Ronald J.; Romine, Margaret F.; Shen, Yufeng; Stritmatter, Eric; Tolić, Nikola; Udseth, Harold R.; Venkateswaran, Amudhan; Wong, Kwong-Kwok; Zhao, Rui; Smith, Richard D.

    2002-01-01

    Understanding biological systems and the roles of their constituents is facilitated by the ability to make quantitative, sensitive, and comprehensive measurements of how their proteome changes, e.g., in response to environmental perturbations. To this end, we have developed a high-throughput methodology to characterize an organism's dynamic proteome based on the combination of global enzymatic digestion, high-resolution liquid chromatographic separations, and analysis by Fourier transform ion cyclotron resonance mass spectrometry. The peptides produced serve as accurate mass tags for the proteins and have been used to identify with high confidence >61% of the predicted proteome for the ionizing radiation-resistant bacterium Deinococcus radiodurans. This fraction represents the broadest proteome coverage for any organism to date and includes 715 proteins previously annotated as either hypothetical or conserved hypothetical. PMID:12177431

  11. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  12. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652
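
    A concrete instance of quantitative bias analysis is back-correcting an observed count for known exposure misclassification. The correction formula below is the standard non-differential misclassification adjustment from the bias-analysis literature; the counts, sensitivity, and specificity are hypothetical:

```python
def correct_misclassification(a, b, se, sp):
    """Back-correct an observed exposed/unexposed split (a, b) for
    non-differential exposure misclassification with known sensitivity
    (se) and specificity (sp).

    Observed exposed = se*E + (1-sp)*(N-E), so the true exposed count is
    E = (observed - (1-sp)*N) / (se + sp - 1).
    """
    n = a + b
    a_true = (a - (1 - sp) * n) / (se + sp - 1)
    return a_true, n - a_true

# Hypothetical: 300 of 1000 cases classified exposed, se=0.9, sp=0.95.
a_true, b_true = correct_misclassification(300, 700, se=0.9, sp=0.95)
print(round(a_true, 1))
```

    Running the same correction over a grid of plausible (se, sp) pairs is how such analyses quantify the direction and magnitude of systematic error.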

  13. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717
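
    The ratio analysis mentioned above compares a downstream precursor to its upstream precursor across conditions, so that a change in one processing step is not masked by changes in overall transcription. A minimal sketch, where the precursor names and signal values are hypothetical phosphorimager readings, not data from the protocol:

```python
def processing_ratio(downstream, upstream):
    # Ratio of a downstream pre-rRNA species to its upstream precursor;
    # a drop in this ratio between conditions flags reduced efficiency
    # of that specific processing step.
    return downstream / upstream

# Hypothetical northern-blot signals (arbitrary units)
control  = {"47S": 10.0, "32S": 20.0}
stressed = {"47S": 18.0, "32S": 9.0}

change = (processing_ratio(stressed["32S"], stressed["47S"])
          / processing_ratio(control["32S"], control["47S"]))
print(round(change, 2))
```

    A value well below 1 indicates that the 47S-to-32S step became less efficient under the stressed condition, even though the 47S signal itself went up.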

  14. A simple and accurate protocol for absolute polar metabolite quantification in cell cultures using quantitative nuclear magnetic resonance.

    PubMed

    Goldoni, Luca; Beringhelli, Tiziana; Rocchia, Walter; Realini, Natalia; Piomelli, Daniele

    2016-05-15

    Absolute analyte quantification by nuclear magnetic resonance (NMR) spectroscopy is rarely pursued in metabolomics, even though this would allow researchers to compare results obtained using different techniques. Here we report on a new protocol that permits, after pH-controlled serum protein removal, the sensitive quantification (limit of detection [LOD] = 5-25 μM) of hydrophilic nutrients and metabolites in the extracellular medium of cells in cultures. The method does not require the use of databases and uses PULCON (pulse length-based concentration determination) quantitative NMR to obtain results that are significantly more accurate and reproducible than those obtained by CPMG (Carr-Purcell-Meiboom-Gill) sequence or post-processing filtering approaches. Three practical applications of the method highlight its flexibility under different cell culture conditions. We identified and quantified (i) metabolic differences between genetically engineered human cell lines, (ii) alterations in cellular metabolism induced by differentiation of mouse myoblasts into myotubes, and (iii) metabolic changes caused by activation of neurotransmitter receptors in mouse myoblasts. Thus, the new protocol offers an easily implementable, efficient, and versatile tool for the investigation of cellular metabolism and signal transduction. PMID:26898303
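
    The core of PULCON is that, by the principle of reciprocity, signal per unit concentration scales inversely with the 90° pulse length, so a single external reference suffices for absolute quantification. Below is a simplified form of the PULCON relation; real implementations also correct for receiver gain, number of scans, and temperature, and the numbers used are hypothetical:

```python
def pulcon_concentration(i_unk, i_ref, c_ref, p90_unk, p90_ref,
                         nh_unk, nh_ref):
    """Estimate analyte concentration via a simplified PULCON relation.

    i_*   : peak integrals (assumed already normalized per scan and gain)
    c_ref : known concentration of the external reference
    p90_* : 90-degree pulse lengths; signal ~ 1/p90, so this ratio
            corrects for differences in probe loading between samples
    nh_*  : number of protons contributing to each integral
    """
    return c_ref * (i_unk / i_ref) * (p90_unk / p90_ref) * (nh_ref / nh_unk)

# Hypothetical numbers: 10 mM reference, matched pulse lengths, analyte
# integral half the reference's, 3 protons vs 1 in the integrated peaks.
c = pulcon_concentration(i_unk=0.5, i_ref=1.0, c_ref=10.0,
                         p90_unk=10.0, p90_ref=10.0, nh_unk=3, nh_ref=1)
print(round(c, 3))
```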

  15. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341
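
    Detecting a heterozygous deletion by qPCR amounts to estimating gene dosage relative to a two-copy reference. The standard 2^-ΔΔCt (Livak) calculation is sketched below; the Ct values and the 0.75 dosage cutoff are illustrative assumptions, not figures from the study:

```python
def relative_copy_number(ct_target_sample, ct_refgene_sample,
                         ct_target_calibrator, ct_refgene_calibrator):
    # 2^-ddCt relative quantification: normalize the target gene to a
    # reference gene in both the patient sample and a normal calibrator.
    dct_sample = ct_target_sample - ct_refgene_sample
    dct_cal = ct_target_calibrator - ct_refgene_calibrator
    return 2 ** -(dct_sample - dct_cal)

# Hypothetical Ct values: the target (e.g. PRKCZ) amplifies one cycle
# later in the patient than expected, giving dosage ~0.5 (one copy lost).
ratio = relative_copy_number(26.0, 24.0, 25.0, 24.0)
deleted = ratio < 0.75  # illustrative cutoff between 1 and 2 copies
print(round(ratio, 2), deleted)
```

    Requiring concordant dosage loss at both target genes (PRKCZ and SKI) is what gives the assay its specificity.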

  16. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

    In electroencephalography (EEG) practice, muscle activity during seizures is often considered an irritating artefact. This article discusses how surface electromyography (EMG) can turn it into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons. Therefore, EMG signals provide direct information about the electric activity in the motor cortex. Qualitative analysis of EMG has traditionally been a part of long-term video-EEG recordings. Recent developments in quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that they differ from maximal voluntary contraction and from convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of generalised tonic-clonic seizures (GTCS) proved to have different quantitative features than tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiologic substrate for developing neurophysiologic biomarkers that accurately identify GTCS. This proved to be efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures. PMID:27212115

  17. Accuracy in Quantitative 3D Image Analysis

    PubMed Central

    Bassel, George W.

    2015-01-01

    Quantitative 3D imaging is becoming an increasingly popular and powerful approach to investigate plant growth and development. With the increased use of 3D image analysis, standards to ensure the accuracy and reproducibility of these data are required. This commentary highlights how image acquisition and postprocessing can introduce artifacts into 3D image data and proposes steps to increase both the accuracy and reproducibility of these analyses. It is intended to aid researchers entering the field of 3D image processing of plant cells and tissues and to help general readers in understanding and evaluating such data. PMID:25804539

  18. Self-aliquoting microarray plates for accurate quantitative matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Pabst, Martin; Fagerer, Stephan R; Köhling, Rudolf; Küster, Simon K; Steinhoff, Robert; Badertscher, Martin; Wahl, Fabian; Dittrich, Petra S; Jefimovs, Konstantins; Zenobi, Renato

    2013-10-15

    Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a fast analysis tool employed for the detection of a broad range of analytes. However, MALDI-MS has a reputation of not being suitable for quantitative analysis. Inhomogeneous analyte/matrix co-crystallization, spot-to-spot inhomogeneity, as well as a typically low number of replicates are the main contributing factors. Here, we present a novel MALDI sample target for quantitative MALDI-MS applications, which addresses the limitations mentioned above. The platform is based on the recently developed microarray for mass spectrometry (MAMS) technology and contains parallel lanes of hydrophilic reservoirs. Samples are not pipetted manually but deposited by dragging one or several sample droplets with a metal sliding device along these lanes. Sample is rapidly and automatically aliquoted into the sample spots due to the interplay of hydrophilic/hydrophobic interactions. With a few microliters of sample, it is possible to aliquot up to 40 replicates within seconds, each aliquot containing just 10 nL. The analyte droplet dries immediately and homogeneously, and consumption of the whole spot during MALDI-MS analysis is typically accomplished within a few seconds. We evaluated these sample targets with respect to their suitability for use with different samples and matrices. Furthermore, we tested their application for generating calibration curves of standard peptides with α-cyano-4-hydroxycinnamic acid as a matrix. For angiotensin II and [Glu(1)]-fibrinopeptide B we achieved coefficients of determination (r²) greater than 0.99 without the use of internal standards. PMID:24003910
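
    A calibration curve like the one described reduces to an ordinary least-squares line and its coefficient of determination. A self-contained sketch; the concentration levels and signal intensities below are hypothetical, not the paper's angiotensin II data:

```python
def linear_fit(x, y):
    """Ordinary least-squares line y = slope*x + intercept, plus r^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical peptide calibration: concentration (uM) vs mean MALDI
# signal averaged over the ~40 aliquots produced per level.
conc   = [0.5, 1.0, 2.0, 5.0, 10.0]
signal = [12.1, 23.8, 48.5, 119.0, 241.0]
slope, intercept, r2 = linear_fit(conc, signal)
print(round(slope, 2), round(r2, 4))
```

    Averaging over many nanoliter aliquots per level is what pushes r² this close to 1 without internal standards.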

  19. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    A study shows that when digital photography is combined with regular thin-layer chromatography (TLC), it can provide greatly improved qualitative analysis and make accurate quantitative analysis possible at a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  20. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks. PMID:24318825

  1. Quantitative architectural analysis of bronchial intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Guillaud, Martial; MacAulay, Calum E.; Le Riche, Jean C.; Dawe, Chris; Korbelik, Jagoda; Lam, Stephen

    2000-04-01

    Considerable variation exists among pathologists in the interpretation of intraepithelial neoplasia, making it difficult to determine the natural history of these lesions and to establish management guidelines for chemoprevention. The aim of the study is to evaluate architectural features of pre-neoplastic progression in lung cancer, and to search for a correlation between architectural index and conventional pathology. Quantitative architectural analysis was performed on a series of normal lung biopsies and carcinoma in situ (CIS). Centers of gravity of the nuclei within a pre-defined region of interest were used as seeds to generate a Voronoi diagram. About 30 features derived from the Voronoi diagram, its dual the Delaunay tessellation, and the minimum spanning tree were extracted. A discriminant analysis was performed to separate the two groups. The architectural index was calculated for each of the bronchial biopsies that were interpreted as hyperplasia, metaplasia, or mild, moderate, or severe dysplasia by conventional histopathology criteria. As a group, lesions classified as CIS by conventional histopathology criteria could be distinguished from dysplasia using the architectural index. Metaplasia was distinct from hyperplasia and hyperplasia from normal. There was overlap between severe and moderate dysplasia, but mild dysplasia could be distinguished from moderate dysplasia. Bronchial intraepithelial neoplastic lesions can be graded objectively by architectural features. Combination of architectural features and nuclear morphometric features may improve the quantitation of the changes occurring during the intra-epithelial neoplastic process.
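
    One family of the architectural features described is statistics of minimum-spanning-tree edge lengths over nuclear centers of gravity. A minimal stdlib-only sketch using Prim's algorithm; the nuclei coordinates are hypothetical, and the mean edge length shown is just one of the ~30 features the study extracts:

```python
from math import dist, inf

def mst_edge_lengths(points):
    """Prim's algorithm on nuclei centroids; returns the MST edge lengths.
    Mean and spread of these lengths are classic architectural features:
    ordered epithelium gives a narrow distribution, crowding widens it."""
    n = len(points)
    in_tree = [False] * n
    best = [inf] * n   # best[i] = shortest distance from i to the tree
    best[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        if best[u] > 0:
            edges.append(best[u])
        for v in range(n):
            if not in_tree[v]:
                d = dist(points[u], points[v])
                if d < best[v]:
                    best[v] = d
    return edges

# Hypothetical nuclei centers on a regular 3x3 grid, spacing 10 um.
grid = [(x * 10.0, y * 10.0) for x in range(3) for y in range(3)]
lengths = mst_edge_lengths(grid)
mean_len = sum(lengths) / len(lengths)
print(len(lengths), round(mean_len, 1))
```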

  2. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  4. Quantitative analysis of NMR spectra with chemometrics

    NASA Astrophysics Data System (ADS)

    Winning, H.; Larsen, F. H.; Bro, R.; Engelsen, S. B.

    2008-01-01

    The number of applications of chemometrics to series of NMR spectra is rapidly increasing due to an emerging interest in quantitative NMR spectroscopy, e.g. in the pharmaceutical and food industries. This paper gives an analysis of the advantages and limitations of applying the two most common chemometric procedures, Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR), to a designed set of 231 ¹H 400 MHz spectra of simple alcohol mixtures (propanol, butanol, and pentanol). The study clearly demonstrates that the major advantage of chemometrics is the visualisation of larger data structures, which adds a new exploratory dimension to NMR research. While robustness and powerful data visualisation and exploration are the main qualities of the PCA method, the study demonstrates that the bilinear MCR method is an even more powerful method for resolving pure component NMR spectra from mixtures when certain conditions are met.
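
    MCR itself is an iterative alternating-least-squares procedure, but its quantitation core is easy to see in the limiting case where the pure-component spectra are already known: the mixture spectrum is then decomposed by ordinary least squares. A stdlib-only two-component sketch with hypothetical spectra (not the paper's 400 MHz data):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def unmix_two(s1, s2, mix):
    """Least-squares concentrations of two known pure spectra in a mixture:
    solve the 2x2 normal equations for mix ~= a*s1 + b*s2."""
    a11, a12, a22 = dot(s1, s1), dot(s1, s2), dot(s2, s2)
    b1, b2 = dot(s1, mix), dot(s2, mix)
    det = a11 * a22 - a12 * a12
    a = (b1 * a22 - b2 * a12) / det
    b = (a11 * b2 - a12 * b1) / det
    return a, b

# Hypothetical pure spectra (peak intensities over 6 chemical-shift bins)
propanol = [1.0, 0.2, 0.0, 0.0, 0.5, 0.0]
butanol  = [0.0, 0.3, 1.0, 0.0, 0.1, 0.4]
mixture  = [0.7 * p + 0.3 * b for p, b in zip(propanol, butanol)]

a, b = unmix_two(propanol, butanol, mixture)
print(round(a, 3), round(b, 3))
```

    MCR generalizes this step by also estimating the pure spectra themselves, alternating between the concentration and spectral least-squares problems under non-negativity constraints.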

  5. Improvements in Accurate GPS Positioning Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Koyama, Yuichiro; Tanaka, Toshiyuki

    Although the Global Positioning System (GPS) is used widely in car navigation systems, cell phones, surveying, and other areas, several issues still exist. We focus on the continuous data received in public use of GPS, and propose a new positioning algorithm that uses time series analysis. By fitting an autoregressive model to the time series model of the pseudorange, we propose an appropriate state-space model. We apply the Kalman filter to the state-space model and use the pseudorange estimated by the filter in our positioning calculations. The results of our positioning experiment show that the accuracy of the proposed method is much better than that of the standard method. In addition, because the state-space model yields valid estimates from the time series analysis, the proposed model can be applied to several other fields.
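
    The filtering step described can be illustrated with a scalar Kalman filter under an AR(1) state model (with coefficient 1 it reduces to a random walk). This is a generic sketch, not the authors' algorithm; the noise variances and the constant-range measurement series are assumptions for illustration:

```python
def kalman_filter(zs, phi=1.0, q=1e-2, r=1.0, x0=0.0, p0=100.0):
    """Scalar Kalman filter with AR(1) state x_k = phi*x_{k-1} + w,
    process variance q, measurement variance r.
    zs: noisy pseudorange-like measurements; returns filtered estimates."""
    x, p = x0, p0
    out = []
    for z in zs:
        # predict
        x = phi * x
        p = phi * phi * p + q
        # update
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        out.append(x)
    return out

# Hypothetical: constant true range of 100 m with fixed measurement errors.
errors = [1.2, -0.7, 0.4, -1.1, 0.9, -0.3, 0.8, -0.6]
zs = [100.0 + e for e in errors]
est = kalman_filter(zs)
print(round(est[-1], 2))
```

    The filtered pseudorange converges toward the true value much faster than any single raw measurement, which is what improves the downstream positioning solution.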

  6. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based in part on this work.

  7. Toward Sensitive and Accurate Analysis of Antibody Biotherapeutics by Liquid Chromatography Coupled with Mass Spectrometry

    PubMed Central

    An, Bo; Zhang, Ming

    2014-01-01

    Remarkable methodological advances in the past decade have expanded the application of liquid chromatography coupled with mass spectrometry (LC/MS) to the analysis of biotherapeutics. Currently, LC/MS represents a promising alternative or supplement to the traditional ligand binding assay (LBA) in the pharmacokinetic, pharmacodynamic, and toxicokinetic studies of protein drugs, owing to the rapid and cost-effective method development, high specificity and reproducibility, low sample consumption, the capacity of analyzing multiple targets in one analysis, and the fact that a validated method can be readily adapted across various matrices and species. While promising, technical challenges associated with sensitivity, sample preparation, method development, and quantitative accuracy need to be addressed to enable full utilization of LC/MS. This article introduces the rationale and technical challenges of LC/MS techniques in biotherapeutics analysis and summarizes recently developed strategies to alleviate these challenges. Applications of LC/MS techniques on quantification and characterization of antibody biotherapeutics are also discussed. We speculate that despite the highly attractive features of LC/MS, it will not fully replace traditional assays such as LBA in the foreseeable future; instead, the forthcoming trend is likely the conjunction of biochemical techniques with versatile LC/MS approaches to achieve accurate, sensitive, and unbiased characterization of biotherapeutics in highly complex pharmaceutical/biologic matrices. Such combinations will constitute powerful tools to tackle the challenges posed by the rapidly growing needs for biotherapeutics development. PMID:25185260

  8. Reduction procedures for accurate analysis of MSX surveillance experiment data

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.

    1994-01-01

    Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.

  9. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163

  10. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, and grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.
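
    Area fraction, the first feature listed, is simply the share of foreground pixels after thresholding. A minimal sketch on a hypothetical binary micrograph (the image values below are invented for illustration):

```python
def area_fraction(binary_image):
    """Fraction of foreground (1) pixels in a thresholded binary image,
    one of the basic microstructural measurements such systems report."""
    fg = sum(sum(row) for row in binary_image)
    total = sum(len(row) for row in binary_image)
    return fg / total

# Hypothetical 4x4 thresholded micrograph: 6 particle pixels out of 16.
img = [
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 0],
]
print(area_fraction(img))
```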

  11. Quantitative transcriptome analysis using RNA-seq.

    PubMed

    Külahoglu, Canan; Bräutigam, Andrea

    2014-01-01

    RNA-seq has emerged as the technology of choice to quantify gene expression. This technology is a convenient, accurate tool to quantify diurnal changes in gene expression, gene discovery, differential use of promoters, and splice variants for all genes expressed in a single tissue. Thus, RNA-seq experiments provide sequence information and absolute expression values for transcripts, in addition to the relative quantification available with microarrays or qRT-PCR. The depth of information gained by sequencing requires careful assessment of RNA intactness and DNA contamination. Although RNA-seq is comparatively recent, a standard analysis framework has emerged with the packages Bowtie2, TopHat, and Cufflinks. With the rising popularity of RNA-seq, the tools have become manageable for researchers without much bioinformatics knowledge or programming skills. Here, we present a workflow for an RNA-seq experiment from experimental planning to biological data extraction. PMID:24792045
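
    The expression values such pipelines report are length-normalized units; Cufflinks reports FPKM, and the closely related TPM is shown here because it is the simpler calculation. The three-gene counts and lengths below are hypothetical:

```python
def tpm(counts, lengths_kb):
    """Transcripts-per-million from read counts and transcript lengths (kb):
    divide each count by its transcript length to get a per-kb rate,
    then scale all rates so they sum to one million."""
    rates = [c / l for c, l in zip(counts, lengths_kb)]
    scale = 1e6 / sum(rates)
    return [r * scale for r in rates]

# Hypothetical three-transcript example
counts  = [100, 300, 600]    # mapped reads per transcript
lengths = [1.0, 2.0, 3.0]    # transcript lengths in kb

vals = tpm(counts, lengths)
print([round(v) for v in vals])
```

    Because TPM values always sum to 10⁶, they are directly comparable across samples, which is why many workflows prefer them to FPKM.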

  12. Quantitative proteomic analysis of single pancreatic islets

    PubMed Central

    Waanders, Leonie F.; Chwalek, Karolina; Monetti, Mara; Kumar, Chanchal; Lammert, Eckhard; Mann, Matthias

    2009-01-01

    Technological developments make mass spectrometry (MS)-based proteomics a central pillar of biochemical research. MS has been very successful in cell culture systems, where sample amounts are not limiting. To extend its capabilities to extremely small, physiologically distinct cell types isolated from tissue, we developed a high sensitivity chromatographic system that measures nanogram protein mixtures for 8 h with very high resolution. This technology is based on splitting gradient effluents into a capture capillary and provides an inherent technical replicate. In a single analysis, this allowed us to characterize kidney glomeruli isolated by laser capture microdissection to a depth of more than 2,400 proteins. From pooled pancreatic islets of Langerhans, another type of “miniorgan,” we obtained an in-depth proteome of 6,873 proteins, many of them involved in diabetes. We quantitatively compared the proteome of single islets, containing 2,000–4,000 cells, treated with high or low glucose levels, and covered most of the characteristic functions of beta cells. Our ultrasensitive analysis recapitulated known hyperglycemic changes, but we also found up-regulated components such as the mitochondrial stress regulator Park7. Direct proteomic analysis of functionally distinct cellular structures opens up perspectives in physiology and pathology. PMID:19846766

  13. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may therefore play an important role in tracking the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: infrared detection (IR), video-oculography (VOG), scleral eye coils, and electrooculography (EOG). Among these recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, we propose methods that capture characteristics of eye movement in order to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including smooth vertical pursuit (SVP), smooth horizontal pursuit (HVP), and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.

  14. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for the partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration, and cross contamination in both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tissue tracer concentrations and regional myocardial function.
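
    The partial volume effect can be illustrated numerically: a wall thinner than roughly twice the scanner resolution appears to contain less activity than it truly does, and dividing the measured value by a recovery coefficient compensates for this. A hypothetical 1-D sketch with a Gaussian point-spread function; the numbers are illustrative and not the paper's actual correction:

```python
import numpy as np

def recovery_coefficient(wall_mm, fwhm_mm, dx=0.01):
    """Peak recovery coefficient for a 1-D wall of given thickness imaged
    with a Gaussian point-spread function (simplified illustrative model)."""
    sigma = fwhm_mm / 2.355                           # FWHM -> sigma
    x = np.arange(-30.0, 30.0, dx)
    wall = (np.abs(x) <= wall_mm / 2).astype(float)   # true activity = 1
    psf = np.exp(-x**2 / (2 * sigma**2))
    psf /= psf.sum()                                  # normalize the PSF
    blurred = np.convolve(wall, psf, mode="same")     # simulated image
    return blurred.max()                              # apparent peak activity

# A 10 mm wall imaged at 12 mm FWHM recovers only ~2/3 of the true activity
rc = recovery_coefficient(wall_mm=10.0, fwhm_mm=12.0)
corrected = 0.65 / rc   # PVE-corrected concentration from a measured value
```

    As the wall becomes much thicker than the resolution, the recovery coefficient approaches 1 and the correction vanishes.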

  15. Selection of accurate reference genes in mouse trophoblast stem cells for reverse transcription-quantitative polymerase chain reaction.

    PubMed

    Motomura, Kaori; Inoue, Kimiko; Ogura, Atsuo

    2016-06-17

    Mouse trophoblast stem cells (TSCs) form colonies of different sizes and morphologies, which might reflect their degrees of differentiation. Therefore, each colony type can have a characteristic gene expression profile; however, the expression levels of internal reference genes may also change, causing fluctuations in their estimated gene expression levels. In this study, we validated seven housekeeping genes by using a geometric averaging method and identified Gapdh as the most stable gene across different colony types. Indeed, when Gapdh was used as the reference, expression levels of Elf5, a TSC marker gene, stringently classified TSC colonies into two groups: a high expression group consisting of type 1 and 2 colonies, and a lower expression group consisting of type 3 and 4 colonies. This clustering was consistent with our putative classification of undifferentiated/differentiated colonies based on their time-dependent colony transitions. By contrast, use of an unstable reference gene (Rn18s) allowed no such clear classification. Cdx2, another TSC marker, did not show any significant colony type-specific expression pattern irrespective of the reference gene. Selection of stable reference genes for quantitative gene expression analysis might be critical, especially when cell lines consisting of heterogeneous cell populations are used. PMID:26853688
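
    The geometric-averaging approach (as in geNorm) can be sketched as follows: a candidate gene's stability measure M is the mean, over all other candidates, of the standard deviation of its pairwise log-ratios across samples; a lower M indicates more stable expression. The expression values below are hypothetical, not the study's data:

```python
import math

def gene_stability(expr):
    """geNorm-style stability measure M for each candidate reference gene.
    expr: dict gene -> list of expression values across samples (linear scale).
    Lower M means more stable expression."""
    genes = list(expr)
    n = len(expr[genes[0]])
    M = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
            mean = sum(ratios) / n
            sd = math.sqrt(sum((r - mean) ** 2 for r in ratios) / (n - 1))
            sds.append(sd)
        M[g] = sum(sds) / len(sds)
    return M

# Hypothetical data: geneA and geneB covary (a stable pair); geneC fluctuates
M = gene_stability({
    "geneA": [1.0, 2.0, 4.0],
    "geneB": [2.0, 4.0, 8.0],
    "geneC": [1.0, 8.0, 0.5],
})
```

    Because the ratio of a stable pair is constant across samples, their mutual log-ratio contributes zero variance, so the unstable candidate receives the highest M and would be excluded first.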

  17. Accurate mass tag retention time database for urine proteome analysis by chromatography--mass spectrometry.

    PubMed

    Agron, I A; Avtonomov, D M; Kononikhin, A S; Popov, I A; Moshkovskii, S A; Nikolaev, E N

    2010-05-01

    Information about peptides and proteins in urine can be used to search for biomarkers of early stages of various diseases. The main technology currently used for identification of peptides and proteins is tandem mass spectrometry, in which peptides are identified by the mass spectra of their fragmentation products. However, the fragmentation stage decreases the sensitivity of the analysis and increases its duration. We have developed a method for the identification of human urinary proteins and peptides that is based on the accurate mass and time tag (AMT) approach and does not use tandem mass spectrometry. A database containing more than 1381 AMT tags of peptides has been constructed. Software has been developed for populating the database with AMT tags, normalizing the chromatograms, and applying the database to the identification and quantitative estimation of proteins and peptides. New procedures for peptide identification using tandem mass spectra and the AMT tag database are proposed. The paper also lists proteins that have been identified in human urine for the first time. PMID:20632944
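
    The core of AMT-based identification is a two-dimensional tolerance match between observed LC-MS features and database tags: a feature is assigned to a peptide when both its accurate mass and its normalized elution time fall within tolerance. A minimal sketch; the tolerances, peptide names, and masses are illustrative, not taken from the paper:

```python
def match_amt(observed, database, ppm_tol=5.0, net_tol=0.02):
    """Match (mass, normalized elution time) observations against an AMT tag
    database. Tolerances are illustrative: 5 ppm mass error, 0.02 NET."""
    hits = []
    for mass, net in observed:
        for name, db_mass, db_net in database:
            ppm_err = abs(mass - db_mass) / db_mass * 1e6
            if ppm_err <= ppm_tol and abs(net - db_net) <= net_tol:
                hits.append((mass, net, name))
    return hits

# Hypothetical database tags and observed features
db = [("peptide_A", 1479.7541, 0.31), ("peptide_B", 2132.0012, 0.64)]
obs = [(1479.7580, 0.305), (2500.1234, 0.64)]
hits = match_amt(obs, db)
```

    The first feature matches peptide_A within both tolerances; the second matches nothing, despite sharing an elution time with peptide_B, because its mass is far outside the ppm window.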

  18. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  19. A virtual environment for the accurate geologic analysis of Martian terrain

    NASA Astrophysics Data System (ADS)

    Traxler, Christoph; Paar, Gerhard; Gupta, Sanjeev; Hesina, Gerd; Sander, Kathrin; Barnes, Rob; Nauschnegg, Bernhard; Muller, Jan-Peter; Tao, Yu

    2015-04-01

    Remote geology on planetary surfaces requires immersive presentation of the environment to be investigated. Three-dimensional (3D) processing of images from rovers and satellites enables terrain to be reconstructed in virtual space on Earth for scientific analysis. In this paper we present a virtual environment that allows users to interactively explore 3D-reconstructed Martian terrain and perform accurate measurements on the surface. Geologists require not only line-of-sight measurements between two points but, more importantly, the line-of-sight projected onto the surface between those points. The tool also supports defining paths through several points. It is also important for geologists to annotate the terrain they explore, especially when collaborating with colleagues. The path tool can also be used to separate geological layers or surround areas of interest. Paths can be linked with a text label positioned directly in 3D space and always oriented towards the viewing direction. All measurements and annotations can be maintained through a graphical user interface and used as landmarks, i.e., it is possible to fly to the corresponding locations. The virtual environment is fed with 3D vision products from rover cameras, placed in the 3D context gained from satellite images (digital elevation models and corresponding ortho images). This allows investigations at various scales, from planet to microscopic level, in a seamless manner. The modes of exploitation and added value of such an interactive means are manifold. The visualisation products enable us to map geological surfaces and rock layers over large areas in a quantitative framework. Accurate geometrical relationships of rock bodies, especially for sedimentary layers, can be reconstructed, and the relationships between superposed layers can be established. Within sedimentary layers, we can delineate sedimentary facies and other characteristics. In particular, inclination of beds which may help ascertain flow directions can be

  20. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
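
    For processing steps that act multiplicatively on a measured concentration (dilution, extraction recovery, calibration), first-order Gaussian error propagation adds relative variances. A minimal sketch with hypothetical per-step errors, not the protocol's actual figures:

```python
import math

def propagate_relative_error(rel_sds):
    """Combined relative standard deviation for a chain of multiplicative
    processing steps, by first-order Gaussian error propagation:
    (sd_total / x_total)^2 = sum_i (sd_i / x_i)^2."""
    return math.sqrt(sum(s ** 2 for s in rel_sds))

# Hypothetical protocol: quenching 2%, extraction 3%, LC-MS quantification 4%
total = propagate_relative_error([0.02, 0.03, 0.04])
```

    The combined error (~5.4% here) is dominated by the worst single step, which is how such an analysis identifies the most critical stage of a protocol.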

  1. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
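
    The information-theoretic indices mentioned can be sketched on a small flow matrix: the average mutual information (AMI) of trophic transfers quantifies how constrained (organized) the flow structure is, following Ulanowicz's formulation. The flow values below are illustrative:

```python
import math

def average_mutual_information(T):
    """Average mutual information (bits) of a flow matrix T[i][j] giving the
    transfer from compartment i to compartment j."""
    total = sum(sum(row) for row in T)
    row_sums = [sum(row) for row in T]
    col_sums = [sum(T[i][j] for i in range(len(T))) for j in range(len(T[0]))]
    ami = 0.0
    for i, row in enumerate(T):
        for j, t in enumerate(row):
            if t > 0:
                # Each term compares the joint flow to the product of marginals
                ami += (t / total) * math.log2(t * total / (row_sums[i] * col_sums[j]))
    return ami

# Perfectly ordered two-compartment exchange: every unit of flow is unambiguous
ami = average_mutual_information([[0.0, 10.0], [10.0, 0.0]])
```

    A fully determinate flow structure maximizes AMI (1 bit for this two-compartment case), while a network in which every compartment feeds every other equally has AMI of zero; real ecosystems fall in between.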

  2. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    PubMed

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but the presence of trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice gives the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements, with a precision as good as possible, to match data obtained by different techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges which do not involve intervalence charge transfer transitions (Fe(2+) --> Fe(3+) and Fe(2+) --> Ti(4+)), commonly considered responsible for the important features of blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the autoabsorption effects and secondary excitation effects which are frequently sources of relevant errors in quantitative EDXRF analysis. PMID:19821113

  3. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). PMID:25913743

  4. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  5. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
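
    The parameters measured from such time-intensity profiles typically include peak enhancement, time to peak, and maximum upslope during the first pass of the contrast agent. A minimal sketch on a hypothetical first-pass curve (the parameter set is illustrative; the paper's exact measures may differ):

```python
def perfusion_parameters(times, intensities):
    """Extract simple semi-quantitative parameters from a myocardial
    time-intensity profile: peak enhancement, time to peak, max upslope."""
    peak = max(intensities)
    t_peak = times[intensities.index(peak)]
    upslopes = [
        (intensities[k + 1] - intensities[k]) / (times[k + 1] - times[k])
        for k in range(len(times) - 1)
    ]
    return {"peak": peak, "time_to_peak": t_peak, "max_upslope": max(upslopes)}

# Hypothetical first-pass enhancement curve sampled once per heartbeat
p = perfusion_parameters([0, 1, 2, 3, 4, 5], [0, 2, 10, 14, 12, 11])
```

    Computed per myocardial position, such parameters can be rendered as the color overlays described above, with reduced upslope or peak indicating hypoperfused regions.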

  6. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  7. Identification and quantitative analysis of chemical compounds based on multiscale linear fitting of terahertz spectra

    NASA Astrophysics Data System (ADS)

    Qiao, Lingbo; Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang

    2014-07-01

    Terahertz (THz) time-domain spectroscopy is considered an attractive tool for the analysis of chemical composition. Traditional methods for identification and quantitative analysis of chemical compounds by THz spectroscopy are all based on full-spectrum data. However, the intrinsic features of the THz spectrum lie only in the absorption peaks, owing to the existence of disturbances such as unexpected components, scattering effects, and barrier materials. We propose a strategy that utilizes the Lorentzian parameters of THz absorption peaks, extracted by a multiscale linear fitting method, for both identification of pure chemicals and quantitative analysis of mixtures. The multiscale linear fitting method can automatically remove background content and accurately determine the Lorentzian parameters of the absorption peaks. The high recognition rate for 16 pure chemical compounds and the accurately predicted concentrations for theophylline-lactose mixtures demonstrate the practicability of our approach.
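
    A Lorentzian absorption peak is characterized by its center frequency, amplitude, and full width at half maximum (FWHM). The crude sketch below recovers these parameters from a synthetic peak via its half-maximum crossings; it is a simple stand-in for the paper's multiscale linear fitting, not its implementation:

```python
def lorentzian_params(freqs, absorb):
    """Estimate Lorentzian peak parameters (center, FWHM, amplitude) from a
    sampled absorption feature using the half-maximum crossings."""
    amp = max(absorb)
    i0 = absorb.index(amp)
    half = amp / 2.0
    # Walk outward from the peak until the curve drops to half maximum
    left = next(i for i in range(i0, -1, -1) if absorb[i] <= half)
    right = next(i for i in range(i0, len(absorb)) if absorb[i] <= half)
    return freqs[i0], freqs[right] - freqs[left], amp

# Synthetic Lorentzian centered at 1.2 THz with FWHM 0.1 THz
f = [0.8 + 0.001 * k for k in range(800)]
gamma = 0.05                                     # half-width at half-maximum
y = [1.0 / (1.0 + ((x - 1.2) / gamma) ** 2) for x in f]
center, fwhm, amp = lorentzian_params(f, y)
```

    On clean data the estimates land on the true parameters to within the sampling step; a proper least-squares fit would additionally handle noise and overlapping peaks.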

  8. Quantitative analysis of comparative genomic hybridization

    SciTech Connect

    Manoir, S. du; Bentz, M.; Joos, S.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
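
    The profile-averaging and threshold steps can be sketched as follows; the 1.25/0.75 cutoffs are illustrative thresholds for single-copy gains and losses, not necessarily the levels compared in the study:

```python
def mean_profiles(profiles):
    """Average per-position fluorescence ratio profiles from several
    metaphase spreads (all profiles assumed resampled to equal length)."""
    n = len(profiles)
    return [sum(p[i] for p in profiles) / n for i in range(len(profiles[0]))]

def classify_ratio_profile(mean_ratios, gain_thr=1.25, loss_thr=0.75):
    """Call gains and losses along a chromosome from averaged test/control
    ratios; thresholds are illustrative values for single-copy changes."""
    calls = []
    for r in mean_ratios:
        if r >= gain_thr:
            calls.append("gain")
        elif r <= loss_thr:
            calls.append("loss")
        else:
            calls.append("balanced")
    return calls

# Hypothetical ratio profiles from three metaphase spreads
avg = mean_profiles([[1.0, 1.4, 0.6], [1.1, 1.6, 0.7], [0.9, 1.5, 0.8]])
calls = classify_ratio_profile(avg)
```

    Averaging over several spreads suppresses the granularity and regional inhomogeneity noted above before the thresholds are applied.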

  9. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    PubMed Central

    2010-01-01

    Background Normalization with reference genes, or housekeeping genes, can yield more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants, and qPCR studies on important crops such as cotton have therefore been hampered by the lack of suitable reference genes. Results By the use of two distinct algorithms, implemented by geNorm and NormFinder, we have assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development, and the floral verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene

  10. High throughput comparative proteome analysis using a quantitative cysteinyl-peptide enrichment technology

    SciTech Connect

    Liu, Tao; Qian, Weijun; Strittmatter, Eric F.; Camp, David G.; Anderson, Gordon A.; Thrall, Brian D.; Smith, Richard D.

    2004-09-15

    A new quantitative cysteinyl-peptide enrichment technology (QCET) was developed to achieve higher efficiency, greater dynamic range, and higher throughput in quantitative proteomics that use stable-isotope labeling techniques combined with high resolution liquid chromatography (LC)-mass spectrometry (MS). This approach involves {sup 18}O labeling of tryptic peptides, high efficiency enrichment of cysteine-containing peptides, and confident protein identification and quantification using the accurate mass and time tag strategy. Proteome profiling of naive and in vitro-differentiated human mammary epithelial cells using QCET resulted in the identification and quantification of 603 proteins in a single LC-Fourier transform ion cyclotron resonance MS analysis. Advantages of this technology include: (1) a simple, highly efficient method for enriching cysteinyl-peptides; (2) a high throughput strategy suitable for extensive proteome analysis; and (3) improved labeling efficiency for better quantitative measurements. This technology enhances both the functional analysis of biological systems and the detection of potential clinical biomarkers.

  11. Quantitative phase imaging applied to laser damage detection and analysis.

    PubMed

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond-to-second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements lower than 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to a hundred microns in depth, is shown. PMID:26479612

  12. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can nevertheless accommodate some diversity in morphology and molecular composition to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster is a good model for studying the genetic regulation of axoneme assembly and motility, owing to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitatively analyzing proteins that associate with differentiating and mature fly sperm flagella. As an example, we use the quantification of tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it challenging to obtain accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro installed in the image processing software ImageJ. It allows measurement of fluorescence levels in closely associated sperm tails through an exact distinction between positive and background signals, and provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396

  13. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  14. Accurate quantitative 13C NMR spectroscopy: repeatability over time of site-specific 13C isotope ratio determination.

    PubMed

    Caytan, Elsa; Botosoa, Eliot P; Silvestre, Virginie; Robins, Richard J; Akoka, Serge; Remaud, Gérald S

    2007-11-01

    The stability over time (repeatability) for the determination of site-specific 13C/12C ratios at natural abundance by quantitative 13C NMR spectroscopy has been tested on three probes: enriched bilabeled [1,2-13C2]ethanol; ethanol at natural abundance; and vanillin at natural abundance. It is shown in all three cases that the standard deviation for a series of measurements taken every 2-3 months over periods between 9 and 13 months is equal to or smaller than the standard deviation calculated from 5-10 replicate measurements made on a single sample. The precision which can be achieved using the present analytical 13C NMR protocol is higher than the prerequisite value of 1-2 per thousand for the determination of site-specific 13C/12C ratios at natural abundance (13C-SNIF-NMR). Hence, this technique permits the discrimination of very small variations in 13C/12C ratios between carbon positions, as found in biogenic natural products. This observed stability over time in 13C NMR spectroscopy indicates that further improvements in precision will depend primarily on improved signal-to-noise ratio. PMID:17900175

  15. Application of an Effective Statistical Technique for an Accurate and Powerful Mining of Quantitative Trait Loci for Rice Aroma Trait

    PubMed Central

    Golestan Hashemi, Farahnaz Sadat; Rafii, Mohd Y.; Ismail, Mohd Razi; Mohamed, Mahmud Tengku Muda; Rahim, Harun A.; Latif, Mohammad Abdul; Aslani, Farzad

    2015-01-01

    When a phenotype of interest is associated with an external/internal covariate, covariate inclusion in quantitative trait loci (QTL) analyses can diminish residual variation and subsequently enhance the ability to detect QTLs. In the in vitro synthesis of 2-acetyl-1-pyrroline (2AP), the main fragrance compound in rice, thermal processing during the Maillard-type reaction between proline and reducing carbohydrates produces a roasted, popcorn-like aroma. Hence, for the first time, we included the amino acid proline, an important precursor of 2AP, as a covariate in our QTL mapping analyses to precisely explore the genetic factors affecting natural variation in rice scent. Consequently, two QTLs were traced on chromosomes 4 and 8. They explained from 20% to 49% of the total aroma phenotypic variance. Additionally, by saturating the interval harboring the major QTL using gene-based primers, a putative allele of fgr (the major genetic determinant of fragrance) was mapped in the QTL on chromosome 8 in the interval RM223-SCU015RM (1.63 cM). These loci supported previous studies of different accessions. Such QTLs can be widely used by breeders in crop improvement programs and for further fine mapping. Moreover, no previous study has simultaneously assessed the relationship among 2AP, proline and fragrance QTLs. Therefore, our findings can help further our understanding of the metabolomic and genetic basis of 2AP biosynthesis in aromatic rice. PMID:26061689

  16. Application of an Effective Statistical Technique for an Accurate and Powerful Mining of Quantitative Trait Loci for Rice Aroma Trait.

    PubMed

    Golestan Hashemi, Farahnaz Sadat; Rafii, Mohd Y; Ismail, Mohd Razi; Mohamed, Mahmud Tengku Muda; Rahim, Harun A; Latif, Mohammad Abdul; Aslani, Farzad

    2015-01-01

    When a phenotype of interest is associated with an external/internal covariate, covariate inclusion in quantitative trait loci (QTL) analyses can diminish residual variation and subsequently enhance the ability to detect QTLs. In the in vitro synthesis of 2-acetyl-1-pyrroline (2AP), the main fragrance compound in rice, thermal processing during the Maillard-type reaction between proline and reducing carbohydrates produces a roasted, popcorn-like aroma. Hence, for the first time, we included the amino acid proline, an important precursor of 2AP, as a covariate in our QTL mapping analyses to precisely explore the genetic factors affecting natural variation in rice scent. Consequently, two QTLs were traced on chromosomes 4 and 8. They explained from 20% to 49% of the total aroma phenotypic variance. Additionally, by saturating the interval harboring the major QTL using gene-based primers, a putative allele of fgr (the major genetic determinant of fragrance) was mapped in the QTL on chromosome 8 in the interval RM223-SCU015RM (1.63 cM). These loci supported previous studies of different accessions. Such QTLs can be widely used by breeders in crop improvement programs and for further fine mapping. Moreover, no previous study has simultaneously assessed the relationship among 2AP, proline and fragrance QTLs. Therefore, our findings can help further our understanding of the metabolomic and genetic basis of 2AP biosynthesis in aromatic rice. PMID:26061689

  17. Validation of Reference Genes for Accurate Normalization of Gene Expression in Lilium davidii var. unicolor for Real Time Quantitative PCR

    PubMed Central

    Zhang, Jing; Teixeira da Silva, Jaime A.; Wang, ChunXia; Sun, HongMei

    2015-01-01

    Lilium is an important commercial flower bulb crop. qRT-PCR is an extremely important technique for tracking gene expression levels, and the need for suitable reference genes for normalization has become increasingly pressing. The expression of internal control genes in living organisms varies considerably under different experimental conditions. For the economically important Lilium, only a limited number of reference genes applied in qRT-PCR have been reported to date. In this study, the expression stability of 12 candidate genes, including α-TUB, β-TUB, ACT, eIF, GAPDH, UBQ, UBC, 18S, 60S, AP4, FP, and RH2, was evaluated in a diverse set of 29 samples representing different developmental processes, three stress treatments (cold, heat, and salt) and different organs. For different organs, the combination of ACT, GAPDH, and UBQ is appropriate, whereas ACT together with AP4, or ACT along with GAPDH, is suitable for normalization of leaves and scales at different developmental stages, respectively. In leaves, scales and roots under stress treatments, FP, ACT and AP4, respectively, showed the most stable expression. This study provides a guide for the selection of a reference gene under different experimental conditions, and will benefit future research on more accurate gene expression studies in a wide variety of Lilium genotypes. PMID:26509446
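
    Reference-gene studies of this kind rank candidates by expression stability across conditions, typically with dedicated algorithms such as geNorm or NormFinder. As a much simpler illustration of the idea, the sketch below ranks genes by the coefficient of variation of hypothetical Ct values (gene names borrowed from the abstract; the numbers are invented, and this is not the algorithm the study used):

    ```python
    import statistics

    def stability_rank(ct_values):
        """Rank candidate reference genes by coefficient of variation (CV)
        of their qRT-PCR Ct values across samples: lower CV = more stable."""
        cv = {gene: statistics.stdev(v) / statistics.mean(v)
              for gene, v in ct_values.items()}
        return sorted(cv, key=cv.get)

    # Hypothetical Ct values for three candidates across four samples.
    cts = {'ACT':   [20.1, 20.3, 20.2, 20.0],   # nearly constant -> stable
           'GAPDH': [18.5, 19.5, 17.9, 20.4],
           '18S':   [9.0, 12.0, 10.5, 8.2]}     # highly variable -> unstable
    print(stability_rank(cts))  # ['ACT', 'GAPDH', '18S']
    ```

    A CV ranking captures the intuition (stable genes vary little relative to their mean) but ignores pairwise-comparison refinements that geNorm-style methods add.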

  18. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several past studies have addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method combined with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the full set of eight alkaloids was detected in all parts of the plant. The alkaloid content and distribution were highly variable, depending on plant organ, origin and season, ranging from 88 to 597 mg/kg dry weight; however, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved. PMID:25823584

  19. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test critical technology for forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: (i) excess noise detection and (ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In this region, the signal is dominated by the force noise acting on the test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Given those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data: the spectrum statistics will be far from Gaussian, and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum; the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data highlights the effect of spectral data correlations on the efficiency of the different excess noise estimators. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. 
Closely related to excess noise detection, the
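
    The Kolmogorov-Smirnov idea can be illustrated on independent data: for white Gaussian noise, the periodogram bins (excluding DC and Nyquist) are approximately exponentially distributed after normalization to unit mean, so a one-sample KS test against the exponential distribution flags excess noise. The sketch below is not the mission pipeline, just a minimal SciPy demonstration of that principle on simulated data:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.normal(size=4096)            # simulated white Gaussian noise record

    X = np.fft.rfft(x)
    power = np.abs(X[1:-1]) ** 2         # periodogram bins, dropping DC and Nyquist
    power /= power.mean()                # normalize to unit mean

    # Under the no-excess-noise hypothesis each normalized bin ~ Exp(1);
    # a small KS statistic (large p-value) means no excess noise is detected.
    ks, p = stats.kstest(power, 'expon')
    print(ks, p)
    ```

    Adding a narrow-band disturbance to `x` would inflate a few bins, distort the empirical distribution, and drive the KS statistic up; the paper's contribution is adapting this inversion to the correlated spectral estimates actually produced by LPF data analysis.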

  20. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for testing the genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs whose minor allele frequency (MAF) is not too small. PMID:22373162
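
    The likelihood ratio test machinery referred to above is generic: twice the log-likelihood gain of the alternative over the null is compared to a chi-square distribution. The following is a toy one-trait sketch (a normal mean standing in for a genetic effect), not the paper's bivariate probit likelihood:

    ```python
    import numpy as np
    from scipy import stats

    def lrt(loglik_alt, loglik_null, df):
        """Likelihood ratio test: 2*(l_alt - l_null) ~ chi-square(df) under H0."""
        stat = 2.0 * (loglik_alt - loglik_null)
        return stat, stats.chi2.sf(stat, df)

    rng = np.random.default_rng(1)
    x = rng.normal(loc=0.5, scale=1.0, size=200)   # trait values with a true effect

    l_null = stats.norm.logpdf(x, 0.0, 1.0).sum()       # H0: no effect (mean 0)
    l_alt = stats.norm.logpdf(x, x.mean(), 1.0).sum()   # H1: mean estimated freely

    stat, p = lrt(l_alt, l_null, df=1)
    print(f"LRT stat = {stat:.1f}, p = {p:.3g}")
    ```

    In the paper's setting the alternative likelihood is the bivariate normal model for the quantitative trait and the latent probit variable, and the degrees of freedom count the constrained genetic-effect parameters.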

  1. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. Polarization properties have been used successfully in many radar remote sensing applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarimetric data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed with a SAR simulator developed in MATLAB using the Range-Doppler algorithm.
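
    The two weather-radar parameters named above have standard definitions in terms of the scattering-matrix elements: differential reflectivity ZDR = 10·log10(|S_hh|²/|S_vv|²) and linear depolarization ratio LDR = 10·log10(|S_vh|²/|S_hh|²), both in dB. A minimal sketch with invented single-cell values (the document uses MATLAB; Python is used here for consistency with the other examples):

    ```python
    import numpy as np

    def zdr_db(s_hh, s_vv):
        """Differential reflectivity: ZDR = 10*log10(|S_hh|^2 / |S_vv|^2), in dB."""
        return 10.0 * np.log10(np.abs(s_hh) ** 2 / np.abs(s_vv) ** 2)

    def ldr_db(s_vh, s_hh):
        """Linear depolarization ratio: LDR = 10*log10(|S_vh|^2 / |S_hh|^2), in dB."""
        return 10.0 * np.log10(np.abs(s_vh) ** 2 / np.abs(s_hh) ** 2)

    # Toy scattering-matrix elements for one resolution cell.
    print(zdr_db(1.0 + 0j, 0.5 + 0j))   # ~6.02 dB: stronger HH than VV return
    print(ldr_db(0.1 + 0j, 1.0 + 0j))   # -20.0 dB: weak cross-polar return
    ```

    Applied per pixel to a fully polarimetric SAR scene, maps of these ratios carry the kind of quantitative surface information the paper investigates.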

  2. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division, as well as maintenance of tissue homeostasis, such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway, or return to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  3. Towards a Quantitative OCT Image Analysis

    PubMed Central

    Garcia Garrido, Marina; Beck, Susanne C.; Mühlfriedel, Regine; Julien, Sylvie; Schraermeyer, Ulrich; Seeliger, Mathias W.

    2014-01-01

    Background Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Methods Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Results Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. 
Conclusions OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions
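
    At its simplest, a reflectivity profile is a depth-resolved average of greyscale values across the A-scans of a B-scan image. The fragment below is a hypothetical NumPy sketch of that extraction (the study used ImageJ; the toy image is invented):

    ```python
    import numpy as np

    def reflectivity_profile(oct_image):
        """Average an 8-bit greyscale OCT B-scan across A-scans (columns)
        to obtain one mean reflectivity value per depth row."""
        img = np.asarray(oct_image, dtype=float)
        return img.mean(axis=1)

    # Toy B-scan: 4 depth rows x 3 A-scans; a bright "plexiform" band in row 1.
    scan = np.array([[ 20,  22,  18],
                     [200, 210, 190],
                     [ 60,  58,  62],
                     [ 30,  30,  30]], dtype=np.uint8)
    print(reflectivity_profile(scan))  # one value per depth row: 20, 200, 60, 30
    ```

    Comparing such profiles across species, retinal regions, or treatment time points is what turns the raw B-scan into the quantitative description the paper argues for.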

  4. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is ever-increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of health effects, including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, which assays both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture: lipids are universally distributed throughout cells, providing a means of assessment independent of culturability.

  5. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    DOE PAGESBeta

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; Parks, Jerry M.

    2016-07-08

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co axial ligand binding, leading to substantial errors in predicted

  6. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid-Base and Ligand Binding Equilibria of Aquacobalamin.

    PubMed

    Johnston, Ryne C; Zhou, Jing; Smith, Jeremy C; Parks, Jerry M

    2016-08-01

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co-ligand binding equilibrium constants (Kon/off), pKas, and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for Co(III), Co(II), and Co(I) species, respectively, and the second model features saturation of each vacant axial coordination site on Co(II) and Co(I) species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. 
Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co-axial ligand binding, leading to substantial errors in predicted pKas and
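
    The observables benchmarked in this work (reduction potentials in mV, pKas and log K in log units) are related to computed free-energy differences by standard thermodynamic identities: E = -ΔG/(nF) and pKa = ΔG/(RT ln 10). A small sketch of those unit conversions, with invented ΔG values rather than the paper's DFT results:

    ```python
    import math

    F = 96485.332    # Faraday constant, C/mol
    R = 8.314462     # gas constant, J/(mol K)
    T = 298.15       # temperature, K

    def reduction_potential(delta_g_kj_per_mol, n=1):
        """E = -deltaG / (n F); deltaG in kJ/mol, E in volts."""
        return -delta_g_kj_per_mol * 1000.0 / (n * F)

    def pka(delta_g_kj_per_mol):
        """pKa = deltaG / (R T ln 10); deltaG of deprotonation in kJ/mol."""
        return delta_g_kj_per_mol * 1000.0 / (R * T * math.log(10))

    print(reduction_potential(-19.3))  # ~0.200 V for a hypothetical 1e- reduction
    print(pka(28.5))                   # ~5.0 for a hypothetical deprotonation
    ```

    These relations also translate the reported RMS errors: 80 mV corresponds to about 7.7 kJ/mol in a one-electron free energy, and 2 log units to about 11.4 kJ/mol at 298 K.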

  7. Non-Invasive Radioiodine Imaging for Accurate Quantitation of NIS Reporter Gene Expression in Transplanted Hearts

    PubMed Central

    Ricci, Davide; Mennander, Ari A; Pham, Linh D; Rao, Vinay P; Miyagi, Naoto; Byrne, Guerard W; Russell, Stephen J; McGregor, Christopher GA

    2008-01-01

    Objectives We studied the concordance of transgene expression in the transplanted heart using a bicistronic adenoviral vector coding for a transgene of interest (human carcinoembryonic antigen, hCEA; beta human chorionic gonadotropin, βhCG) and for a marker imaging transgene (human sodium iodide symporter, hNIS). Methods Inbred Lewis rats were used for syngeneic heterotopic cardiac transplantation. Donor rat hearts were perfused ex vivo for 30 minutes prior to transplantation with University of Wisconsin (UW) solution (n=3), or with 10⁹ pfu/ml of adenovirus expressing hNIS (Ad-NIS; n=6), hNIS-hCEA (Ad-NIS-CEA; n=6), or hNIS-βhCG (Ad-NIS-CG; n=6). On post-operative days (POD) 5, 10 and 15, all animals underwent micro-SPECT/CT imaging of the donor hearts after tail vein injection of 1000 μCi 123I, with blood sample collection for hCEA and βhCG quantification. Results Significantly higher image intensity was noted in the hearts perfused with Ad-NIS (1.1±0.2; 0.9±0.07), Ad-NIS-CEA (1.2±0.3; 0.9±0.1) and Ad-NIS-CG (1.1±0.1; 0.9±0.1) compared to the UW group (0.44±0.03; 0.47±0.06) on POD 5 and 10 (p<0.05). Serum levels of hCEA and βhCG increased in animals showing high cardiac 123I uptake, but not in those with lower uptake. Above this threshold, image intensities correlated well with serum levels of hCEA and βhCG (R2=0.99 and R2=0.96, respectively). Conclusions These data demonstrate that hNIS is an excellent reporter gene for the transplanted heart. The expression level of hNIS can be accurately and non-invasively monitored by serial radioisotopic single photon emission computed tomography (SPECT) imaging. High concordance was demonstrated between imaging and soluble marker peptides at the maximum transgene expression on POD 5. PMID:17980613

  8. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a web server implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this bias. Therefore, we have made a simple benchmark for the evaluation of "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads to create our benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
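
    The benchmark construction and the genome length bias it exposes can be sketched in a few lines. The fragment below is a simplified, hypothetical version (deterministic non-overlapping tiling instead of the paper's random fragmentation, and random sequences instead of real genomes) that shows why equal genome copy numbers still yield length-proportional read counts:

    ```python
    import random

    def make_benchmark_reads(genome_lengths, copies=10, read_len=150, seed=0):
        """Fragment an equal number of copies of each genome into fixed-length
        reads. Longer genomes yield proportionally more reads, which is exactly
        the genome length bias a taxon-counting tool must correct for."""
        rng = random.Random(seed)
        reads = []
        for taxon, length in genome_lengths.items():
            genome = ''.join(rng.choice('ACGT') for _ in range(length))
            for _ in range(copies):
                for start in range(0, length - read_len + 1, read_len):
                    reads.append((taxon, genome[start:start + read_len]))
        return reads

    reads = make_benchmark_reads({'short': 1500, 'medium': 3000, 'long': 6000})
    counts = {}
    for taxon, _ in reads:
        counts[taxon] = counts.get(taxon, 0) + 1
    print(counts)  # {'short': 100, 'medium': 200, 'long': 400}
    ```

    Although raw read counts scale 1:2:4 with genome length here, the ideal quantitative answer for this sample is equal abundance (1:1:1), since each taxon contributed the same number of genome copies.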

  9. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF{sub 6}. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm{sup -1} as a function of pressure for 100% HF; (2) absorbance at 3877 cm{sup -1} as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm{sup -1} for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm{sup -1} can therefore be analyzed quantitatively via infrared methods.
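
    In the ideal-gas regime the quantitation reduces to a Beer-Lambert-type linear calibration, A = k·p(HF), after which mole percent follows from the measured total pressure. The sketch below uses entirely invented numbers (calibration slope, absorbance reading) to show the arithmetic, not the plant's actual calibration:

    ```python
    import numpy as np

    # Synthetic calibration: absorbance at 3877 cm^-1 vs. HF partial pressure
    # (mm HgA), noise-free and linear as expected for an ideal gas: A = k * p.
    pressures = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
    absorbance = 0.012 * pressures          # hypothetical response, 0.012 AU per mm HgA

    k, intercept = np.polyfit(pressures, absorbance, 1)   # recover the slope
    print(f"slope = {k:.4f} AU per mm HgA, intercept = {intercept:.2e}")

    # Mole percent HF from a measured absorbance, given the total pressure.
    A_meas, p_total = 0.24, 300.0
    p_HF = A_meas / k
    print(f"HF partial pressure = {p_HF:.1f} mm HgA -> {100.0 * p_HF / p_total:.1f} mol %")
    ```

    Above roughly 35 mm HgA the abstract indicates HF departs from ideality (oligomerization is the usual explanation), so a single linear calibration would no longer apply there.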

  10. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments of the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  11. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454

  12. Quantitative analysis of airway abnormalities in CT

    NASA Astrophysics Data System (ADS)

    Petersen, Jens; Lo, Pechin; Nielsen, Mads; Edula, Goutham; Ashraf, Haseem; Dirksen, Asger; de Bruijne, Marleen

    2010-03-01

A coupled surface graph cut algorithm for airway wall segmentation from Computed Tomography (CT) images is presented. Using cost functions that highlight both the inner and outer wall borders, the method combines the search for both borders into one graph cut. The proposed method is evaluated on 173 manually segmented images extracted from 15 different subjects and shown to give accurate results, with 37% fewer errors than the Full Width at Half Maximum (FWHM) algorithm and 62% fewer than a similar graph cut method without coupled surfaces. Common measures of airway wall thickness, such as the Interior Area (IA) and Wall Area percentage (WA%), were measured by the proposed method on a total of 723 CT scans from a lung cancer screening study. These measures were significantly different for participants with Chronic Obstructive Pulmonary Disease (COPD) compared to asymptomatic participants. Furthermore, reproducibility was good, as confirmed by repeat scans, and the measures correlated well with the outcomes of pulmonary function tests, demonstrating the potential of the algorithm as a COPD diagnostic tool. Additionally, a new measure of airway wall thickness is proposed, the Normalized Wall Intensity Sum (NWIS). NWIS is shown to correlate better with lung function test values and to be more reproducible than the previous measures IA, WA% and airway wall thickness at a lumen perimeter of 10 mm (PI10).
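For reference, the two wall-thickness measures named above have simple geometric definitions. A sketch for an idealized circular airway cross-section (the paper computes them from segmented borders, not from circles):

```python
import math

def airway_measures(inner_radius_mm: float, outer_radius_mm: float):
    """Interior Area (IA) and Wall Area percentage (WA%) for an idealized
    circular airway cross-section with the given inner (lumen) and outer
    wall radii. These are the standard geometric definitions only."""
    ia = math.pi * inner_radius_mm ** 2       # lumen (interior) area
    total = math.pi * outer_radius_mm ** 2    # area enclosed by outer border
    wa = total - ia                           # wall area
    wa_pct = 100.0 * wa / total               # Wall Area percentage
    return ia, wa_pct

ia, wa_pct = airway_measures(1.5, 2.0)
print(f"IA = {ia:.2f} mm^2, WA% = {wa_pct:.2f}%")  # IA = 7.07 mm^2, WA% = 43.75%
```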

  13. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    SciTech Connect

    Nakhleh, Luay

    2014-03-12

I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing such events from other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  14. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. The author argues that historic and current HRA have failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues instead for an HRA based on Bayesian (fact-based) inferential statistics: a systems analysis process that employs cogent heuristics when using opinion and tempers itself with rational debate over the weight given to subjective and empirical probabilities.

  15. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  16. Identification and validation of reference genes for accurate normalization of real-time quantitative PCR data in kiwifruit.

    PubMed

    Ferradás, Yolanda; Rey, Laura; Martínez, Óscar; Rey, Manuel; González, Ma Victoria

    2016-05-01

    Identification and validation of reference genes are required for the normalization of qPCR data. We studied the expression stability produced by eight primer pairs amplifying four common genes used as references for normalization. Samples representing different tissues, organs and developmental stages in kiwifruit (Actinidia chinensis var. deliciosa (A. Chev.) A. Chev.) were used. A total of 117 kiwifruit samples were divided into five sample sets (mature leaves, axillary buds, stigmatic arms, fruit flesh and seeds). All samples were also analysed as a single set. The expression stability of the candidate primer pairs was tested using three algorithms (geNorm, NormFinder and BestKeeper). The minimum number of reference genes necessary for normalization was also determined. A unique primer pair was selected for amplifying the 18S rRNA gene. The primer pair selected for amplifying the ACTIN gene was different depending on the sample set. 18S 2 and ACT 2 were the candidate primer pairs selected for normalization in the three sample sets (mature leaves, fruit flesh and stigmatic arms). 18S 2 and ACT 3 were the primer pairs selected for normalization in axillary buds. No primer pair could be selected for use as the reference for the seed sample set. The analysis of all samples in a single set did not produce the selection of any stably expressing primer pair. Considering data previously reported in the literature, we validated the selected primer pairs amplifying the FLOWERING LOCUS T gene for use in the normalization of gene expression in kiwifruit. PMID:26897117
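The geNorm algorithm cited above ranks candidate references by a stability measure M, the mean standard deviation of pairwise log-expression ratios against all other candidates. A minimal sketch on synthetic expression data (gene names and values are illustrative only, not the kiwifruit data):

```python
import numpy as np

# Toy relative-quantity matrix (hypothetical): 12 samples, 3 candidate
# reference genes; one candidate is deliberately unstable.
rng = np.random.default_rng(1)
stable = rng.lognormal(0.0, 0.05, size=(12, 1))
genes = {
    "18S_2": (stable * rng.lognormal(0, 0.02, (12, 1))).ravel(),
    "ACT_2": (stable * rng.lognormal(0, 0.03, (12, 1))).ravel(),
    "GAPDH": rng.lognormal(0.0, 0.40, 12),   # deliberately unstable
}

def genorm_m(expr: dict) -> dict:
    """geNorm-style stability M: for each gene, the mean standard deviation
    of its log2 expression ratios against every other candidate.
    Lower M = more stably expressed."""
    names = list(expr)
    M = {}
    for g in names:
        sds = [np.std(np.log2(expr[g] / expr[h]), ddof=1)
               for h in names if h != g]
        M[g] = float(np.mean(sds))
    return M

M = genorm_m(genes)
print(sorted(M.items(), key=lambda kv: kv[1]))  # most stable candidates first
```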

  17. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  18. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
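The velocity vector maps described above require estimating local displacement between consecutive frames. A minimal block-matching sketch (exhaustive SSD search on synthetic images; the study's actual velocity-measurement method may differ):

```python
import numpy as np

def block_velocity(frame_a, frame_b, y, x, block=8, search=4):
    """Displacement (vy, vx) of the block at top-left (y, x) between two
    frames, found by exhaustive search minimizing the sum of squared
    differences over a (2*search+1)^2 neighborhood."""
    ref = frame_a[y:y + block, x:x + block].astype(float)
    best, best_v = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = frame_b[y + dy:y + dy + block,
                           x + dx:x + dx + block].astype(float)
            err = np.sum((ref - cand) ** 2)
            if err < best:
                best, best_v = err, (dy, dx)
    return best_v

# Synthetic check: shift a random texture down 2 pixels and right 1 pixel.
rng = np.random.default_rng(2)
a = rng.random((64, 64))
b = np.roll(np.roll(a, 2, axis=0), 1, axis=1)
print(block_velocity(a, b, 24, 24))   # (2, 1)
```

Repeating this over a grid of blocks yields the kind of velocity map whose left-right symmetry the study inspects.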

  19. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  20. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals. The convoluted nature of the signal distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for the samples, a deconvolution program has been described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethysiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are achieved using this program. The effects of the polymer molecular structure and the processing variation on its surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface is different than in the bulk, because this aspect of surface structure may lead to properties different than that predicted simply by knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (SUMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of the PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregated at the surface. As a potential minimal fouling coating or biocompatible cardio-vascular materials, PU-DMS copolymers eventually come into contact with water once in use. 
Could such an environmental change (from air to aqueous) induce any undesirable

  1. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method for the statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. PMID:26456251
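Combining independent uncertainty components mathematically, as described above, is conventionally done by root-sum-of-squares of the relative standard uncertainties (GUM-style budget). A sketch with illustrative placeholder values, not the paper's figures:

```python
import math

# Relative standard uncertainties of independent components (placeholders).
components = {
    "microorganism type": 0.15,
    "pharmaceutical product": 0.20,
    "reading/interpreting": 0.18,
    "repeatability": 0.10,
}

# Root-sum-of-squares combination for independent components.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2 * u_combined       # expanded uncertainty, coverage factor k = 2

print(f"combined relative uncertainty = {u_combined:.1%}")  # 32.4%
print(f"expanded (k=2) = {U_expanded:.1%}")
```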

  2. [Qualitative and quantitative gamma-hydroxybutyrate analysis].

    PubMed

    Petek, Maja Jelena; Vrdoljak, Ana Lucić

    2006-12-01

Gamma-hydroxybutyrate (GHB) is a naturally occurring compound present in the brain and peripheral tissues of mammals. It is a minor metabolite and precursor of gamma-aminobutyric acid (GABA) and, like GABA, is believed to play a role in neurotransmission. GHB was first synthesized in vitro in 1960, when its depressant and hypnotic effects on the central nervous system were revealed. In the 1960s it was used as an anaesthetic, and later as an alternative to anabolic steroids to enhance muscle growth. However, after it was shown to cause strong physical dependence and severe side effects, GHB was banned. For the last fifteen years, GHB has been abused for its intoxicating effects such as euphoria, reduced inhibitions and sedation. Illicitly, it is available as a white powder or a clear liquid. Paradoxically, GHB can easily be manufactured from its precursor gamma-butyrolactone (GBL), which has not yet been banned. Because of the many car accidents and criminal acts in which it is involved, GHB has become an important object of forensic laboratory analysis. This paper describes gas and liquid chromatography, infrared spectroscopy, microscopy, colourimetry and nuclear magnetic resonance as methods for the detection and quantification of GHB in urine and illicit products. PMID:17265679

  3. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of the cell cycle. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity are discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this method and situations in which it is still useful are also described. PMID:18428635

  4. Accurate near-field calculation in the rigorous coupled-wave analysis method

    NASA Astrophysics Data System (ADS)

    Weismann, Martin; Gallagher, Dominic F. G.; Panoiu, Nicolae C.

    2015-12-01

    The rigorous coupled-wave analysis (RCWA) is one of the most successful and widely used methods for modeling periodic optical structures. It yields fast convergence of the electromagnetic far-field and has been adapted to model various optical devices and wave configurations. In this article, we investigate the accuracy with which the electromagnetic near-field can be calculated by using RCWA and explain the observed slow convergence and numerical artifacts from which it suffers, namely unphysical oscillations at material boundaries due to the Gibbs phenomenon. In order to alleviate these shortcomings, we also introduce a mathematical formulation for accurate near-field calculation in RCWA, for one- and two-dimensional straight and slanted diffraction gratings. This accurate near-field computational approach is tested and evaluated for several representative test-structures and configurations in order to illustrate the advantages provided by the proposed modified formulation of the RCWA.
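The Gibbs phenomenon blamed above for the unphysical oscillations at material boundaries is easy to reproduce: truncating the Fourier series of a step-like permittivity profile produces an overshoot of roughly 9% of the jump near each discontinuity. A one-dimensional illustration, not an RCWA solver:

```python
import numpy as np

N = 25                                       # number of retained harmonics
x = np.linspace(-0.5, 0.5, 4001)
f = np.where(np.abs(x) < 0.25, 1.0, 0.0)     # "grating ridge" of width 0.5

# Fourier coefficients of the rectangle function on a unit period:
# c_0 = 0.5, c_n = sin(pi*n/2)/(pi*n) otherwise.
n = np.arange(-N, N + 1)
c = np.where(n == 0, 0.5,
             np.sin(np.pi * n * 0.5) / (np.pi * np.where(n == 0, 1, n)))

# Truncated-series reconstruction on the grid.
f_rec = np.real(c @ np.exp(2j * np.pi * np.outer(n, x)))

overshoot = f_rec.max() - f.max()
print(f"peak Gibbs overshoot with {N} harmonics: {overshoot:.3f}")  # ~9% of jump
```

Adding more harmonics narrows the oscillations but does not reduce their amplitude, which motivates the modified near-field formulation the article proposes.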

  5. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

Software for 3-D representation of 'Absorbance-Wavenumber-Retention time' data is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to verify that a chromatographic peak is 'pure'; the analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and caffeine concentration are discussed at two steps of the data treatment.

  6. Quantitative analysis of cefalexin based on artificial neural networks combined with modified genetic algorithm using short near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Huan, Yanfu; Feng, Guodong; Wang, Bin; Ren, Yulin; Fei, Qiang

    2013-05-01

In this paper, a novel chemometric method was developed for the rapid, accurate, and quantitative analysis of cefalexin in samples. The experiments were carried out using short near-infrared spectroscopy coupled with artificial neural networks. In order to enhance the predictive ability of the artificial neural network model, a modified genetic algorithm was used to select a fixed number of wavelengths.

  7. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
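The effect of coverage on reliability can be illustrated with standard coverage-model arithmetic for a one-spare system: an uncovered fault defeats the redundancy, so even a small uncovered fraction can dominate system failure probability. Numbers below are illustrative, not F18 FCS data:

```python
import math

lam = 1e-4          # per-hour failure rate of one unit (placeholder)
t = 1000.0          # mission time in hours (placeholder)
q = 1 - math.exp(-lam * t)        # probability one unit fails in [0, t]

for c in (1.0, 0.99, 0.95):
    # Perfect coverage (c = 1): the system fails only if both units fail.
    # Imperfect coverage: add single-point failures from uncovered faults
    # (first-order approximation).
    q_sys = q * q + (1 - c) * q
    print(f"coverage {c:.2f}: system failure probability = {q_sys:.2e}")
```

Even 1% uncovered faults here contributes more to system failure than the double-failure term, which is why the abstract stresses including coverage in the reliability analysis.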

  8. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation.

    PubMed

    Cone, J T; Segrest, J P; Chung, B H; Ragland, J B; Sabesin, S M; Glasscock, A

    1982-08-01

    A method has been developed for rapidly quantitating the cholesterol concentration of normal and certain variant lipoproteins in a large number of patients (over 240 in one week). The method employs a microcomputer interfaced to the vertical autoprofiler (VAP) described earlier (Chung et al. 1981. J. Lipid Res. 22: 1003-1014). Software developed to accomplish rapid on-line analysis of the VAP signal uses peak shapes and positions derived from prior VAP analysis of isolated authentic lipoproteins HDL, LDL, and VLDL to quantitate these species in a VAP profile. Variant lipoproteins VHDL (a species with density greater than that of HDL(3)), MDL (a species, most likely Lp(a), with density intermediate between that of HDL and LDL), and IDL are subsequently quantitated by a method combining difference calculations with curve shapes. The procedure has been validated qualitatively by negative stain electron microscopy, gradient gel electrophoresis, strip electrophoresis, chemical analysis of the lipids, radioimmunoassay of the apolipoproteins, and measurement of the density of the peak centers. It has been validated quantitatively by comparison with Lipid Research Clinic methodology for HDL-, LDL-, and VLDL-cholesterol, and for MDL- and IDL-cholesterol by comparison of the amounts of MDL or IDL predicted to be present by the method with that known to be present following standard addition to whole plasma. These validations show that the method is a rapid and accurate technique of lipoprotein analysis suitable for the routine screening of patients for abnormal amounts of normal or variant lipoproteins, as well as for use as a research tool for quantitation of changes in cholesterol content of six or seven different plasma lipoprotein fractions.-Cone, J. T., J. P. Segrest, B. H. Chung, J. B. Ragland, S. M. Sabesin, and A. Glasscock. Computerized rapid high resolution quantitative analysis of plasma lipoproteins based upon single vertical spin centrifugation. PMID:7130860

  9. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis.

    PubMed

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-07-01

Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulk segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA. EXPLoRA-web provides a user-friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED and/or text files, and input is accepted in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/. PMID:27105844

  10. Nonlinear Aeroelastic Analysis Using a Time-Accurate Navier-Stokes Equations Solver

    NASA Technical Reports Server (NTRS)

    Kuruvila, Geojoe; Bartels, Robert E.; Hong, Moeljo S.; Bhatia, G.

    2007-01-01

A method to simulate limit cycle oscillation (LCO) due to control surface freeplay using a modified CFL3D, a time-accurate Navier-Stokes computational fluid dynamics (CFD) analysis code with structural modeling capability, is presented. This approach can be used to analyze the aeroelastic response of aircraft with structural behavior characterized by nonlinearity in the force versus displacement curve. A limited validation of the method, using very low Mach number experimental data for a three-degrees-of-freedom (pitch/plunge/flap deflection) airfoil model with flap freeplay, is also presented.

  11. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  12. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
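The box counting analysis reviewed above estimates a fractal dimension as the slope of log N(s) versus log(1/s), where N(s) is the number of boxes of side s that contain any part of the object. A minimal sketch, sanity-checked on a filled square (dimension 2):

```python
import numpy as np

def box_count_dimension(img: np.ndarray) -> float:
    """Estimate the box-counting (fractal) dimension of a binary image:
    count occupied boxes N(s) at dyadic box sizes s and fit the slope of
    log N(s) versus log(1/s)."""
    sizes, counts = [], []
    s = 1
    while s <= min(img.shape) // 2:
        # Partition into s-by-s boxes; a box counts if any pixel is set.
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        boxes = img[:h, :w].reshape(h // s, s, w // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(boxes.sum())
        s *= 2
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled square is a 2-D object, so the estimate should be ~2.
filled = np.ones((256, 256), dtype=bool)
print(f"box-counting dimension of a filled square: "
      f"{box_count_dimension(filled):.2f}")
```

Applied to binarized microglia silhouettes, the same slope discriminates ramified from amoeboid forms; multifractal analysis generalizes this by examining how the scaling varies locally.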

  13. Development of a quantitative autoradiography image analysis system

    SciTech Connect

Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1986-03-01

    A low cost image analysis system suitable for quantitative autoradiography (QAR) analysis has been developed. Autoradiographs can be digitized using a conventional Newvicon television camera interfaced to an IBM-XT microcomputer. Software routines for image digitization and capture permit the acquisition of thresholded or windowed images with graphic overlays that can be stored on storage devices. Image analysis software performs all background and non-linearity corrections prior to display as black/white or pseudocolor images. The relationship of pixel intensity to a standard radionuclide concentration allows the production of quantitative maps of tissue radiotracer concentrations. An easily modified subroutine is provided for adaptation to use appropriate operational equations when parameters such as regional cerebral blood flow or regional cerebral glucose metabolism are under investigation. This system could provide smaller research laboratories with the capability of QAR analysis at relatively low cost.

  14. A rapid and accurate method for the quantitative estimation of natural polysaccharides and their fractions using high performance size exclusion chromatography coupled with multi-angle laser light scattering and refractive index detector.

    PubMed

    Cheong, Kit-Leong; Wu, Ding-Tao; Zhao, Jing; Li, Shao-Ping

    2015-06-26

In this study, a rapid and accurate method for the quantitative analysis of natural polysaccharides and their different fractions was developed. First, high performance size exclusion chromatography (HPSEC) was used to separate the natural polysaccharides; the molecular masses of their fractions were then determined by multi-angle laser light scattering (MALLS). Finally, quantification of the polysaccharides or their fractions was performed based on their response in a refractive index detector (RID) and their universal refractive index increment (dn/dc). The accuracy of the developed method was determined for individual and mixed polysaccharide standards, including konjac glucomannan, CM-arabinan, xyloglucan, larch arabinogalactan, oat β-glucan, dextran (410, 270, and 25 kDa), mixed xyloglucan and CM-arabinan, and mixed dextran 270 kDa and CM-arabinan; average recoveries were between 90.6% and 98.3%. The limits of detection (LOD) and quantification (LOQ) ranged from 10.68 to 20.25 μg/mL and from 42.70 to 68.85 μg/mL, respectively. Compared to the conventional phenol-sulfuric acid assay and HPSEC coupled with evaporative light scattering detection (HPSEC-ELSD), the developed HPSEC-MALLS-RID method based on a universal dn/dc is much simpler, more rapid, and more accurate, requiring neither individual polysaccharide standards nor calibration curves. The developed method was also successfully applied to the quantitative analysis of polysaccharides and their different fractions from three medicinal plants of the Panax genus: Panax ginseng, Panax notoginseng and Panax quinquefolius. The results suggest that the HPSEC-MALLS-RID method based on a universal dn/dc could be used as a routine technique for the quantification of polysaccharides and their fractions in natural resources. PMID:25990349
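The universal-dn/dc quantification step can be sketched in a few lines: the RI peak area is proportional to injected mass times dn/dc, so one calibration injection fixes the instrument constant and no per-polysaccharide standard is needed. All numeric values below are illustrative placeholders, not the paper's data:

```python
# A typical universal dn/dc for polysaccharides in aqueous eluent (mL/g);
# placeholder value for illustration.
DN_DC = 0.146

def calibrate_k(peak_area: float, injected_mass_ug: float,
                dndc: float = DN_DC) -> float:
    """Instrument constant k from a single calibration injection:
    area = k * mass * dn/dc  =>  k = area / (mass * dn/dc)."""
    return peak_area / (injected_mass_ug * dndc)

def mass_from_area(peak_area: float, k: float, dndc: float = DN_DC) -> float:
    """Mass (ug) of an unknown fraction from its RI peak area."""
    return peak_area / (k * dndc)

k = calibrate_k(peak_area=1.20e6, injected_mass_ug=100.0)  # one standard run
unknown = mass_from_area(peak_area=0.66e6, k=k)
print(f"estimated injected mass: {unknown:.1f} ug")        # 55.0 ug
```

Because both steps use the same dn/dc, the unknown mass scales simply with the ratio of peak areas, which is what makes the method calibration-curve-free.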

  15. Qualitative and quantitative analysis of uroliths in dogs: definitive determination of chemical type.

    PubMed

    Bovee, K C; McGuire, T

    1984-11-01

    Effective treatment and prevention of urolithiasis depends on accurate determination of the chemical nature of the uroliths. A widely used qualitative chemical procedure was compared with quantitative crystallographic analysis of 272 canine uroliths. Agreement between the 2 methods was 78%. Qualitative analysis failed to detect 62% of calcium-containing uroliths and 83% of carbonate apatite uroliths. Qualitative analysis gave false-positive results for urates in 55% of cystine uroliths. Mixed uroliths comprising 6% of the total could not be classified without quantitative analysis. Silicate, cystine, and urate uroliths generally were of pure composition. Crystallographic analysis indicated the following distribution of major types: struvite, 69%; calcium oxalate, 10%; urate, 7%; silicate, 3.5%; cystine, 3.2%; calcium phosphate, 1%; and mixed, 6%. Among dogs with struvite uroliths, 66% had positive results of bacterial culturing from the urinary bladder. Six breeds (Miniature Schnauzer, Welsh Corgi, Lhasa Apso, Yorkshire Terrier, Pekingese, and Pug) had a significantly higher risk for urolithiasis, compared with other breeds. The German Shepherd Dog had a significantly lowered risk, compared with other breeds. Two breeds had significant relationship to a specific type of urolith: Miniature Schnauzer for oxalate, and Dalmatian for urate (P less than 0.001). It was concluded that quantitative analysis, using crystallography, was superior for the detection of calcium oxalate, carbonate apatite, cystine, urate, and mixed uroliths. PMID:6511641

  16. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and, when MS-MS is used, of its fragment ions. Another advantage is that the data can be processed using either target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows. PMID:26138893

  17. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  18. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are among the most frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can vary with bleeding, postmortem changes, and redistribution, potentially biasing forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug class. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out by liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. The method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve was obtained for each drug over the range 30–3000 ng/mL, with correlation coefficients higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory.
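
    The calibration and back-calculation steps described above follow the usual linear model: fit response versus concentration, check linearity, then invert the fit for unknowns. A minimal sketch, with synthetic peak areas and an invented unknown (none of these numbers are from the paper):

```python
# Linear calibration over a 30-3000 ng/mL range with back-calculation of an
# unknown from its peak area. Areas are synthetic, for illustration only.
import numpy as np

conc = np.array([30, 100, 300, 1000, 3000], dtype=float)   # standards, ng/mL
area = np.array([1.6, 5.1, 15.2, 50.3, 150.9])             # detector response

slope, intercept = np.polyfit(conc, area, 1)   # least-squares line
r = np.corrcoef(conc, area)[0, 1]              # linearity check

unknown_area = 42.0
unknown_conc = (unknown_area - intercept) / slope
print(f"r = {r:.4f}, estimated concentration: {unknown_conc:.0f} ng/mL")
```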

  19. Accurate Analysis and Computer Aided Design of Microstrip Dual Mode Resonators and Filters.

    NASA Astrophysics Data System (ADS)

    Grounds, Preston Whitfield, III

    1995-01-01

    Microstrip structures are of interest due to their many applications in microwave circuit design. Their small size and ease of connection to both passive and active components make them well suited for use in systems where size and space are at a premium. These include satellite communication systems, radar systems, satellite navigation systems, cellular phones, and many others. In general, space is always at a premium for any mobile system. Microstrip resonators find particular application in oscillators and filters. In typical filters each microstrip patch corresponds to one resonator. However, when dual mode patches are employed, each patch acts as two resonators and therefore reduces the amount of space required to build the filter. This dissertation focuses on the accurate electromagnetic analysis of the components of planar dual mode filters. Highly accurate analyses are required so that the resonator-to-resonator and resonator-to-input/output coupling can be predicted with precision. Hence, filters can be built with a minimum of design iterations and tuning. The analysis used herein is an integral equation formulation in the spectral domain. The analysis is done in the spectral domain since the Green's function can be derived in closed form, and the spatial domain convolution becomes a simple product. The resulting set of equations is solved using the Method of Moments with Galerkin's procedure. The electromagnetic analysis is applied to a range of problems including unloaded dual mode patches, dual mode patches coupled to microstrip feedlines, and complete filter structures. At each step calculated results are compared to measured results and good agreement is found. The calculated results are also compared to results from the circuit analysis program HP EESOF™ and again good agreement is found. A dual mode elliptic filter is built and good performance is obtained.

  20. Analysis of copy number variation using quantitative interspecies competitive PCR.

    PubMed

    Williams, Nigel M; Williams, Hywel; Majounie, Elisa; Norton, Nadine; Glaser, Beate; Morris, Huw R; Owen, Michael J; O'Donovan, Michael C

    2008-10-01

    Over recent years small submicroscopic DNA copy-number variants (CNVs) have been highlighted as an important source of variation in the human genome, human phenotypic diversity and disease susceptibility. Consequently, there is a pressing need for the development of methods that allow the efficient, accurate and cheap measurement of genomic copy number polymorphisms in clinical cohorts. We have developed a simple competitive PCR-based method to determine DNA copy number which uses the entire genome of a single chimpanzee as a competitor, thus eliminating the requirement for competitor sequences to be synthesized for each assay. This results in the requirement for only a single reference sample for all assays and dramatically increases the potential for large numbers of loci to be analysed in multiplex. In this study we establish proof of concept by accurately detecting previously characterized mutations at the PARK2 locus and then demonstrating the potential of quantitative interspecies competitive PCR (qicPCR) to accurately genotype CNVs in association studies by analysing chromosome 22q11 deletions in a sample of previously characterized patients and normal controls. PMID:18697816
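
    The copy-number arithmetic behind a competitive assay of this kind can be sketched as follows: the human:competitor signal ratio at the test locus, normalised by the same ratio at a diploid reference locus, estimates copy number. The signal values are invented, and the exact normalisation is an assumption about the general approach, not the paper's protocol.

```python
# Copy-number estimate from competitive PCR signals. The human sample and a
# fixed amount of competitor genome are co-amplified; ratios cancel out
# amplification efficiency differences between loci.

def copy_number(test_human, test_comp, ref_human, ref_comp, ref_copies=2):
    """Relative copy number of the test locus, given human and competitor
    signals at a test locus and at a diploid reference locus."""
    return (test_human / test_comp) / (ref_human / ref_comp) * ref_copies

# A heterozygous deletion (e.g. at 22q11) should come out near 1 copy:
cn = copy_number(test_human=480, test_comp=1000, ref_human=990, ref_comp=1030)
print(f"estimated copies: {cn:.2f}")
```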

  1. Quantitative numerical analysis of transient IR-experiments on buildings

    NASA Astrophysics Data System (ADS)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
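
    A minimal 1-D version of such a finite difference calculation is sketched below: explicit time stepping of transient heat conduction into a concrete wall with a heated surface. The diffusivity is a typical value for concrete and the grid, time step, and heating are illustrative choices, not the paper's parameters.

```python
# 1-D transient heat conduction, explicit FTCS finite-difference scheme,
# as a toy stand-in for the thermography simulations described above.
import numpy as np

alpha = 8e-7          # thermal diffusivity of concrete, m^2/s (typical value)
dx, dt = 2e-3, 1.0    # grid spacing (m) and time step (s)
r = alpha * dt / dx**2
assert r <= 0.5       # stability condition of the explicit scheme

T = np.zeros(50)      # temperature rise above ambient along depth
for step in range(600):              # 10 minutes of heating
    T[0] = 10.0                      # heated surface held 10 K above ambient
    T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])

print(f"surface {T[0]:.1f} K, 2 cm depth {T[10]:.2f} K")
```

    A defect (e.g. a void) would be modelled by changing the local diffusivity, and the surface temperature history compared against measurement, as in the paper.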

  2. Scanning tunneling microscopy on rough surfaces-quantitative image analysis

    NASA Astrophysics Data System (ADS)

    Reiss, G.; Brückl, H.; Vancea, J.; Lecheler, R.; Hastreiter, E.

    1991-07-01

    In this communication, the application of scanning tunneling microscopy (STM) for a quantitative evaluation of the roughness and mean island size of polycrystalline thin films is discussed. Provided stringent resolution conditions are satisfied, the results are in good agreement with standard techniques such as transmission electron microscopy. Owing to its high resolution, STM can provide a better characterization of surfaces than established methods, especially concerning roughness. Microscopic interpretations of surface-dependent physical properties can thus be considerably improved by a quantitative analysis of STM images.
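
    The roughness evaluation referred to above reduces, for a height map z(x, y), to a simple statistic over the image. A sketch, with a synthetic random surface standing in for real STM data:

```python
# RMS roughness from a height map, as would be extracted from an STM image.
# The synthetic Gaussian surface (sigma = 0.5 nm) is illustrative only.
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(0.0, 0.5, size=(256, 256))    # heights in nm

rms = np.sqrt(np.mean((z - z.mean()) ** 2))  # RMS roughness
print(f"RMS roughness: {rms:.2f} nm")
```

    Mean island size would additionally require a lateral statistic (e.g. the decay length of the height autocorrelation), which is omitted here.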

  3. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  4. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  5. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  6. Accurate reliability analysis method for quantum-dot cellular automata circuits

    NASA Astrophysics Data System (ADS)

    Cui, Huanqing; Cai, Li; Wang, Sen; Liu, Xiaoqiang; Yang, Xiaokuo

    2015-10-01

    Probabilistic transfer matrix (PTM) is a widely used model in circuit reliability research. However, the PTM model cannot reflect the impact of input signals on reliability, so it does not fully conform to the mechanism of the novel field-coupled nanoelectronic device called quantum-dot cellular automata (QCA), and it is difficult to get accurate results when the PTM model is used to analyze the reliability of QCA circuits. To solve this problem, we present fault tree models of QCA fundamental devices according to different input signals. The binary decision diagram (BDD) is then used to quantitatively investigate the reliability of two QCA XOR gates based on the presented models. By employing the fault tree models, the impact of input signals on reliability can be identified clearly, and the crucial components of a circuit can be found precisely based on the importance values (IVs) of the components. This method thus contributes to the construction of reliable QCA circuits.
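
    The quantities such an analysis produces — a top-event (circuit failure) probability and per-component importance values — can be illustrated with a toy structure function. The three-component example below is invented for illustration; it is not one of the paper's QCA gate models, and a BDD package would replace the brute-force enumeration for large circuits.

```python
# Top-event probability and Birnbaum importance values by enumerating
# component fault patterns. Structure function and probabilities are toys.
from itertools import product

p_fail = [0.05, 0.02, 0.10]      # per-component fault probabilities

def system_fails(faults):
    # Example structure function: fails if component 0 fails,
    # or if components 1 and 2 both fail.
    return faults[0] or (faults[1] and faults[2])

def failure_probability(p):
    total = 0.0
    for faults in product([False, True], repeat=len(p)):
        w = 1.0
        for pi, f in zip(p, faults):
            w *= pi if f else (1.0 - pi)
        if system_fails(faults):
            total += w
    return total

def birnbaum_importance(p, i):
    hi = p[:i] + [1.0] + p[i+1:]     # component i certainly failed
    lo = p[:i] + [0.0] + p[i+1:]     # component i certainly working
    return failure_probability(hi) - failure_probability(lo)

print(failure_probability(p_fail))
print([round(birnbaum_importance(p_fail, i), 4) for i in range(3)])
```

    Components with the largest importance values are the "crucial components" the abstract refers to.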

  7. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the leading malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). Using CWR22Rv1 cells specifically labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  8. Tandem Mass Spectrometry Measurement of the Collision Products of Carbamate Anions Derived from CO2 Capture Sorbents: Paving the Way for Accurate Quantitation

    NASA Astrophysics Data System (ADS)

    Jackson, Phil; Fisher, Keith J.; Attalla, Moetaz Ibrahim

    2011-08-01

    The reaction between CO2 and aqueous amines to produce a charged carbamate product plays a crucial role in post-combustion capture chemistry when primary and secondary amines are used. In this paper, we report the low energy negative-ion CID results for several anionic carbamates derived from primary and secondary amines commonly used as post-combustion capture solvents. The study was performed using the modern equivalent of a triple quadrupole instrument equipped with a T-wave collision cell. Deuterium labeling of 2-aminoethanol (1,1,2,2,-d4-2-aminoethanol) and computations at the M06-2X/6-311++G(d,p) level were used to confirm the identity of the fragmentation products for 2-hydroxyethylcarbamate (derived from 2-aminoethanol), in particular the ions CN-, NCO- and facile neutral losses of CO2 and water; there is precedent for the latter in condensed phase isocyanate chemistry. The fragmentations of 2-hydroxyethylcarbamate were generalized for carbamate anions derived from other capture amines, including ethylenediamine, diethanolamine, and piperazine. We also report unequivocal evidence for the existence of carbamate anions derived from sterically hindered amines (Tris(2-hydroxymethyl)aminomethane and 2-methyl-2-aminopropanol). For the suite of carbamates investigated, diagnostic losses include the decarboxylation product (-CO2, 44 mass units), loss of 46 mass units, and the fragments NCO- (m/z 42) and CN- (m/z 26). We also report low energy CID results for the dicarbamate dianion (-O2CNHC2H4NHCO2-) commonly encountered in CO2 capture solutions utilizing ethylenediamine. Finally, we demonstrate a promising ion chromatography-MS based procedure for the separation and quantitation of aqueous anionic carbamates, which is based on the reported CID findings. The availability of accurate quantitation methods for ionic CO2 capture products could lead to dynamic operational tuning of CO2 capture-plants and, thus, cost-savings via real-time manipulation of solvent

  9. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    SciTech Connect

    Malik, Afshan N.; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana; Laftah, Abas; Cunningham, Phil

    2011-08-19

    Highlights: → Mitochondrial dysfunction is central to many diseases of oxidative stress. → 95% of the mitochondrial genome is duplicated in the nuclear genome. → Dilution of untreated genomic DNA leads to dilution bias. → Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved, and we have identified that current methods have at least one of the following three problems: (1) as much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference of mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method using unique regions in the human mitochondrial genome not duplicated in the nuclear genome, a unique single copy region in the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
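
    Once pseudogene co-amplification and dilution bias are avoided, the Mt/N ratio itself follows from the qPCR threshold cycles. A sketch of that arithmetic, assuming equal amplification efficiencies for both targets (the Ct values are illustrative, not from the paper):

```python
# Mt/N ratio from qPCR threshold cycles: with ~100% efficiency each cycle
# doubles the product, so the copy ratio is 2^(Ct_nuclear - Ct_mito).

def mt_per_nuclear_genome(ct_mito, ct_nuclear, copies_per_nuclear=2):
    """MtDNA copies per cell, assuming a single-copy nuclear reference
    present at two copies per diploid genome."""
    return (2 ** (ct_nuclear - ct_mito)) * copies_per_nuclear

print(mt_per_nuclear_genome(ct_mito=18.0, ct_nuclear=26.0))  # 2^8 * 2 = 512
```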

  10. Accurate and easy-to-use assessment of contiguous DNA methylation sites based on proportion competitive quantitative-PCR and lateral flow nucleic acid biosensor.

    PubMed

    Xu, Wentao; Cheng, Nan; Huang, Kunlun; Lin, Yuehe; Wang, Chenguang; Xu, Yuancong; Zhu, Longjiao; Du, Dan; Luo, Yunbo

    2016-06-15

    Many types of diagnostic technologies have been reported for DNA methylation, but they require a standard curve for quantification or show only moderate accuracy. Moreover, most technologies have difficulty providing information on the level of methylation at specific contiguous multi-sites, let alone easy-to-use detection that eliminates labor-intensive procedures. We have addressed these limitations and report here a cascade strategy that combines proportion competitive quantitative PCR (PCQ-PCR) and a lateral flow nucleic acid biosensor (LFNAB), resulting in accurate and easy-to-use assessment. The P16 gene with specific multi-methylated sites, a well-studied tumor suppressor gene, was used as the model target DNA sequence. First, PCQ-PCR provided amplification products with an accurate proportion of multi-methylated sites following the principle of proportionality, and double-labeled duplex DNA was synthesized. Then, an LFNAB strategy was employed for amplified signal detection via immune affinity recognition, and the exact level of site-specific methylation could be determined from the relative intensity of the test line and the internal reference line. All recoveries were greater than 94%, which is satisfactory for DNA methylation assessment. Moreover, the developed cascade is a simple, sensitive, and low-cost tool. Therefore, as a universal platform for sensing systems for the detection of contiguous multi-sites of DNA methylation without external standards and expensive instrumentation, this PCQ-PCR-LFNAB cascade method shows great promise for the point-of-care diagnosis of cancer risk and therapeutics. PMID:26914373

  11. Accurate variational calculations and analysis of the HOCl vibrational energy spectrum

    SciTech Connect

    Skokov, S.; Qi, J.; Bowman, J.M.; Yang, C.; Gray, S.K.; Peterson, K.A. |; Mandelshtam, V.A.

    1998-12-01

    Large scale variational calculations for the vibrational states of HOCl are performed using a recently developed, accurate ab initio potential energy surface. Three different approaches for obtaining vibrational states are employed and contrasted: a truncation/recoupling scheme with direct diagonalization, the Lanczos method, and Chebyshev iteration with filter diagonalization. The complete spectrum of bound states for nonrotating HOCl is computed and analyzed within a random matrix theory framework. This analysis indicates almost entirely regular dynamics with only a small degree of chaos. The nearly regular spectral structure allows us to make assignments for the most significant part of the spectrum, based on analysis of coordinate expectation values and eigenfunctions. Ground state dipole moments and dipole transition probabilities are also calculated using accurate ab initio data. Computed values are in good agreement with available experimental data. Some exact rovibrational calculations for J=1, including Coriolis coupling, are performed. The exact results are nearly identical with those obtained from the adiabatic rotation approximation and very close to those from the centrifugal sudden approximation, thus indicating a very small degree of asymmetry and Coriolis coupling for the HOCl molecule. Copyright 1998 American Institute of Physics.

  12. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  13. Quantitative Assessment of Protein Structural Models by Comparison of H/D Exchange MS Data with Exchange Behavior Accurately Predicted by DXCOREX

    NASA Astrophysics Data System (ADS)

    Liu, Tong; Pantazatos, Dennis; Li, Sheng; Hamuro, Yoshitomo; Hilser, Vincent J.; Woods, Virgil L.

    2012-01-01

    Peptide amide hydrogen/deuterium exchange mass spectrometry (DXMS) data are often used to qualitatively support models for protein structure. We have developed and validated a method (DXCOREX) by which exchange data can be used to quantitatively assess the accuracy of three-dimensional (3-D) models of protein structure. The method utilizes the COREX algorithm to predict a protein's amide hydrogen exchange rates by reference to a hypothesized structure, and these values are used to generate a virtual data set (deuteron incorporation per peptide) that can be quantitatively compared with the deuteration level of the peptide probes measured by hydrogen exchange experimentation. The accuracy of DXCOREX was established in studies performed with 13 proteins for which both high-resolution structures and experimental data were available. The DXCOREX-calculated and experimental data for each protein was highly correlated. We then employed correlation analysis of DXCOREX-calculated versus DXMS experimental data to assess the accuracy of a recently proposed structural model for the catalytic domain of a Ca2+-independent phospholipase A2. The model's calculated exchange behavior was highly correlated with the experimental exchange results available for the protein, supporting the accuracy of the proposed model. This method of analysis will substantially increase the precision with which experimental hydrogen exchange data can help decipher challenging questions regarding protein structure and dynamics.
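
    The comparison step at the heart of the method above — correlating the virtual (DXCOREX-predicted) deuteration with the measured deuteration per peptide probe — is a plain Pearson correlation. A sketch with synthetic paired values (not data from the paper):

```python
# Pearson correlation of predicted vs. measured deuterons per peptide,
# the statistic used to score a candidate structural model.
import numpy as np

predicted = np.array([1.2, 3.4, 2.8, 5.1, 4.0, 0.9])  # virtual data set
measured  = np.array([1.0, 3.6, 2.5, 5.4, 3.8, 1.1])  # DXMS experiment

r = np.corrcoef(predicted, measured)[0, 1]
print(f"Pearson r = {r:.3f}")  # high r supports the hypothesized structure
```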

  14. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. PMID:26556680

  15. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and the geometrical characteristics of the texture were clear. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies. PMID:24744684
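
    The GLCM step itself is compact enough to sketch directly. The 4x4 toy image and the single horizontal offset below are illustrative; the paper extracts four texture parameters, of which contrast, energy, and homogeneity are shown here.

```python
# Grey-level co-occurrence matrix for one offset, plus texture parameters,
# implemented directly in NumPy on a toy 4-level image.
import numpy as np

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
levels = 4

# Co-occurrence counts for horizontally adjacent pixel pairs (offset (0, 1)).
glcm = np.zeros((levels, levels))
for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
    glcm[a, b] += 1
glcm /= glcm.sum()                      # normalise to joint probabilities

i, j = np.indices(glcm.shape)
contrast    = np.sum(glcm * (i - j) ** 2)
energy      = np.sum(glcm ** 2)
homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
print(f"contrast={contrast:.3f} energy={energy:.3f} homogeneity={homogeneity:.3f}")
```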

  16. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained by label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses. PMID:27358910

  17. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
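
    "Crossing" a fragility curve with hazard results amounts to integrating the conditional failure probability over the ground-motion distribution. A minimal sketch of that step; the lognormal fragility parameters and the discretised hazard bins are invented for illustration, not the paper's values for steel tanks.

```python
# Annual failure probability = fragility integrated over the seismic hazard.
import math

def lognormal_fragility(pga, median=0.6, beta=0.5):
    """P(loss of containment | PGA), lognormal fragility curve (illustrative)."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2))))

# Hazard discretised into PGA bins (g) with annual occurrence probabilities.
hazard = [(0.1, 1e-2), (0.3, 2e-3), (0.5, 5e-4), (0.8, 1e-4)]

p_fail_annual = sum(lognormal_fragility(pga) * p for pga, p in hazard)
print(f"annual failure probability ~ {p_fail_annual:.2e}")
```

    The resulting failure probability then feeds the consequence analysis and risk contour plots described above.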

  18. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis.

    SciTech Connect

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Moore, Ronald J.; Ramirez Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-05-03

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied as well as the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contribution to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2 week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies are demonstrated.
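
    The dissection logic can be sketched as nested subtraction: replicates injected at a given stage carry all variance from that stage downstream, so differencing the nested estimates isolates each step. The numbers below are synthetic illustrations (chosen so the ranking resembles the reported one), and the simple additive CV² model is an assumption of this sketch, not the paper's exact estimator.

```python
# Dissecting technical variance by differencing nested replicate estimates.
# Squared coefficients of variation (CV^2) observed for replicates injected
# at each stage; each includes every source downstream of that stage.
cv2_instrument = 0.010   # same digest, repeated injections
cv2_digestion  = 0.012   # same tissue extract, re-digested and injected
cv2_extraction = 0.055   # same tissue, re-extracted, digested, injected

var_instrument = cv2_instrument
var_digestion  = cv2_digestion - cv2_instrument
var_extraction = cv2_extraction - cv2_digestion

total = cv2_extraction
for name, v in [("extraction", var_extraction),
                ("digestion", var_digestion),
                ("instrument", var_instrument)]:
    print(f"{name:>10}: {100 * v / total:.0f}% of technical variance")
```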

  19. Accurate palm vein recognition based on wavelet scattering and spectral regression kernel discriminant analysis

    NASA Astrophysics Data System (ADS)

    Elnasir, Selma; Shamsuddin, Siti Mariyam; Farokhi, Sajad

    2015-01-01

    Palm vein recognition (PVR) is a promising new biometric that has been applied successfully as a method of access control by many organizations, and it has further potential in the field of forensics. The palm vein pattern has highly discriminative features that are difficult to forge because of its subcutaneous position in the palm. Despite considerable progress, a few practical issues remain, and providing accurate palm vein readings is still an open problem in biometrics. We propose a robust and more accurate PVR method based on the combination of wavelet scattering (WS) with spectral regression kernel discriminant analysis (SRKDA). As the dimension of the WS-generated features is quite large, SRKDA is required to reduce the extracted features and enhance the discrimination. The results, based on two public databases (the PolyU Hyper Spectral Palmprint database and the PolyU Multi Spectral Palmprint database), show the high performance of the proposed scheme in comparison with state-of-the-art methods. The proposed approach scored a 99.44% identification rate and a 99.90% verification rate [equal error rate (EER)=0.1%] for the hyperspectral database and a 99.97% identification rate and a 99.98% verification rate (EER=0.019%) for the multispectral database.

  20. Accurate rotor loads prediction using the FLAP (Force and Loads Analysis Program) dynamics code

    SciTech Connect

    Wright, A.D.; Thresher, R.W.

    1987-10-01

    Accurately predicting wind turbine blade loads and response is very important in predicting the fatigue life of wind turbines. There is a clear need in the wind turbine community for validated and user-friendly structural dynamics codes for predicting blade loads and response. At the Solar Energy Research Institute (SERI), a Force and Loads Analysis Program (FLAP) has been refined and validated and is ready for general use. Currently, FLAP is operational on an IBM-PC compatible computer and can be used to analyze both rigid- and teetering-hub configurations. The results of this paper show that FLAP can be used to accurately predict the deterministic loads for rigid-hub rotors. This paper compares analytical predictions to field test measurements for a three-bladed, upwind turbine with a rigid-hub configuration. The deterministic loads predicted by FLAP are compared with 10-min azimuth averages of blade root flapwise bending moments for different wind speeds. 6 refs., 12 figs., 3 tabs.
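
    The azimuth averages used for the field comparison can be sketched as binning a load signal by rotor position; the bending-moment record below is synthetic, not FLAP output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic flapwise bending moment sampled once per degree of azimuth over
# 100 revolutions: a deterministic once-per-revolution component plus
# turbulence "noise" (units and amplitudes are invented).
az = np.tile(np.arange(360), 100)
moment = 50.0 + 20.0 * np.sin(np.radians(az)) + 5.0 * rng.standard_normal(az.size)

# Azimuth average: the mean load at each rotor position isolates the
# deterministic (periodic) load that a dynamics code predicts.
det = np.array([moment[az == a].mean() for a in range(360)])
```

    Averaging over many revolutions suppresses the stochastic turbulence response, leaving the deterministic load profile that can be compared with code predictions.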

  1. Fast and accurate sensitivity analysis of IMPT treatment plans using Polynomial Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Perkó, Zoltán; van der Voort, Sebastian R.; van de Water, Steven; Hartman, Charlotte M. H.; Hoogeman, Mischa; Lathouwers, Danny

    2016-06-01

    The highly conformal planned dose distribution achievable in intensity modulated proton therapy (IMPT) can severely be compromised by uncertainties in patient setup and proton range. While several robust optimization approaches have been presented to address this issue, appropriate methods to accurately estimate the robustness of treatment plans are still lacking. To fill this gap we present Polynomial Chaos Expansion (PCE) techniques which are easily applicable and create a meta-model of the dose engine by approximating the dose in every voxel with multidimensional polynomials. This Polynomial Chaos (PC) model can be built in an automated fashion relatively cheaply and subsequently it can be used to perform comprehensive robustness analysis. We adapted PC to provide among others the expected dose, the dose variance, accurate probability distribution of dose-volume histogram (DVH) metrics (e.g. minimum tumor or maximum organ dose), exact bandwidths of DVHs, and to separate the effects of random and systematic errors. We present the outcome of our verification experiments based on 6 head-and-neck (HN) patients, and exemplify the usefulness of PCE by comparing a robust and a non-robust treatment plan for a selected HN case. The results suggest that PCE is highly valuable for both research and clinical applications.
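
    The meta-modelling idea can be illustrated in one dimension: expand a stand-in "dose engine" (a made-up quadratic in a normalized Gaussian setup error) in probabilists' Hermite polynomials, so the expansion coefficients give the expected dose and dose variance directly. A minimal sketch, not the authors' implementation:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coeffs(f, order):
    """Project f onto probabilists' Hermite polynomials He_k, which are
    orthogonal for a standard-normal input, via Gauss-Hermite quadrature."""
    x, w = hermegauss(order + 1)
    w = w / np.sqrt(2.0 * np.pi)              # weights now sum to 1
    fx = f(x)
    return np.array([np.sum(w * fx * hermeval(x, [0.0] * k + [1.0]))
                     / math.factorial(k)
                     for k in range(order + 1)])

# Stand-in "dose engine": dose in one voxel vs. a normalized setup error.
dose = lambda e: 60.0 - 3.0 * e + 0.5 * e ** 2

c = pce_coeffs(dose, order=4)
mean_dose = c[0]                                          # expected dose
var_dose = sum(c[k] ** 2 * math.factorial(k)              # dose variance
               for k in range(1, len(c)))
```

    Because the basis is orthogonal, the mean is the zeroth coefficient and the variance is a weighted sum of squared coefficients, so statistics come from the cheap surrogate rather than repeated dose-engine runs.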

  2. Fast and accurate sensitivity analysis of IMPT treatment plans using Polynomial Chaos Expansion.

    PubMed

    Perkó, Zoltán; van der Voort, Sebastian R; van de Water, Steven; Hartman, Charlotte M H; Hoogeman, Mischa; Lathouwers, Danny

    2016-06-21

    The highly conformal planned dose distribution achievable in intensity modulated proton therapy (IMPT) can severely be compromised by uncertainties in patient setup and proton range. While several robust optimization approaches have been presented to address this issue, appropriate methods to accurately estimate the robustness of treatment plans are still lacking. To fill this gap we present Polynomial Chaos Expansion (PCE) techniques which are easily applicable and create a meta-model of the dose engine by approximating the dose in every voxel with multidimensional polynomials. This Polynomial Chaos (PC) model can be built in an automated fashion relatively cheaply and subsequently it can be used to perform comprehensive robustness analysis. We adapted PC to provide among others the expected dose, the dose variance, accurate probability distribution of dose-volume histogram (DVH) metrics (e.g. minimum tumor or maximum organ dose), exact bandwidths of DVHs, and to separate the effects of random and systematic errors. We present the outcome of our verification experiments based on 6 head-and-neck (HN) patients, and exemplify the usefulness of PCE by comparing a robust and a non-robust treatment plan for a selected HN case. The results suggest that PCE is highly valuable for both research and clinical applications. PMID:27227661

  3. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

    Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored. PMID:27444661
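
    Analysis by molecular counting can be sketched with a digital-assay toy model: partition the sample over many binary sensor elements and invert the Poisson loading statistics (all numbers below are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Digital counting sketch: the sample is partitioned over many single-
# molecule sensor elements; molecules land with Poisson statistics.
n_sensors = 10_000
lam_true = 0.4                      # assumed mean molecules per element

# Each element only reports on/off, so multiply-occupied elements
# undercount unless the Poisson loading is inverted.
occupied = rng.poisson(lam_true, n_sensors) > 0
f_on = occupied.mean()
lam_hat = -np.log(1.0 - f_on)       # Poisson inversion
```

    The counting readout makes the measurement inherently quantitative: concentration follows from the occupied fraction, not from an analog intensity calibration.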

  4. Novel methods for accurate identification, isolation, and genomic analysis of symptomatic microenvironments in atherosclerotic arteries.

    PubMed

    Slevin, Mark; Baldellou, Maribel; Hill, Elspeth; Alexander, Yvonne; McDowell, Garry; Murgatroyd, Christopher; Carroll, Michael; Degens, Hans; Krupinski, Jerzy; Rovira, Norma; Chowdhury, Mohammad; Serracino-Inglott, Ferdinand; Badimon, Lina

    2014-01-01

    A challenge facing surgeons is identification and selection of patients for carotid endarterectomy or coronary artery bypass/surgical intervention. While some patients with atherosclerosis develop unstable plaques liable to undergo thrombosis, others form more stable plaques and are asymptomatic. Identifying the cellular signaling mechanisms that produce the inflammatory, hemorrhagic lesions of mature heterogenic plaques will significantly improve our understanding of the microenvironmental differences that give rise to regions susceptible to rupture and thrombosis; it may also help to predict the risk of plaque rupture and direct surgical intervention to the patients who will benefit most. Here, we demonstrate detailed and novel methodologies for successful and, more importantly, accurate and reproducible extraction, sampling, and analysis of micro-regions in stable and unstable coronary/carotid arteries. This information can be applied to samples from other origins and so should be useful for scientists working with micro-isolation techniques in all fields of biomedical science. PMID:24510873

  5. Paleopedological reconstruction and quantitative analysis of weathering processes in the Southern Piedmont Province

    SciTech Connect

    Feldman, S.B.; Zelazny, L.W.; Pavich, M.J.

    1992-01-01

    Soils and paleosols are commonly used to estimate ages of deposits and geomorphic surfaces, and to infer paleoenvironmental conditions during pedogenesis. Accurate interpretation of these and other parameters is currently limited, however, by considerable uncertainty in many fundamental areas of soils-geomorphic research. These include: (1) lack of accurate estimates of weathering rates for reliably-dated surfaces, (2) inability to quantitatively differentiate between the complex effects of climate vs. geomorphic age on weathering rates, processes, and pedogenic properties, and (3) difficulty in assessing which soil properties persist, alter, or become obliterated in the weathering environment as conditions change. In this paper, the authors discuss a method for assessing, on a regional basis, the quantitative relationships between climate, time, and weathering processes along a soil climosequence in the Southern Piedmont Province. Their approach involves sampling exclusively in areas of granitic plutons that exhibit a high degree of homogeneity with regard to total Fe content, bulk mineralogy, and absence of secondary phyllosilicates or sesquioxides. Independent age control is being established by (10)Be dating, and analytical techniques include, in part, (1) geochemical speciation of soil solution and mineral equilibrium determination, (2) elemental analysis and mass balance calculations of elemental flux during pedogenesis, and (3) detailed analysis of Fe-oxide crystallinity, structure, and Al substitution using selective dissolution analysis, and both X-ray and differential X-ray diffraction.
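
    The mass-balance step (2) is commonly expressed through an open-system mass-transport coefficient computed against an immobile index element. A minimal sketch with invented concentrations (this is the standard Brimhall-style formulation, not necessarily the authors' exact notation):

```python
def tau(cj_soil, ci_soil, cj_parent, ci_parent):
    """Mass-transport coefficient of a mobile element j relative to an
    immobile index element i.  tau < 0: net loss of j during pedogenesis;
    tau > 0: net gain.  Concentrations in consistent units (e.g. wt%)."""
    return (cj_soil * ci_parent) / (cj_parent * ci_soil) - 1.0

# e.g. Ca (mobile) vs. Ti (immobile) in a weathered granite profile;
# the four concentrations are purely illustrative.
ca_loss = tau(cj_soil=0.4, ci_soil=0.6, cj_parent=1.6, ci_parent=0.4)
```

    Normalizing to an immobile element separates true elemental flux from apparent concentration changes caused by volume collapse or dilation during weathering.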

  6. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  7. Quantitative analysis of chaotic synchronization by means of coherence

    NASA Astrophysics Data System (ADS)

    Shabunin, A.; Astakhov, V.; Kurths, J.

    2005-07-01

    We use an index of chaotic synchronization based on the averaged coherence function for the quantitative analysis of the process of the complete synchronization loss in unidirectionally coupled oscillators and maps. We demonstrate that this value manifests different stages of the synchronization breaking. It is invariant to time delay and insensitive to small noise and distortions, which can influence the accessible signals during measurement. Peculiarities of the synchronization destruction in maps and oscillators are investigated.
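
    A coherence-based index of this kind can be sketched in a few lines of numpy (Welch-style segment averaging; the test signals and the power weighting are assumptions for illustration, not the authors' exact definition):

```python
import numpy as np

rng = np.random.default_rng(0)

def coherence_index(x, y, nperseg=1024):
    """Power-weighted average of magnitude-squared coherence,
    estimated with Welch-style segment averaging."""
    n = (len(x) // nperseg) * nperseg
    X = np.fft.rfft(x[:n].reshape(-1, nperseg), axis=1)
    Y = np.fft.rfft(y[:n].reshape(-1, nperseg), axis=1)
    Pxy = (X.conj() * Y).mean(axis=0)          # averaged cross-spectrum
    Pxx = (np.abs(X) ** 2).mean(axis=0)
    Pyy = (np.abs(Y) ** 2).mean(axis=0)
    Cxy = np.abs(Pxy) ** 2 / (Pxx * Pyy)
    return np.sum(Cxy * Pxx) / np.sum(Pxx)     # weight by signal power

fs = 1000.0
t = np.arange(0.0, 20.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50.0 * t) + 0.1 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 50.0 * t + 0.8) + 0.1 * rng.standard_normal(t.size)

idx_sync = coherence_index(x, y)    # stays near 1: a pure delay does not lower it
idx_noise = coherence_index(x, rng.standard_normal(t.size))   # near 0
```

    Because magnitude-squared coherence discards phase, the index is unchanged by a time delay between the two channels, which is the invariance property the abstract highlights.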

  8. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
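
    Extracting the two normal-mode frequencies from an acceleration record reduces to locating the dominant spectral lines; the record below is synthetic, with assumed mode frequencies and amplitudes:

```python
import numpy as np

# Synthetic accelerometer record of one glider in a coupled pair: a mix of
# the symmetric and asymmetric normal modes (frequencies are assumed).
fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
f_sym, f_asym = 0.8, 1.2                      # Hz
a = np.cos(2 * np.pi * f_sym * t) + 0.6 * np.cos(2 * np.pi * f_asym * t)

# The two dominant lines of the amplitude spectrum recover both modes.
spec = np.abs(np.fft.rfft(a))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
modes = np.sort(freqs[np.argsort(spec)[-2:]])
```

    A 60 s record gives a frequency resolution of 1/60 Hz, ample for separating modes a few tenths of a hertz apart, which is why phone sensors suffice in an introductory lab.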

  9. Analysis and accurate quantification of CpG methylation by MALDI mass spectrometry

    PubMed Central

    Tost, Jörg; Schatz, Philipp; Schuster, Matthias; Berlin, Kurt; Gut, Ivo Glynne

    2003-01-01

    As the DNA sequence of the human genome is now nearly finished, the main task of genome research is to elucidate gene function and regulation. DNA methylation is of particular importance for gene regulation and is strongly implicated in the development of cancer. Even minor changes in the degree of methylation can have severe consequences. An accurate quantification of the methylation status at any given position of the genome is a powerful diagnostic indicator. Here we present the first assay for the analysis and precise quantification of methylation on CpG positions in simplex and multiplex reactions based on matrix-assisted laser desorption/ionisation mass spectrometry detection. Calibration curves for CpGs in two genes were established and an algorithm was developed to account for systematic fluctuations. Regression analysis gave R2 ≥ 0.99 and standard deviation around 2% for the different positions. The limit of detection was ∼5% for the minor isomer. Calibrations showed no significant differences when carried out as simplex or multiplex analyses. All variable parameters were thoroughly investigated, several paraffin-embedded tissue biopsies were analysed and results were verified by established methods like analysis of cloned material. Mass spectrometric results were also compared to chip hybridisation. PMID:12711695
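
    The calibration-curve step can be sketched as an ordinary linear regression of measured signal ratio against known methylation fraction, with R² checked as in the abstract; the data points below are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented calibration points: known methylation fractions of mixed
# standards vs. the measured methylated/total signal ratio.
known = np.array([0.0, 0.05, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0])
measured = 0.97 * known + 0.01 + rng.normal(0.0, 0.005, known.size)

slope, intercept = np.polyfit(known, measured, 1)
resid = measured - (slope * known + intercept)
r2 = 1.0 - resid.var() / measured.var()       # goodness of the calibration

def methylation(signal):
    """Read an unknown sample off the calibration line."""
    return (signal - intercept) / slope
```

    Inverting the fitted line converts a raw instrument ratio into an absolute methylation fraction, which is what makes the assay quantitative rather than merely comparative.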

  10. Accurate optical analysis of single-molecule entrapment in nanoscale vesicles.

    PubMed

    Reiner, Joseph E; Jahn, Andreas; Stavis, Samuel M; Culbertson, Michael J; Vreeland, Wyatt N; Burden, Daniel L; Geist, Jon; Gaitan, Michael

    2010-01-01

    We present a nondestructive method to accurately characterize low analyte concentrations (0-10 molecules) in nanometer-scale lipid vesicles. Our approach is based on the application of fluorescence fluctuation analysis (FFA) and multiangle laser light scattering (MALLS) in conjunction with asymmetric field flow fractionation (AFFF) to measure the entrapment efficiency (the ratio of the concentration of encapsulated dye to the initial bulk concentration) of an ensemble of liposomes with an average diameter less than 100 nm. Water-soluble sulforhodamine B (SRB) was loaded into the aqueous interior of nanoscale liposomes synthesized in a microfluidic device. A confocal microscope was used to detect a laser-induced fluorescence signal resulting from both encapsulated and unencapsulated SRB molecules. The first two cumulants of this signal along with the autocorrelation function (ACF) were used to quantify liposome entrapment efficiency. Our analysis moves beyond typical, nonphysical assumptions of equal liposome size and brightness. These advances are essential for characterizing liposomes in the single-molecule encapsulation regime. Our work has further analytical impact because it could increase the interrogation time of free-solution molecular analysis by an order of magnitude and form the basis for the development of liposome standard reference materials. PMID:19950933
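
    The use of the first two cumulants can be sketched with a toy fluctuation model (per-molecule brightness and mean occupancy below are invented, and detector noise is ignored):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: each vesicle entraps a Poisson number of dye molecules;
# its detected signal is (molecules x per-molecule brightness).
brightness = 50.0                    # assumed counts per molecule
lam = 3.0                            # assumed mean molecules per vesicle
signal = brightness * rng.poisson(lam, 100_000)

# The first two cumulants recover both unknowns (Poisson: variance = mean):
#   k1 = brightness * lam,   k2 = brightness**2 * lam
k1, k2 = signal.mean(), signal.var()
brightness_hat = k2 / k1
lam_hat = k1 / brightness_hat        # mean molecules per vesicle
```

    Mean intensity alone cannot separate "few bright" from "many dim" emitters; adding the second cumulant resolves that ambiguity, which is the core of fluctuation analysis.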

  11. Can CO2 assimilation in maize leaves be predicted accurately from chlorophyll fluorescence analysis?

    PubMed

    Edwards, G E; Baker, N R

    1993-08-01

    Analysis is made of the energetics of CO2 fixation, the photochemical quantum requirement per CO2 fixed, and sinks for utilising reductive power in the C4 plant maize. CO2 assimilation is the primary sink for energy derived from photochemistry, whereas photorespiration and nitrogen assimilation are relatively small sinks, particularly in developed leaves. Measurement of O2 exchange by mass spectrometry and CO2 exchange by infrared gas analysis under varying levels of CO2 indicate that there is a very close relationship between the true rate of O2 evolution from PS II and the net rate of CO2 fixation. Consideration is given to measurements of the quantum yields of PS II (φPSII) from fluorescence analysis and of CO2 assimilation (φCO2) in maize over a wide range of conditions. The φPSII/φCO2 ratio was found to remain reasonably constant (ca. 12) over a range of physiological conditions in developed leaves, with varying temperature, CO2 concentrations, light intensities (from 5% to 100% of full sunlight), and following photoinhibition under high light and low temperature. A simple model for predicting CO2 assimilation from fluorescence parameters is presented and evaluated. It is concluded that under a wide range of conditions fluorescence parameters can be used to predict CO2 assimilation rates in maize accurately and rapidly. PMID:24317706
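
    A minimal version of such a predictive model, assuming only the roughly constant ratio of about 12 between the PSII and CO2 quantum yields reported in the abstract (and glossing over light-absorption details such as incident vs. absorbed flux):

```python
def co2_assimilation(phi_psii, ppfd, ratio=12.0):
    """Predict CO2 assimilation (umol m-2 s-1) in maize.

    phi_psii -- PSII operating efficiency from fluorescence analysis
    ppfd     -- photon flux density (umol m-2 s-1)
    ratio    -- assumed constant ratio of PSII to CO2 quantum yields (~12)
    """
    phi_co2 = phi_psii / ratio       # quantum yield of CO2 fixation
    return phi_co2 * ppfd
```

    The appeal of the approach is speed: a fluorescence reading plus a light measurement replaces a full gas-exchange measurement once the ratio has been validated.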

  12. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting ahead of the reactor a small column packed with a mixed bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described. PMID:8905629

  13. Quantitative Motion Analysis in Two and Three Dimensions.

    PubMed

    Wessels, Deborah J; Lusche, Daniel F; Kuhl, Spencer; Scherer, Amanda; Voss, Edward; Soll, David R

    2016-01-01

    This chapter describes 2D quantitative methods for motion analysis as well as 3D motion analysis and reconstruction methods. Emphasis is placed on the analysis of dynamic cell shape changes that occur through extension and retraction of force generating structures such as pseudopodia and lamellipodia. Quantitative analysis of these structures is an underutilized tool in the field of cell migration. Our intent, therefore, is to present methods that we developed in an effort to elucidate mechanisms of basic cell motility, directed cell motion during chemotaxis, and metastasis. We hope to demonstrate how application of these methods can more clearly define alterations in motility that arise due to specific mutations or disease and hence, suggest mechanisms or pathways involved in normal cell crawling and treatment strategies in the case of disease. In addition, we present a 4D tumorigenesis model for high-resolution analysis of cancer cells from cell lines and human cancer tissue in a 3D matrix. Use of this model led to the discovery of the coalescence of cancer cell aggregates and unique cell behaviors not seen in normal cells or normal tissue. Graphic illustrations to visually display and quantify cell shape are presented along with algorithms and formulae for calculating select 2D and 3D motion analysis parameters. PMID:26498790

  14. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    PubMed Central

    2012-01-01

    Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tissue perfusion quantitatively might be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple level thresholding and pixel verification techniques. Threshold levels are selected using histogram information from a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results Sublingual microcirculatory videos were recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings were analyzed visually, and the functional capillary density (FCD) values calculated by the algorithm were compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD values were calculated
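
    Once the vessels with flow have been segmented, the functional capillary density reduces to perfused vessel length per image area; a toy computation with an invented binary mask and pixel size:

```python
import numpy as np

# Toy binary mask of vessels with flow (vessels drawn 1 px wide, so the
# pixel count approximates centre-line length); the geometry is invented.
mask = np.zeros((100, 100), dtype=bool)
mask[50, 10:90] = True               # one horizontal capillary
mask[20:80, 30] = True               # one vertical capillary

px_mm = 0.005                        # assumed pixel size in mm

# Functional capillary density: perfused vessel length per image area.
length_mm = mask.sum() * px_mm
area_mm2 = mask.size * px_mm ** 2
fcd = length_mm / area_mm2           # mm of perfused vessel per mm^2
```

    A real pipeline would skeletonize the segmented vessels before counting pixels; the ratio itself is the quantity tracked between baseline and hemorrhage.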

  15. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    PubMed

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to better understanding of developmental biology and engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatiotemporal pattern kinetic differences in ESC aggregates of different sizes. PMID:26095427
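
    The image-to-network conversion at the start of such a pipeline can be sketched as thresholding pairwise centroid distances and reading off simple graph features (the cell positions and interaction radius below are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented stand-in for a segmented image: 2-D centroids of 200 cells.
pts = rng.uniform(0.0, 100.0, size=(200, 2))

# Cell-contact network: an edge wherever two centroids lie within an
# assumed interaction radius.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
adj = (d < 12.0) & ~np.eye(len(pts), dtype=bool)

# A few network features usable as one point of a pattern descriptor;
# tracking them over time yields a morphogenic trajectory.
degree = adj.sum(axis=1)
features = {
    "mean_degree": degree.mean(),
    "edges": int(adj.sum()) // 2,
    "isolated_cells": int((degree == 0).sum()),
}
```

    Because the descriptor depends only on the network, not on imaging modality or simulation details, the same features can be compared across simulations, stem-cell aggregates, and embryos, which is the portability the abstract emphasizes.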

  16. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  17. Accurate Quantification of High Density Lipoprotein Particle Concentration by Calibrated Ion Mobility Analysis

    PubMed Central

    Hutchins, Patrick M.; Ronsein, Graziella E.; Monette, Jeffrey S.; Pamir, Nathalie; Wimberger, Jake; He, Yi; Anantharamaiah, G.M.; Kim, Daniel Seung; Ranchalis, Jane E.; Jarvik, Gail P.; Vaisar, Tomas; Heinecke, Jay W.

    2015-01-01

    Background It is critical to develop new metrics to determine whether high density lipoprotein (HDL) is cardioprotective in humans. One promising approach is HDL particle concentration (HDL-P) – the size and concentration of HDL in plasma or serum. However, the two methods currently used to determine HDL-P yield concentrations that differ more than 5-fold. We therefore developed and validated an improved approach to quantify HDL-P, termed calibrated ion mobility analysis (calibrated IMA). Methods HDL was isolated from plasma by ultracentrifugation, introduced into the gas phase with electrospray ionization, separated by size, and quantified by particle counting. A calibration curve constructed with purified proteins was used to correct for the ionization efficiency of HDL particles. Results The concentrations of gold nanoparticles and reconstituted HDLs measured by calibrated IMA were indistinguishable from concentrations determined by orthogonal methods. In plasma of control (n=40) and cerebrovascular disease (n=40) subjects, three subspecies of HDL were reproducibly measured, with an estimated total HDL-P of 13.4±2.4 µM (mean±SD). HDL-C accounted for 48% of the variance in HDL-P. HDL-P was significantly lower in subjects with cerebrovascular disease, and this difference remained significant after adjustment for HDL cholesterol levels. Conclusions Calibrated IMA accurately and reproducibly determined the concentration of gold nanoparticles and synthetic HDL, strongly suggesting the method could accurately quantify HDL particle concentration. Importantly, the estimated stoichiometry of apoA-I determined by calibrated IMA was 3–4 per HDL particle, in excellent agreement with current structural models. Furthermore, HDL-P associated with cardiovascular disease status in a clinical population independently of HDL cholesterol. PMID:25225166

  18. Enantiomeric separation in comprehensive two-dimensional gas chromatography with accurate mass analysis.

    PubMed

    Chin, Sung-Tong; Nolvachai, Yada; Marriott, Philip J

    2014-11-01

    Chiral comprehensive two-dimensional gas chromatography (eGC×GC) coupled to quadrupole-accurate mass time-of-flight mass spectrometry (QTOFMS) was evaluated for its capability to report the chiral composition of several monoterpenes, namely, α-pinene, β-pinene, and limonene in cardamom oil. Enantiomers in a standard mixture were fully resolved by direct enantiomeric-GC analysis with a 2,3-di-O-methyl-6-t-butylsilyl derivatized β-cyclodextrin phase; however, the (+)-(R)-limonene enantiomer in cardamom oil was overlapped with other background components including cymene and cineole. Verification of (+)-(R)-limonene components based on characteristic ions at m/z 136, 121, and 107 acquired by chiral single-dimension GC-QTOFMS in the alternate MS/MSMS mode of operation was unsuccessful due to similar parent/daughter ions generated by interfering or co-eluting cymene and cineole. Column phases SUPELCOWAX, SLB-IL111, HP-88, and SLB-IL59, were incorporated as the second dimension column ((2)D) in chiral GC×GC analysis; the SLB-IL59 offered the best resolution for the tested monoterpene enantiomers from the matrix background. Enantiomeric ratios for α-pinene, β-pinene, and limonene were determined to be 1.325, 2.703, and 1.040, respectively, in the cardamom oil sample based on relative peak area data. PMID:24420979

  19. Accurate airway segmentation based on intensity structure analysis and graph-cut

    NASA Astrophysics Data System (ADS)

    Meng, Qier; Kitasaka, Takayuki; Nimura, Yukitaka; Oda, Masahiro; Mori, Kensaku

    2016-03-01

    This paper presents a novel airway segmentation method based on intensity structure analysis and graph-cut. Airway segmentation is an important step in analyzing chest CT volumes for computerized lung cancer detection, emphysema diagnosis, asthma diagnosis, and pre- and intra-operative bronchoscope navigation. However, obtaining a complete 3-D airway tree structure from a CT volume is quite challenging. Several researchers have proposed automated algorithms, mostly based on region growing and machine learning techniques; however, these methods fail to detect the peripheral bronchial branches and cause a large amount of leakage. This paper presents a novel approach that permits more accurate extraction of complex bronchial airway regions. Our method is composed of three steps. First, Hessian analysis is utilized to enhance line-like structures in CT volumes; then a multiscale cavity-enhancement filter is employed to detect cavity-like structures in the enhanced result. In the second step, we utilize a support vector machine (SVM) to construct a classifier for removing the false-positive (FP) regions generated. Finally, the graph-cut algorithm is utilized to connect all of the candidate voxels to form an integrated airway tree. We applied this method to sixteen 3-D chest CT volumes. The results showed that the branch detection rate of this method can reach about 77.7% without leaking into the lung parenchyma areas.

  20. Phase-function normalization for accurate analysis of ultrafast collimated radiative transfer.

    PubMed

    Hunter, Brian; Guo, Zhixiong

    2012-04-20

    The scattering of radiation from collimated irradiation is accurately treated via normalization of phase function. This approach is applicable to any numerical method with directional discretization. In this study it is applied to the transient discrete-ordinates method for ultrafast collimated radiative transfer analysis in turbid media. A technique recently developed by the authors, which conserves a phase-function asymmetry factor as well as scattered energy for the Henyey-Greenstein phase function in steady-state diffuse radiative transfer analysis, is applied to the general Legendre scattering phase function in ultrafast collimated radiative transfer. Heat flux profiles in a model tissue cylinder are generated for various phase functions and compared to those generated when normalization of the collimated phase function is neglected. Energy deposition in the medium is also investigated. Lack of conservation of scattered energy and the asymmetry factor for the collimated scattering phase function causes overpredictions in both heat flux and energy deposition for highly anisotropic scattering media. In addition, a discussion is presented to clarify the time-dependent formulation of divergence of radiative heat flux. PMID:22534933
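
    A reduced sketch of the idea: evaluate the Henyey-Greenstein phase function on a discrete-ordinate quadrature and renormalize so that scattered energy is conserved exactly on the discrete directions. (The authors' technique additionally conserves the asymmetry factor; this sketch renormalizes energy only.)

```python
import numpy as np

def henyey_greenstein(mu, g):
    """HG phase function of the scattering cosine mu, asymmetry factor g."""
    return (1.0 - g * g) / (1.0 + g * g - 2.0 * g * mu) ** 1.5

# Discrete ordinates: Gauss-Legendre quadrature in the scattering cosine.
mu, w = np.polynomial.legendre.leggauss(16)
g = 0.9                                   # strongly forward-peaked medium

p = henyey_greenstein(mu, g)
energy = 0.5 * np.sum(w * p)              # equals 1 only in the continuum
p_norm = p / energy                       # enforce energy conservation exactly
```

    For highly anisotropic phase functions the raw quadrature sum drifts from unity, which is exactly the non-conservation that produces the heat-flux overpredictions discussed in the abstract.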

  1. Accurate measurement of bromine contents in plastic samples by instrumental neutron activation analysis.

    PubMed

    Kim, I J; Lee, K S; Hwang, E; Min, H S; Yim, Y H

    2013-03-26

    Accurate measurements of bromine contents in plastic samples were made by the direct comparator instrumental neutron activation analysis (INAA). Individual factors affecting the measurements were comprehensively evaluated and compensated, including the volatility loss of bromine from standard comparators, the background bromine level in the filter papers used for preparation of the standard comparators, nuclear interference, γ-ray spectral interference and the variance among replicates of the samples. Uncertainty contributions from those factors were thoroughly evaluated and included in the uncertainty budgeting of the INAA measurement. (81)Br was chosen as the target isotope, and the INAA measurements for bromine were experimentally confirmed to exhibit good linearity within a bromine content range of 10-170 μg. The established method has been applied to the analysis of eight plastic samples: four commercially available certified reference materials (CRMs) of polyethylene and polystyrene and four acrylonitrile butadiene styrene (ABS) samples prepared as the candidate reference materials (KRISS CRM 113-01-012, -013, -014 and -015). The bromine contents of the samples were calculated at three different γ-ray energies and compared, showing good agreement. The results of the four CRMs also showed good consistency with their certified values within the stated uncertainties. Finally, the bromine contents of the ABS samples were determined with expanded uncertainties (at a 95% level of confidence) between 2.5% and 5% in a bromine content range of 25-900 mg kg(-1). PMID:23498117
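    The uncertainty budgeting described, combining independent contributions into an expanded uncertainty at a 95% level of confidence, follows the usual root-sum-of-squares rule. The component values below are invented placeholders for illustration, not the paper's figures.

```python
import math

# Hypothetical relative standard uncertainties (%) for the factors the
# study evaluates.
components = {
    "Br volatility loss from comparators": 1.2,
    "filter-paper Br background":          0.8,
    "gamma-ray spectral interference":     0.6,
    "variance among sample replicates":    1.5,
}
u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined standard uncertainty
U = 2.0 * u_c                                            # expanded uncertainty, k=2 (~95%)
```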

  2. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  3. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  4. Microcomputer-based digital image analysis system for quantitative autoradiography

    SciTech Connect

    Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1988-01-01

    A computerized image processing system utilizing an IBM-XT personal microcomputer with the capability of performing quantitative cerebral autoradiography is described. All of the system components are standard computer and optical hardware that can be easily assembled. The system has 512 horizontal by 512 vertical axis resolution with 8 bits per pixel (256 gray levels). Unlike other dedicated image processing systems, the IBM-XT permits the assembly of an efficient, low-cost image analysis system without sacrificing other capabilities of the IBM personal computer. The application of this system in both qualitative and quantitative autoradiography has been the principal factor in developing a new radiopharmaceutical to measure regional cerebral blood flow.

  5. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  6. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results

    PubMed Central

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-01-01

    Abstract. An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images by using commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas on the dynamic images to form velocity maps. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movements. The present method has potential as an additional functional examination in chest radiography. PMID:26158097

  7. Quantitative analysis of astrogliosis in drug-dependent humans.

    PubMed

    Weber, Marco; Scherf, Nico; Kahl, Thomas; Braumann, Ulf-Dietrich; Scheibe, Patrick; Kuska, Jens-Peer; Bayer, Ronny; Büttner, Andreas; Franke, Heike

    2013-03-15

    Drug addiction is a chronic, relapsing disease caused by neurochemical and molecular changes in the brain. In this human autopsy study qualitative and quantitative changes of glial fibrillary acidic protein (GFAP)-positive astrocytes in the hippocampus of 26 lethally intoxicated drug addicts and 35 matched controls are described. The morphological characterization of these cells reflected alterations representative for astrogliosis. However, neither quantification of GFAP-positive cells nor Western blot analysis indicated statistically significant differences between drug fatalities and controls. However, by semi-quantitative scoring a significant shift towards higher numbers of activated astrocytes in the drug group was detected. To assess morphological changes quantitatively, graph-based representations of astrocyte morphology were obtained from single cell images captured by confocal laser scanning microscopy. Their underlying structures were used to quantify changes in astroglial fibers in an automated fashion. This morphometric analysis yielded significant differences between the investigated groups for four different measures of fiber characteristics (Euclidean distance, graph distance, number of graph elements, fiber skeleton distance), indicating that, e.g., astrocytes in drug addicts on average exhibit significant elongation of fiber structures as well as a two-fold increase in GFAP-positive fibers as compared with those in controls. In conclusion, the present data show characteristic differences in morphology of hippocampal astrocytes in drug addicts versus controls and further support the involvement of astrocytes in the human pathophysiology of drug addiction. The automated quantification of astrocyte morphologies provides a novel, testable way to assess the fiber structures in a quantitative manner as opposed to standard, qualitative descriptions. PMID:23337617

  8. Bayesian Shrinkage Analysis of Quantitative Trait Loci for Dynamic Traits

    PubMed Central

    Yang, Runqing; Xu, Shizhong

    2007-01-01

    Many quantitative traits are measured repeatedly during the life of an organism. Such traits are called dynamic traits. The pattern of the changes of a dynamic trait is called the growth trajectory. Studying the growth trajectory may enhance our understanding of the genetic architecture of the growth trajectory. Recently, we developed an interval-mapping procedure to map QTL for dynamic traits under the maximum-likelihood framework. We fit the growth trajectory by Legendre polynomials. The method was intended to map one QTL at a time, and the entire QTL analysis involved scanning the entire genome by fitting multiple single-QTL models. In this study, we propose a Bayesian shrinkage analysis for estimating and mapping multiple QTL in a single model. The method is a combination of the shrinkage mapping for individual quantitative traits and the Legendre polynomial analysis for dynamic traits. The multiple-QTL model is implemented in two ways: (1) a fixed-interval approach where a QTL is placed in each marker interval and (2) a moving-interval approach where the position of a QTL can be searched in a range that covers many marker intervals. A simulation study shows that the Bayesian shrinkage method generates much better signals for QTL than the interval-mapping approach. We propose several alternative methods to present the results of the Bayesian shrinkage analysis. In particular, we found that the Wald test-statistic profile can serve as a mechanism to test the significance of a putative QTL. PMID:17435239
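    Fitting the growth trajectory with Legendre polynomials, as in the framework above, reduces each dynamic trait to a handful of regression coefficients. A minimal sketch with simulated measurements (all values invented):

```python
import numpy as np
from numpy.polynomial import legendre

t = np.linspace(0.0, 1.0, 10)                # measurement ages
x = 2.0*t - 1.0                              # map onto [-1, 1], Legendre's domain
rng = np.random.default_rng(0)
trait = 5.0 + 30.0*t - 12.0*t**2 + rng.normal(0.0, 0.3, t.size)

coef = legendre.legfit(x, trait, deg=3)      # least-squares Legendre coefficients
fitted = legendre.legval(x, coef)
```

    In the QTL model it is these coefficients, rather than the raw repeated measurements, that the shrinkage-estimated QTL effects act on.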

  9. NEW TARGET AND CONTROL ASSAYS FOR QUANTITATIVE POLYMERASE CHAIN REACTION (QPCR) ANALYSIS OF ENTEROCOCCI IN WATER

    EPA Science Inventory

    Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture-based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analysis by this method requires highly...

  10. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
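    The pseudometric idea can be made concrete with the simplest of the output functions, the distribution of density. The sketch below (an illustration, not the authors' construction) takes the total-variation distance between two maps' density histograms; two different maps with identical histograms sit at distance zero, which is precisely why this is a pseudometric rather than a metric.

```python
import numpy as np

def density_distance(map_a, map_b, bins=32):
    """Pseudometric between two maps: total-variation distance between
    their normalized density histograms over a common value range."""
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    pa, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    pb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    return 0.5 * np.sum(np.abs(pa - pb)) * (hi - lo) / bins
```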

  11. A quantitative analysis of IRAS maps of molecular clouds

    NASA Astrophysics Data System (ADS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-11-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  12. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options of landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of the public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented with five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrences and magnitude of impacts; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of the ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised to be implemented, after which a long-term operating contract may follow. PMID:27354014
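    Step (v), attaching quantitative probabilities and impacts to the reduced risk matrix, amounts to an expected-loss calculation per scenario. The risk elements and numbers below are hypothetical placeholders, not the study's data.

```python
# Hypothetical reduced risk matrix for one implementation scenario:
# element -> (probability of occurrence, expected impact in EUR).
risks = {
    "construction delay": (0.30, 2.0e6),
    "tariff shortfall":   (0.25, 3.5e6),
    "operator default":   (0.10, 5.0e6),
}
expected_loss = sum(p * impact for p, impact in risks.values())
```

    Repeating this for each scenario and comparing the totals, alongside non-monetary criteria, supports the kind of option ranking the study performs.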

  13. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to clinical environment. These limitations enforce an urgent need for the creation of a new system to quantify facial movement and allow for an easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assess complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic) making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664

  14. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
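    The core of a pseudo-predator simulation, bootstrap-sampling prey signatures and mixing them under a known diet, can be sketched as below. This is a simplified stand-in: calibration coefficients, fatty-acid subsetting, and the diet-estimation step itself are omitted, and all names are illustrative.

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    """Build one pseudo-predator fatty-acid signature with a known diet.

    prey_sigs: dict of prey type -> (n_animals, n_fatty_acids) array,
               each row a signature summing to 1.
    diet:      dict of prey type -> true diet proportion (sums to 1).
    n_boot:    bootstrap sample size drawn from each prey type.
    """
    sig = 0.0
    for prey, p in diet.items():
        sample = prey_sigs[prey]
        idx = rng.integers(0, sample.shape[0], size=n_boot)  # with replacement
        sig = sig + p * sample[idx].mean(axis=0)             # mix mean signatures
    return sig / sig.sum()                                   # renormalize
```

    The bootstrap sample size n_boot is exactly the quantity the paper's algorithm chooses objectively rather than arbitrarily.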

  15. Quantitative analysis of volatiles in edible oils following accelerated oxidation using broad spectrum isotope standards.

    PubMed

    Gómez-Cortés, Pilar; Sacks, Gavin L; Brenna, J Thomas

    2015-05-01

    Analyses of food volatiles generated by processing are widely reported, but comparison across studies is challenging, in part because most reports are inherently semi-quantitative for most analytes due to limited availability of chemical standards. We recently introduced a novel strategy for creation of broad spectrum isotopic standards for accurate quantitative food chemical analysis. Here we apply the principle to quantification of 25 volatiles in seven thermally oxidised edible oils. After extended oxidation, total volatiles of high n-3 oils (flax, fish, cod liver) were 120-170 mg/kg while low n-3 vegetable oils were <50 mg/kg. Separate experiments on thermal degradation of d5-ethyl linolenate indicate that off-aroma volatiles originate throughout the n-3 molecule and not solely the n-3 terminal end. These data represent the first report using broad-spectrum isotopically labelled standards for quantitative characterisation of processing-induced volatile generation across related foodstuffs, and verify the origin of specific volatiles from parent n-3 fatty acids. PMID:25529686

  16. Quantitative analysis of volatiles in edible oils following accelerated oxidation using broad spectrum isotope standards

    PubMed Central

    Gómez-Cortés, Pilar; Sacks, Gavin L.; Brenna, J. Thomas

    2014-01-01

    Analyses of food volatiles generated by processing are widely reported, but comparison across studies is challenging, in part because most reports are inherently semi-quantitative for most analytes due to limited availability of chemical standards. We recently introduced a novel strategy for creation of broad spectrum isotopic standards for accurate quantitative food chemical analysis. Here we apply the principle to quantification of 25 volatiles in seven thermally oxidized edible oils. After extended oxidation, total volatiles of high n-3 oils (flax, fish, cod liver) were 120-170 mg/kg while low n-3 vegetable oils were <50 mg/kg. Separate experiments on thermal degradation of d5-ethyl linolenate indicate that off-aroma volatiles originate throughout the n-3 molecule and not solely the n-3 terminal end. These data represent the first report using broad-spectrum isotopically labeled standards for quantitative characterization of processing-induced volatile generation across related foodstuffs, and verify the origin of specific volatiles from parent n-3 fatty acids. PMID:25529686

  17. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope-labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom. Rev. 34:148–165, 2015. PMID:24889823

  18. Phosphorylation-Specific MS/MS Scoring for Rapid and Accurate Phosphoproteome Analysis

    PubMed Central

    Payne, Samuel H.; Yau, Margaret; Smolka, Marcus B.; Tanner, Stephen; Zhou, Huilin; Bafna, Vineet

    2008-01-01

    The promise of mass spectrometry as a tool for probing signal transduction is predicated on reliable identification of post-translational modifications. Phosphorylations are key mediators of cellular signaling, yet are hard to detect, partly because of unusual fragmentation patterns of phosphopeptides. In addition to being accurate, MS/MS identification software must be robust and efficient to deal with increasingly large spectral data sets. Here, we present a new scoring function for the Inspect software for identifying phosphorylated peptides from ion-trap tandem mass spectra, without the need for manual validation. The scoring function was modeled by learning fragmentation patterns from 7677 validated phosphopeptide spectra. We compare our algorithm against SEQUEST and X!Tandem on testing and training data sets. At a 1% false positive rate, Inspect identified the greatest total number of phosphorylated spectra, 13% more than SEQUEST and 39% more than X!Tandem. Spectra identified by Inspect tended to score better in several spectral quality measures. Furthermore, Inspect runs much faster than either SEQUEST or X!Tandem, making desktop phosphoproteomics feasible. Finally, we used our new models to reanalyze a corpus of 423,000 LTQ spectra acquired for a phosphoproteome analysis of Saccharomyces cerevisiae DNA damage and repair pathways and discovered 43% more phosphopeptides than the previous study. PMID:18563926

  19. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    SciTech Connect

    Thompson, A.P.; Swiler, L.P.; Trott, C.R.; Foiles, S.M.; Tucker, G.J.

    2015-03-15

    We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.
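    The fitting step, a weighted linear least-squares regression of reference energies on bispectrum components, can be sketched with synthetic stand-ins (the descriptor matrix, weights, and dimensions below are random placeholders, not real bispectrum data):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(200, 10))        # rows: configurations, cols: descriptors
beta_true = rng.normal(size=10)
E_qm = B @ beta_true + rng.normal(0.0, 1e-3, 200)   # "QM" training energies

w = np.full(200, 1.0)                 # per-configuration fitting weights
beta, *_ = np.linalg.lstsq(B * w[:, None], E_qm * w, rcond=None)
```

    The linearity assumption is what lets a SNAP-style potential be refit robustly and automatically as the QM training set grows.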

  20. Highly accurate retrieval method of Japanese document images through a combination of morphological analysis and OCR

    NASA Astrophysics Data System (ADS)

    Katsuyama, Yutaka; Takebe, Hiroaki; Kurokawa, Koji; Saitoh, Takahiro; Naoi, Satoshi

    2001-12-01

    We have developed a method that allows Japanese document images to be retrieved more accurately by using OCR character candidate information and a conventional plain text search engine. In this method, the document image is first recognized by normal OCR to produce text. Keyword areas are then estimated from the normal OCR produced text through morphological analysis. A lattice of candidate-character codes is extracted from these areas, and then character strings are extracted from the lattice using a word-matching method in noun areas and a K-th DP-matching method in undefined word areas. Finally, these extracted character strings are added to the normal OCR produced text to improve document retrieval accuracy when using a conventional plain text search engine. Experimental results from searches of 49 OHP sheet images revealed that our method has a high recall rate of 98.2%, compared to 90.3% with a conventional method using only normal OCR produced text, while requiring about the same processing time as normal OCR.
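    The idea of threading a keyword through a lattice of candidate-character codes can be sketched in a drastically simplified form; the actual system's morphological analysis, word matching, and K-th DP matching are not reproduced here.

```python
def keyword_in_lattice(lattice, keyword):
    """True if `keyword` can be read through the candidate lattice.

    lattice: one list of OCR candidate characters per text position.
    The keyword matches at offset i if each of its characters appears
    among the candidates at the corresponding position.
    """
    n, k = len(lattice), len(keyword)
    return any(
        all(keyword[j] in lattice[i + j] for j in range(k))
        for i in range(n - k + 1)
    )
```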

  1. Quantitative Analysis Of Cristobalite In The Presence Of Quartz

    NASA Astrophysics Data System (ADS)

    Totten, Gary A.

    1985-12-01

    The detection and quantitation of cristobalite in quartz is necessary to calculate threshold limit values (TLV) for free crystalline silica (FCS) as proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). The cristobalite standard used in this study was made by heating diatomaceous earth to the transition temperature for cristobalite. The potassium bromide (KBr) pellet method was used for the analysis. Potassium cyanide (KCN) was used as an internal standard. Samples ranged from 5% to 30% cristobalite in quartz. Precision for this method is within 2%.
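    Quantitation against an internal standard of this kind reduces to a linear calibration of the analyte-to-standard band ratio. The absorbance ratios below are invented for illustration only:

```python
import numpy as np

# Hypothetical band-intensity ratios (cristobalite band / KCN internal-
# standard band) for standards of known cristobalite content (%).
pct = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
ratio = np.array([0.11, 0.21, 0.30, 0.41, 0.50, 0.61])

slope, intercept = np.polyfit(ratio, pct, 1)      # calibration line
estimate = slope * 0.35 + intercept               # % cristobalite in an unknown
```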

  2. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  3. Fast and Accurate Radiative Transfer Calculations Using Principal Component Analysis for Climate Modeling

    NASA Astrophysics Data System (ADS)

    Kopparla, P.; Natraj, V.; Spurr, R. J. D.; Shia, R. L.; Yung, Y. L.

    2014-12-01

    Radiative transfer (RT) computations are an essential component of energy budget calculations in climate models. However, full treatment of RT processes is computationally expensive, prompting usage of 2-stream approximations in operational climate models. This simplification introduces errors of the order of 10% in the top of the atmosphere (TOA) fluxes [Randles et al., 2013]. Natraj et al. [2005, 2010] and Spurr and Natraj [2013] demonstrated the ability of a technique using principal component analysis (PCA) to speed up RT simulations. In the PCA method for RT performance enhancement, empirical orthogonal functions are developed for binned sets of inherent optical properties that possess some redundancy; costly multiple-scattering RT calculations are only done for those (few) optical states corresponding to the most important principal components, and correction factors are applied to approximate radiation fields. Here, we extend the PCA method to a broadband spectral region from the ultraviolet to the shortwave infrared (0.3-3 micron), accounting for major gas absorptions in this region. Comparisons between the new model, called the Universal Principal Component Analysis model for Radiative Transfer (UPCART), 2-stream models (such as those used in climate applications), and line-by-line RT models are performed for spectral radiances, spectral fluxes, and broadband fluxes, each calculated at the TOA for several scenarios with varying aerosol types, extinction and scattering optical depth profiles, and solar and viewing geometries. We demonstrate that very accurate radiative forcing estimates can be obtained, with better than 1% accuracy in all spectral regions and better than 0.1% in most cases, as compared to an exact line-by-line RT model. The model is comparable in speed to 2-stream models, potentially rendering UPCART useful for operational General Circulation Models (GCMs). The operational speed and accuracy of UPCART can be further
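    The PCA speed-up rests on the redundancy of binned optical-property profiles: most profiles are near-linear combinations of a few empirical orthogonal functions, so the costly multiple-scattering RT need only be run for those few principal states. A toy sketch with synthetic profiles (the dimensions and rank are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
modes = rng.normal(size=(3, 40))        # 3 underlying vertical modes
amps = rng.normal(size=(500, 3))
X = amps @ modes                        # 500 highly redundant profiles

mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3                                   # leading EOFs retained
X_hat = mean + (U[:, :k] * s[:k]) @ Vt[:k]   # rank-k reconstruction
```

    In UPCART-style usage, exact RT would be evaluated only for the k principal optical states, with correction factors mapping the results back to every spectral point.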

  4. Accurate means of detecting and characterizing abnormal patterns of ventricular activation by phase image analysis

    SciTech Connect

    Botvinick, E.H.; Frais, M.A.; Shosa, D.W.; O'Connell, J.W.; Pacheco-Alvarez, J.A.; Scheinman, M.; Hattner, R.S.; Morady, F.; Faulkner, D.B.

    1982-08-01

    The ability of scintigraphic phase image analysis to characterize patterns of abnormal ventricular activation was investigated. The pattern of phase distribution and sequential phase changes over both right and left ventricular regions of interest were evaluated in 16 patients with normal electrical activation and wall motion and compared with those in 8 patients with an artificial pacemaker and 4 patients with sinus rhythm with the Wolff-Parkinson-White syndrome and delta waves. Normally, the site of earliest phase angle was seen at the base of the interventricular septum, with sequential change affecting the body of the septum and the cardiac apex and then spreading laterally to involve the body of both ventricles. The site of earliest phase angle was located at the apex of the right ventricle in seven patients with a right ventricular endocardial pacemaker and on the lateral left ventricular wall in one patient with a left ventricular epicardial pacemaker. In each case the site corresponded exactly to the position of the pacing electrode as seen on posteroanterior and left lateral chest X-ray films, and sequential phase changes spread from the initial focus to affect both ventricles. In each of the patients with the Wolff-Parkinson-White syndrome, the site of earliest ventricular phase angle was located, and it corresponded exactly to the site of the bypass tract as determined by endocardial mapping. In this way, four bypass pathways, two posterior left paraseptal, one left lateral and one right lateral, were correctly localized scintigraphically. On the basis of the sequence of mechanical contraction, phase image analysis provides an accurate noninvasive method of detecting abnormal foci of ventricular activation.

  5. Fast and Accurate Radiative Transfer Calculations Using Principal Component Analysis for (Exo-)Planetary Retrieval Models

    NASA Astrophysics Data System (ADS)

    Kopparla, P.; Natraj, V.; Shia, R. L.; Spurr, R. J. D.; Crisp, D.; Yung, Y. L.

    2015-12-01

    Radiative transfer (RT) computations form the engine of atmospheric retrieval codes. However, full treatment of RT processes is computationally expensive, prompting usage of two-stream approximations in current exoplanetary atmospheric retrieval codes [Line et al., 2013]. Natraj et al. [2005, 2010] and Spurr and Natraj [2013] demonstrated the ability of a technique using principal component analysis (PCA) to speed up RT computations. In the PCA method for RT performance enhancement, empirical orthogonal functions are developed for binned sets of inherent optical properties that possess some redundancy; costly multiple-scattering RT calculations are only done for those few optical states corresponding to the most important principal components, and correction factors are applied to approximate radiation fields. Kopparla et al. [2015, in preparation] extended the PCA method to a broadband spectral region from the ultraviolet to the shortwave infrared (0.3-3 micron), accounting for major gas absorptions in this region. Here, we apply the PCA method to some typical (exo-)planetary retrieval problems. Comparisons between the new model, called the Universal Principal Component Analysis Radiative Transfer (UPCART) model, two-stream models and line-by-line RT models are performed for spectral radiances, spectral fluxes and broadband fluxes. Each of these is calculated at the top of the atmosphere for several scenarios with varying aerosol types, extinction and scattering optical depth profiles, and stellar and viewing geometries. We demonstrate that very accurate radiance and flux estimates can be obtained, with better than 1% accuracy in all spectral regions and better than 0.1% in most cases, as compared to a numerically exact line-by-line RT model. The accuracy is enhanced when the results are convolved to typical instrument resolutions. The operational speed and accuracy of UPCART can be further improved by optimizing binning schemes and parallelizing the code.

  6. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits

    PubMed Central

    Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-01-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) model in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI’s Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857
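A highly simplified sketch of the kind of interaction test described above (not the authors' MFRG implementation): project each gene's variant genotypes onto a basis function of genomic position, then regress the phenotype on the resulting scores and their product. The simulated data, the cosine basis, and the effect sizes are all assumptions.

```python
import numpy as np

# Simulate two genes with variants at known genomic positions, build one
# functional score per gene, and recover a simulated gene-gene interaction.
rng = np.random.default_rng(1)
n, p1, p2 = 400, 10, 8                     # subjects, variants per gene
pos1 = np.sort(rng.uniform(0, 1, p1))      # scaled genomic positions
pos2 = np.sort(rng.uniform(0, 1, p2))
g1 = rng.binomial(2, 0.3, (n, p1)).astype(float)   # 0/1/2 genotype counts
g2 = rng.binomial(2, 0.3, (n, p2)).astype(float)

def leading_score(geno, pos):
    # Project genotype profiles onto the first cosine basis function of
    # position, then center to reduce collinearity with the product term.
    b = np.cos(np.pi * pos)
    s = geno @ b
    return s - s.mean()

s1, s2 = leading_score(g1, pos1), leading_score(g2, pos2)
inter = s1 * s2                            # functional interaction term
y = 0.5 * s1 + 0.5 * s2 + 1.0 * inter + 0.1 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), s1, s2, inter])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated interaction effect: {beta[3]:.2f}")   # simulated truth: 1.0
```

In the full method the test uses many basis functions per gene and an F-type statistic over all product terms; this sketch keeps only the leading term to show the structure of the design matrix.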

  7. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) model in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  8. Quantitative paleobathymetric analysis from subsidence data: example from middle Ordovician of Michigan basin

    SciTech Connect

    Howell, P.D.; Budai, J.M.

    1989-03-01

    Quantitative paleobathymetry is difficult to determine for any rock sequence with a significant subtidal component. Water depth estimates are traditionally obtained from detailed sedimentology and paleontology, but this type of data is seldom available in subsurface work. Further, a good geological data base may be inconclusive for paleobathymetry in subtidal or substorm-wave base environments. More accurate facies prediction would be possible if paleobathymetry could be determined from the conventional subsurface data normally available to explorationists. Subsidence analysis of sedimentary basins has the potential to provide precise paleobathymetric estimates for a variety of depositional settings. This technique is illustrated using the Middle Ordovician carbonates of the Michigan basin. Tectonic subsidence patterns established from stratigraphic and subsidence modeling of the Lower-Middle Ordovician Prairie du Chien Group in Michigan are projected forward through the Middle Ordovician. Isopach thicknesses of the Black River and Trenton carbonates are superimposed on the tectonic subsidence patterns to provide a quantitative basin-fill model. The model paleobathymetry is then compared with core data from exploration wells to evaluate the model facies interpretation. An excellent fit is achieved for the shallow to deep subtidal platform and basinal Trenton carbonates. This technique allows paleobathymetry to be calculated in many basins where tectonic subsidence patterns can be accurately modeled.
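The basin-fill logic described above reduces to simple arithmetic: the paleo water depth at each time step is the projected tectonic subsidence minus the cumulative sediment fill. The numbers below are hypothetical, not Michigan basin values.

```python
# Cumulative tectonic subsidence projected from subsidence modeling, and
# cumulative sediment thickness from isopach maps, at three time steps.
tectonic_subsidence_m = [120.0, 260.0, 410.0]
sediment_fill_m       = [100.0, 200.0, 330.0]

# Whatever subsidence the sediment did not fill remained as water depth.
paleo_depth_m = [s - f for s, f in zip(tectonic_subsidence_m, sediment_fill_m)]
print(paleo_depth_m)   # water depth remaining after fill at each step
```

Real applications additionally correct for sediment compaction and loading, which this sketch omits.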

  9. Digital photogrammetry for quantitative wear analysis of retrieved TKA components.

    PubMed

    Grochowsky, J C; Alaways, L W; Siskey, R; Most, E; Kurtz, S M

    2006-11-01

    The use of new materials in knee arthroplasty demands a way in which to accurately quantify wear in retrieved components. Methods such as damage scoring, coordinate measurement, and in vivo wear analysis have been used in the past. The limitations of these methods illustrate a need for a different methodology that can accurately quantify wear, is relatively easy to perform, and uses a minimal amount of expensive equipment. Off-the-shelf digital photogrammetry represents a potentially quick and easy alternative that uses readily available equipment. Eighty tibial inserts were visually examined for front and backside wear and digitally photographed in the presence of two calibrated reference fields. All images were segmented (via manual and automated algorithms) using Adobe Photoshop and National Institute of Health ImageJ. Finally, wear was determined using ImageJ and Rhinoceros software. The absolute accuracy of the method and repeatability/reproducibility by different observers were measured in order to determine the uncertainty of wear measurements. To determine if variation in wear measurements was due to implant design, 35 implants of the three most prevalent designs were subjected to retrieval analysis. The overall accuracy of area measurements was 97.8%. The error in automated segmentation was found to be significantly lower than that of manual segmentation. The photogrammetry method was found to be reasonably accurate and repeatable in measuring 2-D areas and applicable to determining wear. There was no significant variation in uncertainty detected among different implant designs. Photogrammetry has a broad range of applicability since it is size- and design-independent. A minimal amount of off-the-shelf equipment is needed for the procedure and no proprietary knowledge of the implant is needed. PMID:16649169
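The calibrated-reference idea at the core of such photogrammetry reduces to a scale factor between pixels and physical area; the pixel counts and reference size below are hypothetical.

```python
# A reference field of known area photographed in the same plane as the
# component gives the pixel-to-area conversion for the segmented region.
ref_area_mm2 = 100.0        # known area of the calibration reference
ref_pixels = 40000          # pixels the reference occupies in the image
wear_pixels = 13500         # pixels in the segmented wear region

mm2_per_pixel = ref_area_mm2 / ref_pixels
wear_area_mm2 = wear_pixels * mm2_per_pixel
print(f"wear area: {wear_area_mm2:.2f} mm^2")
```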

  10. Quantitative dual-probe microdialysis: mathematical model and analysis.

    PubMed

    Chen, Kevin C; Höistad, Malin; Kehr, Jan; Fuxe, Kjell; Nicholson, Charles

    2002-04-01

    Steady-state microdialysis is a widely used technique to monitor the concentration changes and distributions of substances in tissues. To obtain more information about brain tissue properties from microdialysis, a dual-probe approach was applied to infuse and sample the radiotracer, [3H]mannitol, simultaneously both in agar gel and in the rat striatum. Because the molecules released by one probe and collected by the other must diffuse through the interstitial space, the concentration profile exhibits dynamic behavior that permits the assessment of the diffusion characteristics in the brain extracellular space and the clearance characteristics. In this paper a mathematical model for dual-probe microdialysis was developed to study brain interstitial diffusion and clearance processes. Theoretical expressions for the spatial distribution of the infused tracer in the brain extracellular space and the temporal concentration at the probe outlet were derived. A fitting program was developed using the simplex algorithm, which finds local minima of the standard deviations between experiments and theory by adjusting the relevant parameters. The theoretical curves accurately fitted the experimental data and generated realistic diffusion parameters, implying that the mathematical model is capable of predicting the interstitial diffusion behavior of [3H]mannitol and that it will be a valuable quantitative tool in dual-probe microdialysis. PMID:12067242
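A hedged sketch of a steady-state point-source model of the kind used in dual-probe microdialysis analysis: infusion at rate Q into tissue with extracellular volume fraction alpha, effective diffusivity D*, and first-order clearance k' gives C(r) = Q exp(-r/lambda) / (4 pi alpha D* r), with lambda = sqrt(D*/k'). This is the standard point-source form, not necessarily the exact model of this paper, and all parameter values below are illustrative rather than fitted values.

```python
import math

Q = 1e-13          # mol/s infused from the source probe
alpha = 0.2        # extracellular volume fraction
D_star = 4e-10     # m^2/s effective diffusion coefficient
k = 1e-4           # 1/s first-order clearance rate
lam = math.sqrt(D_star / k)   # characteristic decay length

def conc(r):
    # Steady-state extracellular concentration at distance r from the source
    return Q * math.exp(-r / lam) / (4 * math.pi * alpha * D_star * r)

# Concentration falls off with probe separation:
for r in (0.5e-3, 1.0e-3):
    print(f"r = {r*1e3:.1f} mm: C = {conc(r):.3e} mol/m^3")
```

Fitting curves of this shape to concentrations collected at the second probe is what yields the diffusion and clearance parameters.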

  11. Quantitative vibration analysis using a single fringe pattern in time-average speckle interferometry.

    PubMed

    Deepan, B; Quan, C; Tay, C J

    2016-08-01

    In this paper, a novel technique for quantitative vibration analysis using time-average electronic speckle pattern interferometry is proposed. An amplitude-varied time-average refreshing reference frame method is used to capture a fringe pattern with a better fringe contrast than the conventional reference frame technique. The recorded fringe patterns with improved contrast provide better mode shape visibility and are easier to process. A derivative-based regularized phase tracker model is used to retrieve vibration amplitudes from a single fringe pattern. The method does not require a phase shifter to obtain the mode shape or amplitude. The method provides unwrapped amplitude and amplitude derivatives maps directly, so a separate phase unwrapping process is not required. Experimental work is carried out using a circular aluminum plate test specimen and the results are compared with a finite element method modal analysis. Both experimental and numerical results show that the proposed method is robust and accurate. PMID:27505366
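For background, the fringe function underlying time-average speckle interferometry is the squared zeroth-order Bessel function of the vibration amplitude, |J0(4*pi*a/lambda)|^2; amplitude maps are recovered by inverting this relation. The sketch below evaluates it by direct numerical integration; the wavelength and amplitudes are assumed values, not the paper's experimental settings.

```python
import math

def j0(x, n=2000):
    # J0(x) = (1/pi) * integral_0^pi cos(x sin t) dt, midpoint rule
    h = math.pi / n
    return sum(math.cos(x * math.sin((i + 0.5) * h)) for i in range(n)) * h / math.pi

lam = 632.8e-9                    # He-Ne wavelength, a common choice
for a in (0.0, 50e-9, 100e-9):    # out-of-plane vibration amplitudes
    arg = 4 * math.pi * a / lam
    print(f"a = {a*1e9:5.1f} nm: fringe intensity factor = {j0(arg)**2:.3f}")
```

The factor is 1 at the nodes (zero amplitude) and decays through successive dark fringes as amplitude grows, which is why fringe contrast enhancement matters for reading off higher-order fringes.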

  12. Multivariate calibration applied to the quantitative analysis of infrared spectra

    NASA Astrophysics Data System (ADS)

    Haaland, David M.

    1992-03-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the noninvasive determination of glucose levels in diabetics is an ultimate goal of this research.
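A minimal principal component regression (PCR) sketch in the spirit of the calibration methods described above, using synthetic single-analyte spectra; the band shape, noise level, and number of retained components are assumptions for illustration.

```python
import numpy as np

# Build synthetic calibration spectra: one Gaussian absorption band whose
# strength scales with concentration, plus measurement noise.
rng = np.random.default_rng(2)
n_samples, n_channels = 30, 200
conc = rng.uniform(0, 1, n_samples)                # analyte concentrations
pure = np.exp(-0.5 * ((np.arange(n_channels) - 80) / 10.0) ** 2)
spectra = np.outer(conc, pure) \
          + 0.01 * rng.standard_normal((n_samples, n_channels))

# PCR: compress mean-centered spectra to a few principal components,
# then regress concentration on the PC scores.
Xc = spectra - spectra.mean(axis=0)
yc = conc - conc.mean()
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
scores = U[:, :k] * S[:k]
b = np.linalg.lstsq(scores, yc, rcond=None)[0]
pred = scores @ b + conc.mean()
rmse = float(np.sqrt(np.mean((pred - conc) ** 2)))
print(f"calibration RMSE: {rmse:.4f}")
```

PLS differs in that the compression step also uses the concentration values when choosing the factors, which usually needs fewer components for the same accuracy.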

  13. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  14. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGESBeta

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  15. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  16. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  17. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  18. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604
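A toy version of the "hub" indicator mentioned above, computed as degree centrality on a small hypothetical discipline network; the edge list is an invented illustration, not data from the PNAS corpus.

```python
# Undirected discipline network: each edge is an interdisciplinary link.
edges = [("applied math", "physics"), ("applied math", "biology"),
         ("applied math", "economics"), ("applied math", "computer science"),
         ("physics", "computer science")]

# Degree centrality: count how many links touch each discipline.
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

hub = max(degree, key=degree.get)
print(hub, degree[hub])
```

Network studies of this kind typically also report betweenness and other centralities, but degree alone already flags the hub in this toy graph.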

  19. A method for quantitative wet chemical analysis of urinary calculi.

    PubMed

    Larsson, L; Sörbo, B; Tiselius, H G; Ohman, S

    1984-06-27

    We describe a simple method for quantitative chemical analysis of urinary calculi requiring no specialized equipment. Pulverized calculi are dried over silica gel at room temperature and dissolved in nitric acid, which was the only effective agent for complete dissolution. Calcium, magnesium, ammonium, and phosphate are then determined by conventional methods. Oxalate is determined by a method based on the quenching action of oxalate on the fluorescence of a zirconium-flavonol complex. Uric acid, when treated with nitric acid, is stoichiometrically converted to alloxan, which is determined fluorimetrically with 1,2-phenylenediamine. Similarly, cystine is oxidized by nitric acid to sulfate, which is determined turbidimetrically as barium sulfate. Protein is determined spectrophotometrically as xanthoprotein. The total mass recovery of authentic calculi was 92.2 +/- 6.7 (SD) per cent. The method permits analysis of calculi as small as 1.0 mg. Internal quality control is performed with specially designed control samples. PMID:6086179

  20. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    PubMed

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Because online monitoring of transformer dissolved gases by chromatography requires a carrier gas and regular calibration and has safety drawbacks, we attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared (FTIR) spectroscopy. Taking into account the small amounts of the characteristic gases, the number of components, the detection limits and safety requirements, and the difficulty for the degasser of excluding interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and a feature-variable extraction algorithm using improved TR regularization. For the characteristic gases CH4, C2H6, C2H6, and CO2, the results show that FTIR meets the DGA requirements with a spectral resolution of 1 cm(-1) and an optical path of 10 cm. PMID:24369641

  1. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  2. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have had catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After establishing the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean tectonic context. With our results, we suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with adjacent parts of the Nubia-Eurasia boundary, is its extended
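One of the morphometric indices mentioned above, the hypsometric integral, is commonly approximated by the elevation-relief ratio (mean - min) / (max - min); the catchment elevations below are hypothetical.

```python
# Sampled elevations (m) from a hypothetical catchment DEM.
elev = [220.0, 260.0, 300.0, 380.0, 520.0, 640.0]

# Elevation-relief ratio approximation of the hypsometric integral.
hi = (sum(elev) / len(elev) - min(elev)) / (max(elev) - min(elev))
print(f"hypsometric integral: {hi:.2f}")
```

Higher values indicate that more of the landmass sits at high elevation, which is often read as youthful, actively uplifting relief; lower values suggest a more eroded, mature landscape.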

  3. Quantitative analysis of intact apolipoproteins in human HDL by top-down differential mass spectrometry

    PubMed Central

    Mazur, Matthew T.; Cardasis, Helene L.; Spellman, Daniel S.; Liaw, Andy; Yates, Nathan A.; Hendrickson, Ronald C.

    2010-01-01

    Top-down mass spectrometry holds tremendous potential for the characterization and quantification of intact proteins, including individual protein isoforms and specific posttranslationally modified forms. This technique does not require antibody reagents and thus offers a rapid path for assay development with increased specificity based on the amino acid sequence. Top-down MS is efficient whereby intact protein mass measurement, purification by mass separation, dissociation, and measurement of product ions with ppm mass accuracy occurs on the seconds to minutes time scale. Moreover, as the analysis is based on the accurate measurement of an intact protein, top-down mass spectrometry opens a research paradigm to perform quantitative analysis of “unknown” proteins that differ in accurate mass. As a proof of concept, we have applied differential mass spectrometry (dMS) to the top-down analysis of apolipoproteins isolated from human HDL3. The protein species at 9415.45 Da demonstrates an average fold change of 4.7 (p-value 0.017) and was identified as an O-glycosylated form of apolipoprotein C-III [NANA-(2 → 3)-Gal-β(1 → 3)-GalNAc, +656.2037 Da], a protein associated with coronary artery disease. This work demonstrates the utility of top-down dMS for quantitative analysis of intact protein mixtures and holds potential for facilitating a better understanding of HDL biology and complex biological systems at the protein level. PMID:20388904

  4. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. Most of the selected components depend mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
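    The Haralick features named above can be computed directly from a normalised co-occurrence matrix. A minimal NumPy sketch follows; the toy 4-level image, the single separation vector, and the grey-level count are hypothetical stand-ins for a delineated thyroid lobe and the 52 vectors used in the study:

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Grey-level co-occurrence matrix for one separation vector (dx, dy)."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[img[y, x], img[y2, x2]] += 1
    return P / P.sum()

def haralick(P):
    """Contrast, correlation, energy and homogeneity of a normalised GLCM."""
    levels = P.shape[0]
    i, j = np.indices((levels, levels))
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    contrast = ((i - j) ** 2 * P).sum()
    correlation = ((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j)
    energy = (P ** 2).sum()
    homogeneity = (P / (1.0 + np.abs(i - j))).sum()
    return contrast, correlation, energy, homogeneity

# Toy 4-level image standing in for a delineated lobe region.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
contrast, correlation, energy, homogeneity = haralick(glcm(img, dx=1, dy=0, levels=4))
```

    In practice one such feature vector would be computed per separation vector and per lobe before feeding the descriptors into the feature-selection step.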

  5. Trophic relationships in an estuarine environment: A quantitative fatty acid analysis signature approach

    NASA Astrophysics Data System (ADS)

    Magnone, Larisa; Bessonart, Martin; Gadea, Juan; Salhi, María

    2015-12-01

    In order to better understand the functioning of aquatic environments, it is necessary to obtain accurate diet estimations in food webs. Their description should incorporate information about energy flow and the relative importance of trophic pathways. Fatty acids have been used extensively in qualitative studies of trophic relationships in food webs. Recently, a new method to quantitatively estimate the diet of a single predator has been developed. In this study, a model of the aquatic food web of Rocha Lagoon was generated through quantitative fatty acid signature analysis (QFASA) to identify the trophic interactions among species. Biological sampling over two consecutive annual periods was comprehensive enough to cover all functional groups in the aquatic food web (except birds and mammals). Heleobia australis appears to play a central role in this estuarine ecosystem: as both a grazer and a prey item for several other species, it probably transfers a large amount of energy to upper trophic levels. Most species at Rocha Lagoon have a wide range of prey items in their diet, reflecting a complex food web, which is characteristic of extremely dynamic environments such as estuaries. QFASA is a useful model for tracing and quantitatively estimating trophic pathways among species in an estuarine food web. The results obtained in the present work are a valuable contribution to the understanding of trophic relationships in Rocha Lagoon.
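    The core of QFASA is choosing diet proportions whose weighted mix of prey fatty-acid signatures best matches the predator's signature under a Kullback-Leibler-type distance. A minimal sketch follows, with entirely hypothetical three-acid signatures and a brute-force grid over the simplex in place of the constrained optimiser used in practice:

```python
import numpy as np
from itertools import product

def kl_dist(p, q):
    """KL-type distance between two fatty-acid signatures (proportion vectors)."""
    p = np.clip(p, 1e-9, None)
    q = np.clip(q, 1e-9, None)
    return np.sum(p * np.log(p / q))

def estimate_diet(predator, prey, step=0.05):
    """Brute-force search over the simplex of diet proportions."""
    n = prey.shape[0]
    best, best_d = None, np.inf
    grid = np.arange(0.0, 1.0 + step / 2, step)
    for w in product(grid, repeat=n - 1):
        if sum(w) > 1.0:
            continue
        pi = np.array(list(w) + [1.0 - sum(w)])   # proportions sum to 1
        mix = pi @ prey                            # mixed prey signature
        d = kl_dist(predator, mix / mix.sum())
        if d < best_d:
            best, best_d = pi, d
    return best

# Hypothetical fatty-acid signatures (each row sums to 1).
prey = np.array([[0.6, 0.3, 0.1],    # e.g. a grazer such as H. australis
                 [0.2, 0.5, 0.3]])   # e.g. a second prey group
true_pi = np.array([0.7, 0.3])
predator = true_pi @ prey            # predator signature is the true mixture
pi_hat = estimate_diet(predator, prey)
```

    Real QFASA additionally applies calibration coefficients for predator fatty-acid metabolism, which this sketch omits.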

  6. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is often called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in closed form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing cost, procurement cost and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  8. Quantitative inverse modelling of a cylindrical object in the laboratory using ERT: An error analysis

    NASA Astrophysics Data System (ADS)

    Korteland, Suze-Anne; Heimovaara, Timo

    2015-03-01

    Electrical resistivity tomography (ERT) is a geophysical technique that can be used to obtain three-dimensional images of the bulk electrical conductivity of the subsurface. Because the electrical conductivity is strongly related to properties of the subsurface and to the flow of water, ERT has become a valuable visualization tool in many hydrogeological and environmental applications. In recent years, ERT has increasingly been used for quantitative characterization, which requires more detailed prior information than a conventional geophysical inversion for qualitative purposes. In addition, careful interpretation of measurement and modelling errors is critical if ERT measurements are to be used quantitatively. This paper explores the quantitative determination of the electrical conductivity distribution of a cylindrical object placed in a water bath in a laboratory-scale tank. Because of the sharp conductivity contrast between the object and the water, a standard geophysical inversion using a smoothness constraint could not reproduce this target accurately. Better results were obtained by using the ERT measurements to constrain a model describing the geometry of the system. The posterior probability distributions of the parameters describing the geometry were estimated with the Markov chain Monte Carlo method DREAM(ZS). Using the ERT measurements this way, accurate estimates of the parameters could be obtained. The information quality of the measurements was assessed by a detailed analysis of the errors. Even for the uncomplicated laboratory setup used in this paper, errors in the modelling of the shape and position of the electrodes and the shape of the domain could be identified. The results indicate that the ERT measurements have a high information content, which can be accessed through the inclusion of prior information and the consideration of measurement and modelling errors.
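    The posterior-estimation step can be illustrated with a plain random-walk Metropolis sampler standing in for DREAM(ZS). The forward model, noise level, and uniform prior below are hypothetical stand-ins for the full ERT forward solver and the cylinder-geometry parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model: apparent response as a function of a single
# geometry parameter r (stand-in for the real ERT forward computation).
def forward(r, x):
    return 1.0 + 2.0 * np.exp(-(x / r) ** 2)

x = np.linspace(0.1, 2.0, 30)
r_true, sigma = 0.5, 0.05
data = forward(r_true, x) + rng.normal(0.0, sigma, x.size)

def log_post(r):
    if not (0.05 < r < 2.0):                 # uniform prior on the parameter
        return -np.inf
    resid = data - forward(r, x)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis: a minimal stand-in for DREAM(ZS).
samples, r = [], 1.0
lp = log_post(r)
for _ in range(5000):
    prop = r + rng.normal(0.0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob. min(1, ratio)
        r, lp = prop, lp_prop
    samples.append(r)
post = np.array(samples[1000:])               # discard burn-in
```

    The posterior mean and spread of `post` then play the role of the parameter estimates and their uncertainty discussed in the abstract.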

  9. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in the cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We found that calculation of both DF and Δn in the region of interest (the fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46 ± 0.21 vs 0.09 ± 0.04 for DF; (4.7 ± 1.0)×10⁻⁴ vs (2.5 ± 0.7)×10⁻⁴ for Δn; p < 0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaques, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30 ± 0.10 vs 0.70 ± 0.08; p < 0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.

  10. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168
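    The normal/independent idea can be sketched as iteratively reweighted least squares: writing a Student-t residual distribution as a scale mixture of normals yields EM weights that shrink the influence of outliers. A minimal illustration on simulated single-QTL data follows; the weighting scheme, scale estimate, and simulated genotypes are simplified stand-ins for the full Bayesian interacting-QTL machinery of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated phenotype: mean depends on a biallelic QTL genotype (0/1/2).
n = 200
g = rng.integers(0, 3, n).astype(float)
y = 1.0 + 0.5 * g + rng.normal(0.0, 1.0, n)
y[:10] += 15.0                                 # heavy-tailed residual outliers

X = np.column_stack([np.ones(n), g])

def wls(X, y, w):
    """Weighted least squares: solve X'WX beta = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

def robust_fit(X, y, nu=4.0, iters=50):
    """Student-t residuals as a normal/independent scale mixture:
    EM weights w_i = (nu + 1) / (nu + r_i^2 / s^2) downweight outliers."""
    beta = wls(X, y, np.ones(len(y)))
    for _ in range(iters):
        r = y - X @ beta
        s2 = np.median(r ** 2) / 0.455         # crude robust scale estimate
        w = (nu + 1.0) / (nu + r ** 2 / s2)
        beta = wls(X, y, w)
    return beta

beta_ols = wls(X, y, np.ones(n))               # normal assumption
beta_rob = robust_fit(X, y)                    # robust t-based fit
```

    With 5% gross outliers the ordinary fit is pulled away from the true intercept and QTL effect, while the reweighted fit recovers them, mirroring the increased power the paper reports for non-normal phenotypes.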

  11. Quantitative analysis of gene function in the Drosophila embryo.

    PubMed Central

    Tracey, W D; Ning, X; Klingler, M; Kramer, S G; Gergen, J P

    2000-01-01

    The specific functions of gene products frequently depend on the developmental context in which they are expressed. Thus, studies on gene function will benefit from systems that allow for manipulation of gene expression within model systems where the developmental context is well defined. Here we describe a system that allows for genetically controlled overexpression of any gene of interest under normal physiological conditions in the early Drosophila embryo. This regulated expression is achieved through the use of Drosophila lines that express a maternal mRNA for the yeast transcription factor GAL4. Embryos derived from females that express GAL4 maternally activate GAL4-dependent UAS transgenes at uniform levels throughout the embryo during the blastoderm stage of embryogenesis. The expression levels can be quantitatively manipulated through the use of lines that have different levels of maternal GAL4 activity. Specific phenotypes are produced by expression of a number of different developmental regulators with this system, including genes that normally do not function during Drosophila embryogenesis. Analysis of the response to overexpression of runt provides evidence that this pair-rule segmentation gene has a direct role in repressing transcription of the segment-polarity gene engrailed. The maternal GAL4 system will have applications both for the measurement of gene activity in reverse genetic experiments as well as for the identification of genetic factors that have quantitative effects on gene function in vivo. PMID:10628987

  12. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data are modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty encoding neighborhood correlations of the tissue mixtures. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
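    The mixture-based idea can be illustrated with plain EM for a one-dimensional Gaussian mixture of three tissue classes. This sketch deliberately omits the MRF penalty and partial-volume model described above, and the intensity values are simulated, not real MR data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated T1-like intensities for three tissue classes (e.g. CSF, GM, WM).
means_true = np.array([30.0, 70.0, 110.0])
data = np.concatenate([rng.normal(m, 8.0, 400) for m in means_true])

def em_gmm(x, k, iters=100):
    """Plain EM for a 1-D Gaussian mixture: a simplified stand-in for the
    MAP segmentation (no MRF prior, no partial-volume model)."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread-out initial means
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior class responsibilities per voxel intensity.
        r = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means, and variances.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi, r

mu, var, pi, resp = em_gmm(data, 3)
labels = resp.argmax(axis=1)      # hard segmentation; class volumes = counts
```

    Tissue volumes then follow from the per-class responsibility sums, which is the role the mixture-based segmentation plays in the quantitative MS analysis.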

  13. Quantitative PCR analysis of laryngeal muscle fiber types

    PubMed Central

    Van Daele, Douglas J.

    2013-01-01

    Voice and swallowing dysfunction as a result of recurrent laryngeal nerve paralysis can be improved with vocal fold injections or laryngeal framework surgery. However, denervation atrophy can cause late-term clinical failure. A major determinant of skeletal muscle physiology is myosin heavy chain (MyHC) expression, and previous protein analyses have shown changes in laryngeal muscle fiber MyHC isoforms with denervation. RNA analyses in this setting have not been performed, and understanding RNA levels will allow interventions better designed to reverse processes such as denervation in the future. Total RNA was extracted from bilateral thyroarytenoid (TA), posterior cricoarytenoid (PCA), and cricothyroid (CT) muscles in rats. Primers were designed using published MyHC isoform sequences. SYBR Green real-time reverse transcription-polymerase chain reaction (SYBR-RT-PCR) was used for quantification. The electropherogram showed a clear separation of total RNA into 28S and 18S subunits. Melting curves showed single peaks for all MyHC primer sets. All MyHC isoforms were identified in all muscles with varying degrees of expression. Quantitative PCR is a sensitive method to detect MyHC isoforms in laryngeal muscle. Isoform expression measured by mRNA analysis was similar to previous protein analyses but showed some important differences. This technique can be used to quantitatively assess the response to interventions targeted at maintaining muscle bulk after denervation. PMID:20430402
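    Relative quantification from SYBR Green qPCR data is commonly done with the Livak 2^-ΔΔCt method, which assumes near-100% amplification efficiency. The abstract does not state which quantification method was used, so the Ct values below are purely illustrative:

```python
def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Livak 2^-ddCt relative quantification (assumes ~100% efficiency).

    ct_target / ct_ref: Ct of the gene of interest and the reference gene
    in the test sample; *_cal: the same pair in the calibrator sample."""
    d_ct = ct_target - ct_ref                  # normalise to reference gene
    d_ct_cal = ct_target_cal - ct_ref_cal      # calibrator sample
    return 2.0 ** -(d_ct - d_ct_cal)

# Hypothetical Ct values: a MyHC isoform in a denervated TA muscle sample
# versus a control (calibrator) sample.
fold = relative_expression(ct_target=24.0, ct_ref=18.0,
                           ct_target_cal=26.0, ct_ref_cal=18.5)
```

    Here ΔΔCt = (24.0 − 18.0) − (26.0 − 18.5) = −1.5, giving a fold change of 2^1.5 ≈ 2.8 relative to the calibrator.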

  14. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe biothermophotonic phenomena in multi-layered hard tissue structures. To test the theory, photothermal radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from the teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed a clear change even after 1 min of gel treatment. From the fits, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  15. Analysis of generalized interictal discharges using quantitative EEG.

    PubMed

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities involving the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained with a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min, including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%) and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  16. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. PMID:26833935
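    The unfold-and-normalise step described above can be sketched in a few lines of NumPy. The per-spectrum minimum subtraction below is a crude stand-in for the background-removal step, and the image dimensions are arbitrary:

```python
import numpy as np

def combine_raman_images(images):
    """Unfold a list of (rows, cols, wavenumbers) Raman images into one
    (pixels, wavenumbers) matrix, removing a crude baseline per spectrum
    and normalising each spectrum so the images become comparable."""
    unfolded = []
    for img in images:
        spectra = img.reshape(-1, img.shape[-1]).astype(float)
        spectra -= spectra.min(axis=1, keepdims=True)      # baseline offset
        norms = np.linalg.norm(spectra, axis=1, keepdims=True)
        norms[norms == 0] = 1.0                            # guard empty pixels
        unfolded.append(spectra / norms)
    return np.vstack(unfolded)

# Four synthetic 8x8 images with 100 wavenumber channels and differing offsets,
# standing in for the four control / four treated cell images in the study.
rng = np.random.default_rng(0)
imgs = [rng.random((8, 8, 100)) + k for k in range(4)]
D = combine_raman_images(imgs)
```

    The combined matrix `D` is then the single dataset on which multivariate image analysis (and, e.g., k-means clustering) can be run across all images at once.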

  17. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
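    The mean-square-error criterion can also be estimated empirically for any interpolant. The paper's analysis is carried out in the frequency domain; the sketch below is the direct spatial-domain analogue, comparing linear and nearest-neighbour reconstruction of a hypothetical band-limited test function:

```python
import numpy as np

def reconstruction_mse(f, interpolant, n_samples, n_eval=10_000):
    """Empirical mean square error between f and its reconstruction
    from n_samples uniform samples on [0, 1]."""
    xs = np.linspace(0.0, 1.0, n_samples)       # sample locations
    xe = np.linspace(0.0, 1.0, n_eval)          # dense evaluation grid
    return np.mean((f(xe) - interpolant(xe, xs, f(xs))) ** 2)

def linear_interp(x, xs, ys):
    return np.interp(x, xs, ys)

def nearest_interp(x, xs, ys):
    idx = np.abs(x[:, None] - xs[None, :]).argmin(axis=1)
    return ys[idx]

f = lambda x: np.sin(2 * np.pi * 3 * x)         # band-limited test function
mse_lin = reconstruction_mse(f, linear_interp, 64)
mse_near = reconstruction_mse(f, nearest_interp, 64)
```

    Ranking interpolants by such an error measure is exactly the kind of rational basis for selecting an optimal interpolant that the frequency-domain formulation provides analytically.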

  18. Quantitative analysis of chromosome in situ hybridization signal in paraffin-embedded tissue sections.

    PubMed

    Dhingra, K; Sneige, N; Pandita, T K; Johnston, D A; Lee, J S; Emami, K; Hortobagyi, G N; Hittelman, W N

    1994-06-01

    Interphase cytogenetic analysis using chromosome-specific probes is increasingly being used to detect chromosomal aberrations on paraffin-embedded tissue sections. However, quantitative analysis of the hybridization signal is confounded by the nuclear slicing that occurs during sectioning. To determine the sensitivity and accuracy of chromosome in situ hybridization for detecting numerical chromosomal aberrations on paraffin-embedded sections, in situ hybridization was performed on sections derived from mixtures of cell populations with known frequencies of numerical chromosomal aberrations and the Chromosome Index (CI) was calculated (i.e., total number of signal spots/number of nuclei counted) as a quantitative measure of chromosome copy number. The presence of 25% or more monosomic or tetrasomic cells in a given population was easily detected as a change in CI (P < 0.05). Lower degrees of polysomy could be detected as a small percentage of nuclear fragments with > 2 signal spots. The CI was not significantly influenced by a change in section thickness from 4 to 8 μm, by an increase in cell size from 478 to 986 μm³, or by the choice of detection method (fluorescence vs. conventional bright-field microscopy). Comparative analysis of touch preparations and tissue sections from the corresponding breast tumors showed that CI accurately reflects the average copy number of chromosomes in intact nuclei and may actually be superior to in situ hybridization on whole nuclei for the detection of numerical chromosomal changes in defined histologic areas. This method is thus a sensitive and accurate means of studying genetic changes in premalignant and malignant tissue, and of assessing the genetic changes associated with specific phenotypes. PMID:7924678
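    The Chromosome Index defined above is a simple ratio. A toy calculation with hypothetical per-nucleus spot counts shows how nuclear slicing depresses the CI below the true copy number while still cleanly separating disomic from tetrasomic populations:

```python
def chromosome_index(spot_counts):
    """CI = total number of hybridization signal spots / nuclei counted."""
    return sum(spot_counts) / len(spot_counts)

# Hypothetical spot counts per sectioned nucleus: slicing removes some
# signals, so counts fall below the true chromosome copy number.
disomic    = [2] * 75 + [1] * 25            # true copy number 2
tetrasomic = [4] * 60 + [3] * 25 + [2] * 15 # true copy number 4

ci_normal = chromosome_index(disomic)        # 1.75, below 2 due to slicing
ci_tetra  = chromosome_index(tetrasomic)     # 3.45, below 4 due to slicing
```

    Comparing such CI values between a test population and a reference with known copy number is what allows the 25%-admixture detection threshold reported above.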

  19. An accurate modeling, simulation, and analysis tool for predicting and estimating Raman LIDAR system performance

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Russo, Leonard P.; Barrett, John L.; Odhner, Jefferson E.; Egbert, Paul I.

    2007-09-01

    BAE Systems presents the results of a program to model the performance of Raman LIDAR systems for the remote detection of atmospheric gases, air-polluting hydrocarbons, chemical and biological weapons, and other molecular species of interest. Our model, which integrates remote Raman spectroscopy, 2D and 3D LADAR, and USAF atmospheric propagation codes, permits accurate determination of the performance of a Raman LIDAR system. The very high predictive accuracy of our model is due to the very accurate calculation of the differential scattering cross section for the species of interest at user-selected wavelengths. We show excellent correlation of the calculated cross-section data used in our model with experimental data obtained from both laboratory measurements and the published literature. In addition, the use of standard USAF atmospheric models provides very accurate determination of the atmospheric extinction at both the excitation and Raman-shifted wavelengths.

  20. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  1. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  2. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  3. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma rays, while for X-rays an internal standard can be used. A USGS standard rock was analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample, doped with Cs or Y, onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin, which is pressed into a pellet.

  4. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning, and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries, predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project: the populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  5. [Accounting for Expected Linkage in Biometric Analysis of Quantitative Traits].

    PubMed

    Mikhailov, M E

    2015-08-01

    The problem of accounting, in genetic estimation, for the expected linkage arising from a random disposition of loci was solved for the additive-dominant model. The Comstock-Robinson estimations of the sum of squares of dominant effects, the sum of squares of additive effects, and the average degree of dominance were modified. Wright's estimation of the number of loci controlling the variation of a quantitative trait was also modified, and its scope of application was extended. Formulas that should eliminate linkage effects, on average, were derived for these estimations. The unbiased estimations were applied to the analysis of maize data. Our results showed that the most likely cause of heterosis is dominance rather than overdominance and that the main part of the heterotic effect is provided by dozens of genes. PMID:26601496

  6. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. PMID:25990413

  7. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877
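A "variety of colors" measure can be operationalized in several ways; the sketch below (illustrative only, not the authors' actual measure) uses Shannon entropy over quantized RGB values, so that an image drawing evenly on many colors scores higher than a near-monochrome one:

```python
from collections import Counter
import math

def color_entropy(pixels, levels=8):
    """Shannon entropy (bits) of quantized RGB colors.
    Higher entropy = greater color variety."""
    step = 256 // levels
    quantized = [(r // step, g // step, b // step) for r, g, b in pixels]
    counts = Counter(quantized)
    n = len(quantized)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equally frequent colors -> exactly 2 bits of entropy
pixels = [(0, 0, 0), (255, 0, 0), (0, 255, 0), (0, 0, 255)] * 10
variety = color_entropy(pixels)  # -> 2.0
```

A medieval panel dominated by a few pigments would score low on such a measure, consistent with the trend the abstract reports.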

  8. Sensitive LC MS quantitative analysis of carbohydrates by Cs+ attachment.

    PubMed

    Rogatsky, Eduard; Jayatillake, Harsha; Goswami, Gayotri; Tomuta, Vlad; Stein, Daniel

    2005-11-01

    The development of a sensitive assay for the quantitative analysis of carbohydrates from human plasma using LC/MS/MS is described in this paper. After sample preparation, carbohydrates were cationized by Cs(+) after their separation by normal-phase liquid chromatography on an amino-based column. Cesium is capable of forming a quasi-molecular ion [M + Cs](+) with neutral carbohydrate molecules in the positive ion mode of electrospray ionization mass spectrometry. The mass spectrometer was operated in multiple reaction monitoring mode, and transitions [M + 133] → 133 were monitored (M, carbohydrate molecular weight). The new method is robust, highly sensitive, rapid, and does not require postcolumn addition or derivatization. It is useful in clinical research for measurement of carbohydrate molecules by isotope dilution assay. PMID:16182559

  9. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  10. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  11. Quantitative Analysis of 3′-Hydroxynorcotinine in Human Urine

    PubMed Central

    Upadhyaya, Pramod

    2015-01-01

    Introduction: Based on previous metabolism studies carried out in patas monkeys, we hypothesized that urinary 3′-hydroxynorcotinine could be a specific biomarker for uptake and metabolism of the carcinogen N′-nitrosonornicotine in people who use tobacco products. Methods: We developed a method for quantitation of 3′-hydroxynorcotinine in human urine. [Pyrrolidinone-13C4]3′-hydroxynorcotinine was added to urine as an internal standard; the samples were treated with β-glucuronidase, partially purified by solid-supported liquid extraction, and quantified by liquid chromatography–electrospray ionization–tandem mass spectrometry. Results: The method was accurate (average accuracy = 102%) and precise (coefficient of variation = 5.6%) in the range of measurement. 3′-Hydroxynorcotinine was detected in 48 urine samples from smokers (mean 393±287 pmol/ml urine) and 12 samples from individuals who had stopped smoking and were using the nicotine patch (mean 658±491 pmol/ml urine), but not in any of 10 samples from nonsmokers. Conclusions: Since the amounts of 3′-hydroxynorcotinine found in smokers’ urine were approximately 50 times greater than the anticipated daily dose of N′-nitrosonornicotine, we concluded that it is a metabolite of nicotine or one of its metabolites, comprising perhaps 1% of nicotine intake in smokers. Therefore, it would not be suitable as a specific biomarker for uptake and metabolism of N′-nitrosonornicotine. Since 3′-hydroxynorcotinine has never been previously reported as a constituent of human urine, further studies are required to determine its source and mode of formation. PMID:25324430

  12. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387

  13. A quantitative histological analysis of the dilated ureter of childhood.

    PubMed

    Lee, B R; Partin, A W; Epstein, J I; Quinlan, D M; Gosling, J A; Gearhart, J P

    1992-11-01

    A quantitative histological study of the dilated ureter of childhood was performed on 26 ureters. The specimens were from 15 male and 11 female patients 10 days to 12 years old (mean age 2.0 years). A color image analysis system was used to examine and compare the collagen and smooth muscle components of the muscularis layers with those of normal control ureters of similar age. In comparing primary obstructed (12) with primary refluxing (14) megaureters and control ureters (6), there was a statistically significant difference in the collagen-to-smooth-muscle ratio (p < 0.001) between the primary obstructed and primary refluxing megaureter groups. For patients with primary refluxing megaureter there was a 2-fold increase in the tissue matrix ratio of collagen to smooth muscle when compared to patients with primary obstructed megaureter. In the primary obstructed megaureters the amounts of collagen and smooth muscle were not statistically different from controls (p > 0.01). The increased tissue matrix ratio of 2.0 ± 0.35 (collagen to smooth muscle) in the refluxing megaureter group, compared to 0.78 ± 0.22 in the obstructed megaureter group and 0.52 ± 0.12 in controls, was found to be due not only to a marked increase in collagen but also to a significant decrease in the smooth muscle component of the tissue. Primary obstructed and normal control ureters had similar quantitative amounts of smooth muscle, 60 ± 5% and 61 ± 6% respectively, while refluxing megaureters had only 40 ± 5% smooth muscle. The percentage of collagen was 36 ± 5 in the obstructed megaureter group and 30 ± 5 in controls, with refluxing megaureters having 58 ± 5% collagen on analysis. Our findings emphasize the significant differences in the structural components (collagen and smooth muscle) of the dilated ureter of childhood, and provide further insight into the pathological nature of these dilated ureters and their surgical repair. PMID:1433552

  14. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. 
This contrasts with the chemical-shift-based NMR methods described above, which are best suited to such weak, fast-exchanging interactions.
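For a fast-exchange chemical-shift titration, KD is obtained by fitting the observed shift to the exact one-site binding isotherm. A minimal sketch on noise-free synthetic data, using a simple grid search over KD (the concentrations, shifts, and fitting strategy are illustrative, not taken from the abstract):

```python
import math

def fraction_bound(p_total, l_total, kd):
    """Exact bound fraction of protein for 1:1 binding (concentrations in uM)."""
    s = p_total + l_total + kd
    return (s - math.sqrt(s * s - 4.0 * p_total * l_total)) / (2.0 * p_total)

def fit_kd(p_total, ligand_points, shifts, d_free, d_bound):
    """Grid-search KD (uM) minimizing squared error of predicted shifts."""
    best_kd, best_sse = None, float("inf")
    for i in range(1, 2001):  # KD grid: 0.05 .. 100 uM in 0.05 uM steps
        kd = i * 0.05
        sse = sum((d_free + (d_bound - d_free) * fraction_bound(p_total, l, kd) - obs) ** 2
                  for l, obs in zip(ligand_points, shifts))
        if sse < best_sse:
            best_kd, best_sse = kd, sse
    return best_kd

# Synthetic titration: 10 uM protein, true KD = 10 uM, 0.5 ppm max shift change
ligands = [0, 5, 10, 20, 40, 80]
observed = [0.0 + 0.5 * fraction_bound(10.0, l, 10.0) for l in ligands]
kd_fit = fit_kd(10.0, ligands, observed, 0.0, 0.5)  # recovers ~10 uM
```

In practice a nonlinear least-squares routine would replace the grid search, and the fit would be run per residue across the titration series.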

  15. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for the prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validation of the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers, a rapidly growing subset of biomarkers, are being examined as a means to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewer bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  16. Real-time cellular analysis for quantitative detection of functional Clostridium difficile toxin in stool.

    PubMed

    Huang, Bin; Li, Haijing; Jin, Dazhi; Stratton, Charles W; Tang, Yi-Wei

    2014-04-01

    Rapid and accurate diagnosis and monitoring of Clostridium difficile infection (CDI) is critical for patient care and infection control. We briefly review current laboratory techniques for the diagnosis of CDI and identify aspects needing improvement. We also introduce a real-time cellular analysis (RTCA) assay developed for the diagnosis and monitoring of CDI, which uses electronic impedance to assess cell status. The RTCA assay uses impedance measurements to detect minute physiological changes in cells cultured on gold microelectrodes embedded in glass substrates in the bottom of microtiter wells. This assay has been adapted for quantitative detection of functional C. difficile toxin directly from stool specimens. Compared to conventional techniques and molecular assays, the RTCA assay provides a valuable tool for the diagnosis of CDI as well as for the assessment of clinical severity and for monitoring therapeutic efficacy. PMID:24649817
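RTCA instruments typically summarize the raw impedance readings as a dimensionless "cell index". A minimal sketch, assuming the normalization convention used by common commercial RTCA systems, CI = (Z - Z0) / 15 Ω (the specific normalization in this assay is not stated in the abstract, so treat the factor as an assumption); toxin-induced cell rounding then appears as a drop in cell index:

```python
def cell_index(impedance_ohm, background_ohm, factor_ohm=15.0):
    """Dimensionless cell index from measured vs. cell-free background
    impedance; clipped at zero for wells below background."""
    return max((impedance_ohm - background_ohm) / factor_ohm, 0.0)

def percent_inhibition(ci_treated, ci_control):
    """Toxin effect expressed as the percent drop in cell index."""
    return 100.0 * (1.0 - ci_treated / ci_control)

# Hypothetical readings (ohms): cell-free background 10, wells 82 and 28
ci_control = cell_index(82.0, 10.0)                    # healthy monolayer
ci_toxin = cell_index(28.0, 10.0)                      # toxin-exposed well
inhibition = percent_inhibition(ci_toxin, ci_control)  # -> 75.0 %
```

A dilution series of stool filtrate scored this way yields the quantitative toxin titer the abstract describes.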

  17. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components, as compared to standard principal component analysis (PCA), with sparse loadings, in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  18. A New 3-Dimensional Dynamic Quantitative Analysis System of Facial Motion: An Establishment and Reliability Test

    PubMed Central

    Feng, Guodong; Zhao, Yang; Tian, Xu; Gao, Zhiqiang

    2014-01-01

    This study aimed to establish a 3-dimensional dynamic quantitative facial motion analysis system, and then determine its accuracy and test-retest reliability. The system could automatically reconstruct the motion of the observational points. A standardized T-shaped rod and L-shaped rods were used to evaluate the static and dynamic accuracy of the system. Nineteen healthy volunteers were recruited to test the reliability of the system. The average static distance error was 0.19 mm, and the average angular error was 0.29°. Measurement accuracy decreased as the distance between the cameras and the object increased; a camera-to-object distance of 80 cm was considered optimal. It took only 58 seconds to perform the full facial measurement process. The average intra-class correlation coefficient was 0.973 for distance measurement and 0.794 for angular measurement. The results demonstrated that we successfully established a practical 3-dimensional dynamic quantitative analysis system that is accurate and reliable enough to meet both clinical and research needs. PMID:25390881
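Static accuracy for such a system is checked by reconstructing markers at known separations and angles on the calibration rods. A minimal sketch of the distance- and angular-error computations, with made-up reconstructed coordinates (the actual rod geometry is not specified in the abstract beyond its T and L shapes):

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle_deg(u, v):
    """Angle between two 3D vectors, in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Hypothetical reconstructed rod endpoints vs. a known 100.00 mm separation
p1, p2 = (0.0, 0.0, 0.0), (99.81, 0.0, 0.0)
dist_error = abs(distance(p1, p2) - 100.0)     # mm

# Hypothetical reconstructed L-rod arms vs. a known 90-degree angle
arm1, arm2 = (1.0, 0.0, 0.0), (0.005, 1.0, 0.0)
ang_error = abs(angle_deg(arm1, arm2) - 90.0)  # degrees
```

Averaging these errors over many rod placements gives summary figures like the 0.19 mm and 0.29° reported above.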

  19. [On the quantitative analysis of focal ischemic cerebral infarction by TTC staining].

    PubMed

    Feng, Chunyan; Fan, Xiaonong; Zhang, Chunhong; Shi, Xuemin

    2009-12-01

    It is known that ischemic cerebrovascular disease causes enormous harm to human health on account of its high morbidity and disability rates. The anticipated target is therefore to control the size of the focal ischemic cerebral infarct, which is also an important criterion for judging therapeutic efficacy. The key question is how to survey the size accurately and objectively; at the same time, the quantitative analysis of focal ischemic cerebral infarction is the pivotal issue affecting the experimental conclusions and their reliability. This paper introduces and summarizes the methods of survey and computation recently in common use, and the studies made on quantitative analysis of focal ischemic cerebral infarction by the 2,3,5-triphenyltetrazolium chloride (TTC) staining method. Also summarized are the principles of staining in the TTC method, the preparatory work, and the commonly used methods of surveying and computation. The intent of this review is to provide relevant data and suggestions for research workers. PMID:20095504

  20. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
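The kind of quantitative evaluation described above reduces, at its core, to integrating products of spectra over wavelength. A minimal sketch (illustrative only, not Semrock's actual model from the cited framework) computing relative detected signal and bleed-through for synthetic Gaussian emission spectra and an ideal band-pass filter:

```python
import math

def gaussian(center, width):
    """Synthetic normalized emission spectrum as a function of wavelength (nm)."""
    return lambda wl: math.exp(-((wl - center) / width) ** 2)

def bandpass(lo, hi):
    """Ideal band-pass filter transmission: 1 inside the band, 0 outside."""
    return lambda wl: 1.0 if lo <= wl <= hi else 0.0

def detected_signal(emission, filt, wl_min=400.0, wl_max=750.0, step=1.0):
    """Riemann-sum integral of emission x filter transmission over wavelength."""
    n = int((wl_max - wl_min) / step)
    return sum(emission(wl_min + i * step) * filt(wl_min + i * step) * step
               for i in range(n + 1))

green_dye = gaussian(520, 20)      # hypothetical fluorophore
red_dye = gaussian(610, 25)        # hypothetical second fluorophore
green_channel = bandpass(500, 550) # hypothetical emission filter

signal = detected_signal(green_dye, green_channel)
bleed = detected_signal(red_dye, green_channel)
bleed_through_ratio = bleed / signal  # small for well-separated spectra
```

A full model would also fold in the excitation source spectrum, excitation filter, dichroic, and detector response, which is precisely the multi-factor bookkeeping a tool like SearchLight automates.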

  1. Accurate and semi-automated analysis of bacterial association with mammalian cells.

    PubMed

    Murphy, C M; Paré, S; Galea, G; Simpson, J C; Smith, S G J

    2016-03-01

    To efficiently and accurately quantify the interactions of bacteria with mammalian cells, a reliable fluorescence microscopy assay was developed. Bacteria were engineered to become rapidly and stably fluorescent using Green Fluorescent Protein (GFP) expressed from an inducible Tet promoter. Upon application of the fluorescent bacteria onto a monolayer, extracellular bacteria could be discriminated from intracellular bacteria by antibody staining and microscopy. All bacteria could be detected by GFP expression. External bacteria stained orange, whereas internalised bacteria did not. Internalised bacteria could thus be discriminated from external bacteria by virtue of being green but not orange fluorescent. Image acquisition and counting of various fluorophore-stained entities were accomplished with a high-content screening platform. This allowed for semi-automated and accurate counting of intracellular and extracellular bacteria. PMID:26769557

  2. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
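The Methylation Index and binary cut-point described above are straightforward to compute. A minimal sketch with hypothetical per-CpG percent-methylation values (the gene concept is reused from the abstract; the numbers are invented):

```python
def methylation_index(cpg_percents):
    """Mean percent methylation across the CpG sites of one gene."""
    return sum(cpg_percents) / len(cpg_percents)

def sensitivity_specificity(case_indices, control_indices, cutoff=15.0):
    """Classify a sample as methylated if its index exceeds the cutoff
    (the study used >15%), then score against known case/control status."""
    tp = sum(1 for m in case_indices if m > cutoff)
    tn = sum(1 for m in control_indices if m <= cutoff)
    return tp / len(case_indices), tn / len(control_indices)

# Hypothetical per-sample CpG panels (4 CpG sites each) for one gene
cases = [methylation_index(g) for g in ([40, 55, 38, 62], [10, 12, 9, 11], [70, 66, 72, 68])]
controls = [methylation_index(g) for g in ([3, 5, 4, 6], [8, 7, 9, 10])]
sens, spec = sensitivity_specificity(cases, controls)
```

Sweeping the cutoff instead of fixing it at 15% traces out the ROC curve whose area the study reports.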

  3. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459

  4. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and to identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and, as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling): (1) limited knowledge of attenuation, which we proved was continuously evolving, (2
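The source parameters quoted above (magnitude, dimension, stress drop) follow from standard point-source relations. A minimal sketch using the Hanks-Kanamori moment magnitude and the Brune circular-crack model; the constants and the example event values are the conventional ones, not necessarily those used in the thesis:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Hanks-Kanamori: Mw = (2/3) * log10(M0) - 6.07, with M0 in N*m."""
    return (2.0 / 3.0) * math.log10(m0_newton_meters) - 6.07

def brune_radius(corner_freq_hz, shear_vel_m_s):
    """Brune source radius: r = 2.34 * beta / (2 * pi * fc)."""
    return 2.34 * shear_vel_m_s / (2.0 * math.pi * corner_freq_hz)

def stress_drop(m0_newton_meters, radius_m):
    """Circular-crack stress drop: delta_sigma = 7 * M0 / (16 * r^3), in Pa."""
    return 7.0 * m0_newton_meters / (16.0 * radius_m ** 3)

# Hypothetical laboratory-scale AE event:
# M0 = 0.04 N*m, corner frequency 430 kHz, shear velocity 3000 m/s
mw = moment_magnitude(0.04)        # about -7.0, within the quoted Mw range
r = brune_radius(430e3, 3000.0)    # source radius of a few millimetres
dsigma = stress_drop(0.04, r)      # about 1 MPa, inside the 0.1-10 MPa range
```

The fc^-3 scaling mentioned in the abstract is visible here: for constant stress drop, M0 = 16 * dsigma * r^3 / 7 with r ∝ 1/fc implies M0 ∝ fc^-3.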

  5. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  6. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    SciTech Connect

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Anderson, Kevin K.; Daly, Don S.; Smith, Richard D.

    2005-08-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for the specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/- 5 ppm and 1 ppm) and NET value (no constraint, +/- 0.05 and 0.01 on a 0-1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time represent a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison with those of the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high-resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/- 1 ppm and elution time measurements within +/- 0.01 NET.
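    The mass-plus-NET matching described above can be sketched as a simple two-tolerance check. The peptide masses and NET values below are illustrative inventions, not data from the paper:

```python
def is_unique(peptide, others, ppm_tol=1.0, net_tol=0.01):
    """True if no other peptide matches within both the accurate-mass
    tolerance (in ppm) and the normalized elution time tolerance (0-1 scale)."""
    mass, net = peptide
    for m, n in others:
        if abs(m - mass) / mass * 1e6 <= ppm_tol and abs(n - net) <= net_tol:
            return False
    return True

# Hypothetical peptides: A and B collide in both mass and NET; C is unique.
peptides = {
    "A": (1128.5254, 0.420),
    "B": (1128.5260, 0.425),
    "C": (1463.6210, 0.425),
}

results = {
    name: is_unique(mn, [v for k, v in peptides.items() if k != name])
    for name, mn in peptides.items()
}
```

    Tightening either tolerance (e.g. ppm_tol=0.1) would make A and B distinguishable again, which is the trade-off the abstract quantifies across proteomes of different sizes.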

  7. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  8. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  9. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-01

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  10. Inside single cells: quantitative analysis with advanced optics and nanomaterials.

    PubMed

    Cui, Yi; Irudayaraj, Joseph

    2015-01-01

    Single-cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites, and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single-cell activity. To obtain quantitative information (e.g., molecular quantity, kinetics, and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single-cell studies, both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live-cell analysis. Although a considerable proportion of single-cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single-cell analysis. PMID:25430077

  11. Quantitative genetic analysis of the metabolic syndrome in Hispanic children.

    PubMed

    Butte, Nancy F; Comuzzie, Anthony G; Cole, Shelley A; Mehta, Nitesh R; Cai, Guowen; Tejero, Maria; Bastarrachea, Raul; Smith, E O'Brian

    2005-12-01

    Childhood obesity is associated with a constellation of metabolic derangements including glucose intolerance, hypertension, and dyslipidemia, referred to as metabolic syndrome. The purpose of this study was to investigate genetic and environmental factors contributing to the metabolic syndrome in Hispanic children. Metabolic syndrome, defined as having three or more metabolic risk components, was determined in 1030 Hispanic children, ages 4-19 y, from 319 families enrolled in the VIVA LA FAMILIA study. Anthropometry, body composition by dual energy x-ray absorptiometry, clinical signs, and serum biochemistries were measured using standard techniques. Risk factor analysis and quantitative genetic analysis were performed. Of the overweight children, 20%, or 28% if abnormal liver function is included in the definition, presented with the metabolic syndrome. Odds ratios for the metabolic syndrome were significantly increased by body mass index z-score and fasting serum insulin; independent effects of sex, age, puberty, and body composition were not seen. Heritabilities +/- SE for waist circumference, triglycerides (TG), HDL, systolic blood pressure (SBP), glucose, and alanine aminotransferase (ALT) were highly significant. Pleiotropy (a common set of genes affecting two traits) detected between SBP and waist circumference, SBP and glucose, HDL and waist circumference, ALT and waist circumference, and TG and ALT may underlie the clustering of the components of the metabolic syndrome. Significant heritabilities and pleiotropy seen for the components of the metabolic syndrome indicate a strong genetic contribution to the metabolic syndrome in overweight Hispanic children. PMID:16306201

  12. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T. |

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue, associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  13. Quantitative analysis of noninvasive diagnostic procedures for induction motor drives

    NASA Astrophysics Data System (ADS)

    Eltabach, Mario; Antoni, Jerome; Najjar, Micheline

    2007-10-01

    This paper reports quantitative analyses of spectral fault components in five noninvasive diagnostic procedures that use input electric signals to detect different types of abnormalities in induction motors. Besides the traditional single-phase current spectrum analysis "SC", diagnostic procedures based on spectrum analysis of the instantaneous partial powers "Pab" and "Pcb", the total power "Pabc", and the current space vector modulus "csvm" are considered. The aim of this comparative study is to improve diagnosis tools for the detection of electromechanical faults in electrical machines by selecting the diagnostic procedure best suited to the known motor and fault characteristics. Defining a severity factor as the increase in amplitude of the fault characteristic frequency with respect to the healthy condition enables us to study the sensitivity of the electrical diagnostic tools. As a result, it is shown that the angular displacement between the current side-band components at frequencies (f ± fosc) is directly related to the type of induction motor fault. The total instantaneous power diagnostic procedure was observed to exhibit the highest values of the detection criterion for mechanical faults, while for electrical faults the most reliable diagnostic procedure is tightly related to the value of the motor power factor angle and the combined motor-load inertia. Finally, simulation and experimental results show good agreement with the theoretical fault-modeling results.
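    As a toy illustration of the severity factor defined above (the increase in amplitude at the fault characteristic frequency relative to the healthy condition), a hypothetical stator-current signal with side-bands at f ± fosc can be analyzed with a plain FFT. All signal parameters here are invented for the sketch:

```python
import numpy as np

fs, f, f_osc = 10_000, 50.0, 5.0        # sampling rate, supply and fault frequencies (Hz)
t = np.arange(0, 2.0, 1 / fs)           # 2 s record -> 0.5 Hz bins, side-bands fall on-bin

def amp_at(signal, freq):
    """Single-sided FFT amplitude at the bin nearest `freq`."""
    spec = 2 * np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return spec[np.argmin(np.abs(freqs - freq))]

healthy = np.sin(2 * np.pi * f * t)
faulty = healthy + 0.05 * (np.sin(2 * np.pi * (f + f_osc) * t)
                           + np.sin(2 * np.pi * (f - f_osc) * t))

# Severity factor: increase of the side-band amplitude over the healthy case.
severity = amp_at(faulty, f + f_osc) - amp_at(healthy, f + f_osc)
```

    The same comparison applied to the partial-power or space-vector-modulus signals, instead of the raw phase current, is what distinguishes the five procedures compared in the paper.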

  14. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed in as detailed and accurate a manner as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as 3D curves along the front, the back, and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness so that it fits most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
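    A minimal 2-D sketch of the elliptic Fourier coefficient computation (the classical Kuhl-Giardina formulation, which the paper extends to 3-D; function and variable names are ours, and the circle demo is purely illustrative):

```python
import numpy as np

def elliptic_fourier_coeffs(x, y, order=10):
    """Elliptic Fourier coefficients (a_n, b_n, c_n, d_n) of a closed
    2-D contour sampled at points (x, y), returned as an (order, 4) array."""
    dx = np.diff(x, append=x[0])
    dy = np.diff(y, append=y[0])
    dt = np.hypot(dx, dy)                       # chord lengths
    t = np.concatenate(([0.0], np.cumsum(dt)))  # cumulative arc length
    T = t[-1]
    coeffs = np.zeros((order, 4))
    for n in range(1, order + 1):
        w = 2 * np.pi * n / T
        const = T / (2 * n ** 2 * np.pi ** 2)
        dcos = np.cos(w * t[1:]) - np.cos(w * t[:-1])
        dsin = np.sin(w * t[1:]) - np.sin(w * t[:-1])
        coeffs[n - 1] = const * np.array([
            np.sum(dx / dt * dcos), np.sum(dx / dt * dsin),
            np.sum(dy / dt * dcos), np.sum(dy / dt * dsin)])
    return coeffs

# For a circle, the first harmonic carries essentially all the shape.
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
c = elliptic_fourier_coeffs(np.cos(theta), np.sin(theta), order=5)
```

    The multivariate statistics and allometric regressions in the paper then operate on these per-harmonic coefficients rather than on raw landmark coordinates.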

  15. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  16. Rapid inorganic ion analysis using quantitative microchip capillary electrophoresis.

    PubMed

    Vrouwe, Elwin X; Luttge, Regina; Olthuis, Wouter; van den Berg, Albert

    2006-01-13

    Rapid quantitative microchip capillary electrophoresis (CE) for online monitoring of drinking water, enabling inorganic ion separation in less than 15 s, is presented. When cationic and anionic standards were compared at different concentrations, the analysis of cationic species resulted in non-linear calibration curves. We interpret this effect as a variation in the volume of the injected sample plug caused by changes in the electroosmotic flow (EOF) due to the strong interaction of bivalent cations with the glass surface. This explanation is supported by the observation of severe peak tailing. Conducting microchip CE analysis in a glass microchannel, optimized conditions were obtained for the cationic species K+, Na+, Ca2+, and Mg2+ using a background electrolyte consisting of 30 mmol/L histidine and 2-(N-morpholino)ethanesulfonic acid, containing 0.5 mmol/L potassium chloride to reduce surface interaction and 4 mmol/L tartaric acid as a complexing agent, resulting in a pH value of 5.8. Applying reversed-EOF co-migration for the anionic species Cl-, SO42- and HCO3-, optimized separation occurs in a background electrolyte consisting of 10 mmol/L 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid (HEPES) and 10 mmol/L HEPES sodium salt, containing 0.05 mmol/L CTAB (cetyltrimethylammonium bromide), resulting in a pH value of 7.5. The detection limits are 20 micromol/L for the monovalent cationic and anionic species and 10 micromol/L for the divalent species. These values make the method very suitable for many applications, including the analysis of abundant ions in tap water as demonstrated in this paper. PMID:16310794

  17. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyzing such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  18. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searching. An overview of nanotechnology-related patents in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, and the patents were analyzed by classification. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts, and full texts separately. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements and statutory bars applicable to them. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, together with a suggested strategy for patenting the related inventions. For example, US patent US2008-019426A1 describes an invention related to a lubricant composition. This patent was studied and classified as falling under the automobile-parts classification; it was deduced that the patent solves the problem of friction in the engine. One classification is based on the "automobile part" while the other is based on the "problem being solved"; hence two classes, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix has been created

  19. Quantitative analysis of routine chemical constituents in tobacco by near-infrared spectroscopy and support vector machine.

    PubMed

    Zhang, Yong; Cong, Qian; Xie, Yunfei; Yang, Jingxiu; Zhao, Bing

    2008-12-15

    It is important to monitor the quality of tobacco during cigarette production. Therefore, in order to scientifically control the tobacco raw material and guarantee cigarette quality, fast and accurate determination of routine chemical constituents of tobacco, including total sugar, reducing sugar, nicotine, and total nitrogen, is needed. In this study, 50 samples of tobacco from different cultivation areas were surveyed by near-infrared (NIR) spectroscopy, and the spectral differences provided enough quantitative analysis information for the tobacco. Partial least squares regression (PLSR), artificial neural networks (ANN), and support vector machines (SVM) were applied. Quantitative analysis models of the 50 tobacco samples were studied comparatively using PLSR, ANN, and radial basis function (RBF) SVM regression, and the parameters of the models were also discussed. The spectral variables of the 50 samples were compressed through wavelet transformation before the models were established. The best experimental results were obtained using RBF-SVM regression with gamma = 1.5, 1.3, 0.9, and 0.1 for total sugar, reducing sugar, nicotine, and total nitrogen, respectively. Finally, compared with the back-propagation ANN (BP-ANN) and PLSR approaches, the SVM algorithm showed excellent generalization in the quantitative analysis results, even though the number of samples used to establish the model was smaller. The overall results show that NIR spectroscopy combined with SVM can be efficiently utilized for rapid and accurate analysis of routine chemical constituents in tobacco. The research can also serve as technical support and a foundation for quantitative analysis in other NIR applications. PMID:18538628

  20. Quantitative analysis of routine chemical constituents in tobacco by near-infrared spectroscopy and support vector machine

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Cong, Qian; Xie, Yunfei; Yang, Jingxiu; Zhao, Bing

    2008-12-01

    It is important to monitor the quality of tobacco during cigarette production. Therefore, in order to scientifically control the tobacco raw material and guarantee cigarette quality, fast and accurate determination of routine chemical constituents of tobacco, including total sugar, reducing sugar, nicotine, and total nitrogen, is needed. In this study, 50 samples of tobacco from different cultivation areas were surveyed by near-infrared (NIR) spectroscopy, and the spectral differences provided enough quantitative analysis information for the tobacco. Partial least squares regression (PLSR), artificial neural networks (ANN), and support vector machines (SVM) were applied. Quantitative analysis models of the 50 tobacco samples were studied comparatively using PLSR, ANN, and radial basis function (RBF) SVM regression, and the parameters of the models were also discussed. The spectral variables of the 50 samples were compressed through wavelet transformation before the models were established. The best experimental results were obtained using RBF-SVM regression with γ = 1.5, 1.3, 0.9, and 0.1 for total sugar, reducing sugar, nicotine, and total nitrogen, respectively. Finally, compared with the back-propagation ANN (BP-ANN) and PLSR approaches, the SVM algorithm showed excellent generalization in the quantitative analysis results, even though the number of samples used to establish the model was smaller. The overall results show that NIR spectroscopy combined with SVM can be efficiently utilized for rapid and accurate analysis of routine chemical constituents in tobacco. The research can also serve as technical support and a foundation for quantitative analysis in other NIR applications.
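    The RBF-kernel regression idea behind these models can be sketched in plain numpy using kernel ridge regression as a simplified stand-in for the paper's RBF-SVM (the actual work used SVR on wavelet-compressed NIR spectra; the "spectra", response, and γ value below are synthetic and illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for NIR data: 50 samples x 100 spectral variables,
# with a "total sugar" response depending nonlinearly on two bands.
X = rng.normal(size=(50, 100))
y = np.tanh(X[:, 10]) + 0.5 * X[:, 40] ** 2

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit(X, y, gamma, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

gamma = 0.01
alpha = fit(X, y, gamma)
pred = predict(X, alpha, X, gamma)   # in-sample fit should track y closely
```

    As in the paper, γ controls the width of the RBF kernel and is the main tuning parameter; in practice it would be chosen per constituent by cross-validation rather than fixed as here.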

  1. Time-Accurate Simulations and Acoustic Analysis of Slat Free-Shear-Layer. Part 2

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Singer, Bart A.; Lockard, David P.

    2002-01-01

    Unsteady computational simulations of a multi-element, high-lift configuration are performed. Emphasis is placed on accurate spatiotemporal resolution of the free shear layer in the slat-cove region. The excessive dissipative effects of the turbulence model, so prevalent in previous simulations, are circumvented by switching off the turbulence-production term in the slat cove region. The justifications and physical arguments for taking such a step are explained in detail. The removal of this excess damping allows the shear layer to amplify large-scale structures, to achieve a proper non-linear saturation state, and to permit vortex merging. The large-scale disturbances are self-excited, and unlike our prior fully turbulent simulations, no external forcing of the shear layer is required. To obtain the farfield acoustics, the Ffowcs Williams and Hawkings equation is evaluated numerically using the simulated time-accurate flow data. The present comparison between the computed and measured farfield acoustic spectra shows much better agreement for the amplitude and frequency content than past calculations. The effects of the angle of attack on the slat's flow features and radiated acoustic field are also presented.

  2. Analysis of hydraulic fracturing flowback and produced waters using accurate mass: identification of ethoxylated surfactants.

    PubMed

    Thurman, E Michael; Ferrer, Imma; Blotevogel, Jens; Borch, Thomas

    2014-10-01

    Two series of ethylene oxide (EO) surfactants, polyethylene glycols (PEGs from EO3 to EO33) and linear alkyl ethoxylates (LAEs C-9 to C-15 with EO3-EO28), were identified in hydraulic fracturing flowback and produced water using a new application of the Kendrick mass defect and liquid chromatography/quadrupole-time-of-flight mass spectrometry. The Kendrick mass defect differentiates the proton, ammonium, and sodium adducts in both singly and doubly charged forms. A structural model of adduct formation, based on a spherical cage-like conformation in which the central cation (NH4(+) or Na(+)) is coordinated with the ether oxygens, is presented, and binding constants are calculated. A major purpose of the study was the identification of the ethylene oxide (EO) surfactants and the construction of a database with accurate masses and retention times in order to unravel the mass spectral complexity of surfactant mixtures used in hydraulic fracturing fluids. For example, over 500 accurate mass assignments are made in a few seconds of computer time, which then is used as a fingerprint chromatogram of the water samples. This technique is applied to a series of flowback and produced water samples to illustrate the usefulness of ethoxylate "fingerprinting", in a first application to monitor water quality that results from fluids used in hydraulic fracturing. PMID:25164376
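    The EO-based Kendrick mass defect that groups these homologous series can be sketched as follows. The repeat-unit exact mass and the PEG test masses are computed from standard monoisotopic masses, not taken from the paper:

```python
EO = 44.02621          # exact monoisotopic mass of the C2H4O repeat unit (Da)

def kendrick_mass_defect(mz):
    """EO-scale Kendrick mass defect: identical for every member of a
    homologous ethoxylate series, which is what makes the grouping work."""
    km = mz * 44.0 / EO            # rescale so one EO unit weighs exactly 44
    return round(km) - km

# [M+H]+ ions of PEG (HO-(C2H4O)n-H): n EO units + H2O + a proton.
peg_mz = [n * EO + 18.01056 + 1.00728 for n in range(3, 9)]
kmds = [kendrick_mass_defect(mz) for mz in peg_mz]
```

    Different adducts (e.g. [M+NH4]+ or [M+Na]+) shift the whole series to a different, but again constant, defect, which is how the method separates the adduct forms in one plot.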

  3. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  4. Separation and quantitative analysis of alkyl sulfate ethoxymers by HPLC.

    PubMed

    Morvan, Julien; Hubert-Roux, Marie; Agasse, Valérie; Cardinael, Pascal; Barbot, Florence; Decock, Gautier; Bouillon, Jean-Philippe

    2008-01-01

    Separation of alkyl sulfate ethoxymers is investigated on various high-performance liquid chromatography (HPLC) stationary phases: Acclaim C18 Surfactant, Surfactant C8, and Hypercarb. For a fixed alkyl chain length, ethoxymers are eluted in order of increasing number of ethoxylated units on Acclaim C18 Surfactant, whereas a reversed elution order is observed on Surfactant C8 and Hypercarb. Moreover, on an Acclaim C18 Surfactant column, non-ethoxylated compounds are eluted within their ethoxymer distribution, and the use of a sodium acetate additive in the mobile phase leads to co-elution of the ethoxymers. HPLC stationary phases dedicated to surfactant analysis are evaluated by means of the Tanaka test. Surfactant C8 exhibits strong silanol activity whereas Acclaim C18 Surfactant shows high steric selectivity. For alkyl sulfates, linearity of the calibration curve and limits of detection and quantitation are evaluated. The amount of sodium laureth sulfate raw material found in a commercial body-care product is in agreement with the specification of the manufacturer. PMID:19007494
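The calibration figures of merit mentioned above can be sketched with a linear fit; the LOD/LOQ formulas follow the common ICH-style 3.3·s/slope and 10·s/slope convention, and the concentrations and responses are illustrative, not the paper's data:

```python
# Sketch: calibration line, residual standard deviation, and LOD/LOQ.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])        # e.g. mg/L (assumed)
area = np.array([10.2, 20.1, 40.3, 99.8, 200.5])   # detector response (assumed)

X = np.column_stack([np.ones_like(conc), conc])
(intercept, slope), *_ = np.linalg.lstsq(X, area, rcond=None)

resid = area - (intercept + slope * conc)
s = np.sqrt(resid @ resid / (len(conc) - 2))       # residual std. deviation

lod = 3.3 * s / slope    # limit of detection
loq = 10.0 * s / slope   # limit of quantitation
```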

  5. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation that results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example, the reproductive state of the male and female and the operational sex ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, occurrence of female copulation calls, sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time or energy-allocation problems. PMID:21710159
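The AIC model-selection step used here can be sketched in a few lines; the log-likelihoods and model names below are hypothetical, not the fitted values from the study:

```python
# Sketch: Akaike's information criterion, AIC = 2k - 2*lnL, where k is the
# number of fitted parameters and lnL the maximized log-likelihood.
def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

# Hypothetical candidate logistic models: one with swelling size only, one
# adding two predictors that barely improve the likelihood.
candidates = {
    "swelling_only": aic(log_likelihood=-350.2, n_params=2),
    "full_model": aic(log_likelihood=-349.8, n_params=4),
}
best = min(candidates, key=candidates.get)   # lowest AIC wins
```

The extra parameters of the fuller model are penalized, so the simpler model is preferred despite its marginally worse likelihood.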

  6. Quantitative analysis of protein dynamics during asymmetric cell division.

    PubMed

    Mayer, Bernd; Emery, Gregory; Berdnik, Daniela; Wirtz-Peitz, Frederik; Knoblich, Juergen A

    2005-10-25

    In dividing Drosophila sensory organ precursor (SOP) cells, the fate determinant Numb and its associated adaptor protein Pon localize asymmetrically and segregate into the anterior daughter cell, where Numb influences cell fate by repressing Notch signaling. Asymmetric localization of both proteins requires the protein kinase aPKC and its substrate Lethal (2) giant larvae (Lgl). Because both Numb and Pon localization require actin and myosin, lateral transport along the cell cortex has been proposed as a possible mechanism for their asymmetric distribution. Here, we use quantitative live analysis of GFP-Pon and Numb-GFP fluorescence and fluorescence recovery after photobleaching (FRAP) to characterize the dynamics of Numb and Pon localization during SOP division. We demonstrate that Numb and Pon rapidly exchange between a cytoplasmic pool and the cell cortex and that preferential recruitment from the cytoplasm is responsible for their asymmetric distribution during mitosis. Expression of a constitutively active form of aPKC impairs membrane recruitment of GFP-Pon. This defect can be rescued by coexpression of nonphosphorylatable Lgl, indicating that Lgl is the main target of aPKC. We propose that a high-affinity binding site is asymmetrically distributed by aPKC and Lgl and is responsible for asymmetric localization of cell-fate determinants during mitosis. PMID:16243032
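The FRAP analysis mentioned above amounts to fitting an exponential recovery curve; a numpy-only sketch with synthetic, noise-free data (the amplitude and time constant are invented, not the GFP-Pon measurements):

```python
# Sketch: fit F(t) = A * (1 - exp(-t/tau)) to a FRAP recovery curve by a
# coarse grid search over tau with a closed-form least-squares amplitude.
import numpy as np

t = np.linspace(0.0, 30.0, 60)
true_A, true_tau = 0.8, 5.0
F = true_A * (1.0 - np.exp(-t / true_tau))   # synthetic recovery curve

best = None
for tau in np.arange(0.5, 15.0, 0.05):
    basis = 1.0 - np.exp(-t / tau)
    A = (basis @ F) / (basis @ basis)         # optimal amplitude for this tau
    sse = float(np.sum((F - A * basis) ** 2))
    if best is None or sse < best[0]:
        best = (sse, A, tau)

_, A_fit, tau_fit = best                      # mobile fraction and time constant
```

A fast recovery time constant is what indicates rapid exchange between the cortical and cytoplasmic pools.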

  7. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488
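The reported performance figures come down to two standard ratios over a binary confusion matrix; a minimal sketch with hypothetical counts (not the study's classifications):

```python
# Sketch: sensitivity and specificity from confusion-matrix counts.
def sensitivity_specificity(tp, fn, tn, fp):
    sens = tp / (tp + fn)   # true-positive rate: cancer correctly detected
    spec = tn / (tn + fp)   # true-negative rate: normal correctly rejected
    return sens, spec

# Hypothetical tallies for illustration only.
sens, spec = sensitivity_specificity(tp=93, fn=7, tn=97, fp=3)
```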

  8. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  9. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% to 2.0% and 96.9% to 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology.

  10. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants, such as PAHs (Polycyclic Aromatic Hydrocarbons) at low concentration in aqueous medium, Surface-Enhanced Raman Scattering (SERS) offers an alternative that overcomes the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique due to the excitation of the surface plasmon modes of the nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and the identification of the target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. To this end, an innovative method to detect and to quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was carried out. Such SERS-active substrates tend towards an application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476
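The internal-reference strategy can be sketched as a normalization followed by a linear calibration; all band intensities below are made-up illustrative numbers, not the paper's spectra:

```python
# Sketch: normalize the naphthalene SERS band to the polystyrene reference
# band, calibrate linearly over 1-20 ppm, then invert for an unknown.
import numpy as np

ppm = np.array([1.0, 5.0, 10.0, 20.0])
naph_band = np.array([120.0, 610.0, 1190.0, 2420.0])   # analyte intensity
ps_band = np.array([1000.0, 1010.0, 990.0, 1005.0])    # internal reference

ratio = naph_band / ps_band                            # drift-corrected signal
X = np.column_stack([np.ones_like(ppm), ppm])
(b0, b1), *_ = np.linalg.lstsq(X, ratio, rcond=None)

def quantify(naph, ps):
    """Invert the calibration for an unknown sample."""
    return ((naph / ps) - b0) / b1

est = quantify(1200.0, 1000.0)   # should fall near 10 ppm for these numbers
```

Dividing by the polystyrene band is what cancels shot-to-shot enhancement variations before calibration.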

  11. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit-motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  12. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. The substantia nigra (SN) tissue obtained from the autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases were examined. Quantitative XRF analysis showed that neuromelanin granules of Parkinsonian SN contained higher levels of Fe than those of the control. The concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm respectively. On the contrary, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.
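One common way such a quantification code converts peak intensities to concentrations is via element-specific sensitivity factors calibrated on reference standards; the sketch below uses that generic approach with invented numbers, not the study's actual calibration or Fe/Zn measurements:

```python
# Sketch: XRF peak area to ppm via an assumed sensitivity factor
# (counts per ppm), as calibrated on a reference standard.
def ppm_from_peak(peak_area, sensitivity_counts_per_ppm):
    return peak_area / sensitivity_counts_per_ppm

# Hypothetical peak areas and a shared hypothetical sensitivity factor.
fe_ppm = ppm_from_peak(peak_area=46000.0, sensitivity_counts_per_ppm=20.0)
zn_ppm = ppm_from_peak(peak_area=800.0, sensitivity_counts_per_ppm=20.0)
```

In practice each element gets its own sensitivity factor, and matrix-absorption corrections are applied on top of this linear conversion.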

  13. Hyperspectral imaging and quantitative analysis for prostate cancer detection.

    PubMed

    Akbari, Hamed; Halig, Luma V; Schuster, David M; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T; Chen, Georgia Z; Fei, Baowei

    2012-07-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% to 2.0% and 96.9% to 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  14. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.
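The image-analysis half of such a pipeline reduces to segmenting hybridization signal and summarizing it per compartment; a toy numpy sketch (the image, seed, and threshold are all assumptions for illustration):

```python
# Sketch: estimate relative viral burden as the fraction of pixels whose
# hybridization signal exceeds a threshold (synthetic stand-in image).
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))           # stand-in for a grain-count image
threshold = 0.9                        # assumed signal cutoff

signal_mask = image > threshold
burden_fraction = signal_mask.mean()   # fraction of tissue area with signal
```

Real analyses would additionally mask the follicular dendritic cell regions and productively infected cells separately before computing per-compartment statistics.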

  15. Quantitative trait locus analysis for hemostasis and thrombosis

    PubMed Central

    Sa, Qila; Hart, Erika; Hill, Annie E.; Nadeau, Joseph H.

    2009-01-01

    Susceptibility to thrombosis varies in human populations as well as in many inbred mouse strains. The objective of this study was to characterize the genetic control of thrombotic risk on three chromosomes. Previously, utilizing a tail-bleeding/rebleeding assay as a surrogate of hemostasis and thrombosis function, three mouse chromosome substitution strains (CSS) (B6-Chr5A/J, Chr11A/J, Chr17A/J) were identified (Hmtb1, Hmtb2, Hmtb3). The tail-bleeding/rebleeding assay is widely used and distinguishes mice with genetic defects in blood clot formation or dissolution. In the present study, quantitative trait locus (QTL) analysis revealed a significant locus for rebleeding (clot stability) time (time between cessation of initial bleeding and start of the second bleeding) on chromosome 5, a suggestive locus for bleeding time (time between start of bleeding and cessation of bleeding) also on chromosome 5, and two suggestive loci for clot stability on chromosome 17 and one on chromosome 11. The three CSS and the parent A/J had elevated clot stability time. There was no interaction of genes on chromosome 11 with genes on chromosome 5 or chromosome 17. On chromosome 17, twenty-three candidate genes were identified in synteny with previously identified loci for thrombotic risk on human chromosome 18. Thus, we have identified new QTLs and candidate genes not previously known to influence thrombotic risk. PMID:18787898
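A single-marker QTL test of the kind reported can be sketched as a LOD score comparing a null model against a genotype-effect model; the genotypes and phenotypes below are toy values, not the CSS measurements:

```python
# Sketch: LOD = (n/2) * log10(RSS0/RSS1), where RSS0 is the residual sum of
# squares under the grand-mean model and RSS1 under a marker-genotype model.
import numpy as np

genotype = np.array([0, 0, 0, 0, 1, 1, 1, 1])                     # marker alleles
pheno = np.array([10.0, 11.0, 9.5, 10.5, 14.0, 15.0, 13.5, 14.5])  # e.g. rebleeding time

n = len(pheno)
rss0 = np.sum((pheno - pheno.mean()) ** 2)                # null: one mean
g0, g1 = pheno[genotype == 0], pheno[genotype == 1]
rss1 = np.sum((g0 - g0.mean()) ** 2) + np.sum((g1 - g1.mean()) ** 2)
lod = (n / 2.0) * np.log10(rss0 / rss1)
```

A genome scan repeats this at every marker and flags loci whose LOD exceeds a permutation-derived significance threshold.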

  16. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
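The k-Nearest Neighbours prediction sub-task can be sketched directly; the tiny two-channel "spectra" and concentrations below are synthetic stand-ins, not the cocaine dataset:

```python
# Sketch: k-NN regression - predict a component's concentration as the mean
# concentration of the k spectrally nearest training mixtures.
import numpy as np

train_spectra = np.array([[1.0, 0.1], [0.8, 0.3], [0.4, 0.7], [0.1, 1.0]])
train_conc = np.array([90.0, 70.0, 30.0, 10.0])   # % concentration (toy)

def knn_predict(spectrum, k=2):
    dists = np.linalg.norm(train_spectra - spectrum, axis=1)
    nearest = np.argsort(dists)[:k]               # indices of k closest spectra
    return train_conc[nearest].mean()

pred = knn_predict(np.array([0.9, 0.2]))
```

In the paper's pipeline, a Genetic Algorithm would first select the informative wavenumbers, shrinking the distance computation to a few discriminative channels.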

  17. A novel procedure of quantitation of virus based on microflow cytometry analysis.

    PubMed

    Vazquez, Diego; López-Vázquez, Carmen; Cutrín, Juan Manuel; Dopazo, Carlos P

    2016-03-01

    The accurate and fast titration of viruses is a critical step in research laboratories and biotechnology industries. Different approaches are commonly applied which either are time consuming (like the plaque and endpoint dilution assays) or do not ensure quantification of only infective particles (like quantitative real-time PCR). In the last decade, a methodology based on the analysis of infected cells by flow cytometry and fluorescence-activated cell sorting (FACS) has been reported as a fast and reliable test for the titration of some viruses. However, this technology needs expensive equipment and expert technicians to operate it. Recently, the "lab on a chip" integrated devices have brought about the miniaturization of this equipment, turning this technology into an affordable and easy-to-use alternative to traditional flow cytometry. In the present study, we have designed a microflow cytometry (μFC) procedure for the quantitation of viruses, using the infectious pancreatic necrosis virus (IPNV) as a model. The optimization of conditions and validation of the method are reported here. PMID:26728015
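Cytometry-based titration typically converts the measured infected-cell fraction into infectious units via a Poisson assumption; a sketch with illustrative numbers (not IPNV data):

```python
# Sketch: titer from the infected-cell fraction, assuming infections per cell
# follow a Poisson distribution, so P(uninfected) = exp(-MOI).
import math

def titer_per_ml(infected_fraction, n_cells, inoculum_ml):
    """Infectious units per mL from the fraction of positive cells."""
    moi = -math.log(1.0 - infected_fraction)   # multiplicity of infection
    return moi * n_cells / inoculum_ml

titer = titer_per_ml(infected_fraction=0.2, n_cells=1e5, inoculum_ml=0.1)
```

This is why the method counts only infective particles: cells scored positive by fluorescence reflect productive infection, unlike genome copies counted by qPCR.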

  18. Quantitative human chorionic gonadotropin analysis. I. Comparison of an immunoradiometric assay and a radioimmunoassay

    SciTech Connect

    Shapiro, A.I.; Wu, T.F.; Ballon, S.C.; Lamb, E.J.

    1984-01-01

    An immunoradiometric assay (IRMA) for the quantitative analysis of human chorionic gonadotropin (hCG) was evaluated for specificity, sensitivity, accuracy and precision. The results were compared with those of the conventional radioimmunoassay (RIA) used in our laboratory. The IRMA is a solid-phase, double-antibody immunoassay that sandwiches the intact hCG molecule between the two antibodies. It has specificity, accuracy, and precision which are similar to those of the RIA. The RIA is based upon the assumptions that the antigenicity of the tracer is not altered by the iodination process and that the antibody reacts equally with all of the antigens, including the radiolabeled antigen. The IRMA does not use radiolabeled antigens and thus is free of the assumptions made in the conventional RIA. The IRMA may be more accurate at the lower limits of the assay because it does not require logarithmic transformations. Since the IRMA does not measure the free beta-subunit of hCG, it cannot be endorsed as the sole technique to quantitate hCG in patients with gestational trophoblastic neoplasia until the significance of the free beta-subunit in these patients is determined.

  19. Quantitative Analysis of Subcellular Distribution of the SUMO Conjugation System by Confocal Microscopy Imaging.

    PubMed

    Mas, Abraham; Amenós, Montse; Lois, L Maria

    2016-01-01

    Different studies point to an enrichment in SUMO conjugation in the cell nucleus, although non-nuclear SUMO targets also exist. In general, the study of subcellular localization of proteins is essential for understanding their function within a cell. Fluorescence microscopy is a powerful tool for studying subcellular protein partitioning in living cells, since fluorescent proteins can be fused to proteins of interest to determine their localization. Subcellular distribution of proteins can be influenced by binding to other biomolecules and by posttranslational modifications. Sometimes these changes affect only a portion of the protein pool or have a partial effect, and a quantitative evaluation of fluorescence images is required to identify protein redistribution among subcellular compartments. In order to obtain accurate data about the relative subcellular distribution of SUMO conjugation machinery members, and to identify the molecular determinants involved in their localization, we have applied quantitative confocal microscopy imaging. In this chapter, we will describe the fluorescent protein fusions used in these experiments, and how to measure, evaluate, and compare average fluorescence intensities in cellular compartments by image-based analysis. We show the distribution of some components of the Arabidopsis SUMOylation machinery in epidermal onion cells and how they change their distribution in the presence of interacting partners or even when its activity is affected. PMID:27424751
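The image-based quantification described above boils down to averaging fluorescence per segmented compartment; a synthetic numpy sketch (the image, mask, and the idea of a DAPI-derived nuclear mask are assumptions for illustration):

```python
# Sketch: mean fluorescence per compartment and a nuclear/cytoplasmic ratio.
import numpy as np

intensity = np.array([[100.0, 100.0, 20.0],
                      [100.0, 100.0, 20.0],
                      [ 20.0,  20.0, 20.0]])
# 1 = nucleus, 0 = cytoplasm (assumed segmentation, e.g. from a nuclear stain)
mask = np.array([[1, 1, 0],
                 [1, 1, 0],
                 [0, 0, 0]])

nuc_mean = intensity[mask == 1].mean()
cyt_mean = intensity[mask == 0].mean()
nc_ratio = nuc_mean / cyt_mean     # >1 indicates nuclear enrichment
```

Comparing this ratio across constructs or co-expression conditions is what reveals redistribution of the SUMO machinery between compartments.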

  20. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches.

    PubMed

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-01-01

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points. PMID:27136541
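Whichever platform is used, subclone size ultimately comes from the fraction of mutant reads at each position; a sketch using invented read counts (the mutation names are real, well-known ABL1-KD variants used purely as labels):

```python
# Sketch: variant allele frequency (VAF) of a BCR-ABL1 KD mutation as
# mutant reads over total coverage at that position.
def allele_frequency(mut_reads, total_reads):
    return mut_reads / total_reads

# Hypothetical counts from one specimen.
vaf_t315i = allele_frequency(mut_reads=450, total_reads=10000)
vaf_e255k = allele_frequency(mut_reads=120, total_reads=10000)
```

The calibration issue raised in the abstract is that each method maps raw signal to this fraction with its own bias, so cross-method VAFs need per-assay correction before subclone sizes are compared.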

  1. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches

    PubMed Central

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-01-01

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points. PMID:27136541

  2. A quantitative analysis of urban growth and associated thermal characteristics using Landsat satellite data

    NASA Astrophysics Data System (ADS)

    Zeng, Yongnain; Zhang, Honghui; Zou, Bin; Li, Hua

    2008-10-01

    Urbanization transforms the natural landscape to anthropogenic urban land use and changes surface physical characteristics. Accurate information on the extent of urban growth and its impacts on the environment is of great interest for diverse purposes. As a result, increased research interest is being directed to the mapping and monitoring of urban land use using remote sensing techniques. However, there are many challenges in deriving urban extent and development densities quantitatively. This study utilized remote sensing data of Landsat TM/ETM+ to assess urban sprawl and its thermal characteristics in Changsha in central China. A new approach was proposed for quantitatively determining urban land use extents and development densities. First, impervious surface areas were mapped by integrating spectral indices derived from the remotely sensed data. Then, the urban land extents and development densities were identified by using moving window calculations and selecting certain threshold values. The urban surface thermal patterns were investigated using the Landsat thermal band. The results suggest that urban extent, development density, and surface thermal patterns can be identified through quantitative remotely sensed indices and land surface temperature. Results show that the built-up area and urban development densities have increased significantly in Changsha city since the 1990s. The differences in urban development densities correspond to thermal effects, where higher percent imperviousness is usually associated with higher surface temperature. Remotely sensed indices and land surface temperature are demonstrated to be very useful sources for quantifying urban land use extent, development intensity, and urban thermal patterns.
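The moving-window step can be sketched as computing the fraction of impervious pixels in a neighborhood and thresholding it; the map, window size, and 0.5 cutoff below are all illustrative assumptions:

```python
# Sketch: moving-window imperviousness density on a toy binary map,
# with a threshold classifying urban development density.
import numpy as np

impervious = np.array([[1, 1, 1, 0, 0],
                       [1, 1, 1, 0, 0],
                       [1, 1, 0, 0, 0],
                       [0, 0, 0, 0, 0],
                       [0, 0, 0, 0, 0]], dtype=float)

def window_density(grid, row, col, half=1):
    """Mean imperviousness in a (2*half+1)^2 window, clipped at the edges."""
    r0, r1 = max(row - half, 0), min(row + half + 1, grid.shape[0])
    c0, c1 = max(col - half, 0), min(col + half + 1, grid.shape[1])
    return grid[r0:r1, c0:c1].mean()

density = window_density(impervious, 1, 1)   # core of the built-up patch
is_urban = density >= 0.5                    # assumed threshold
```

Sweeping the window over the whole scene and binning the densities yields the development-density classes the study maps against land surface temperature.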

  3. Stable isotope labeling - Liquid chromatography/mass spectrometry for quantitative analysis of androgenic and progestagenic steroids.

    PubMed

    Guo, Ning; Liu, Ping; Ding, Jun; Zheng, Shu-Jian; Yuan, Bi-Feng; Feng, Yu-Qi

    2016-01-28

    Steroid hormones play important roles in mammals at very low concentrations and are associated with numerous endocrinology and oncology diseases. Therefore, quantitative analysis of steroid hormones can provide crucial information for uncovering underlying mechanisms of steroid hormone-related diseases. In the current study, we developed a sensitive method for the detection of steroid hormones (progesterone, dehydroepiandrosterone, testosterone, pregnenolone, 17-hydroxyprogesterone, androstenedione and 17α-hydroxypregnenolone) in body fluids by stable isotope labeling coupled with liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) analysis. In this respect, a pair of isotope labeling reagents, Girard reagent P (GP) and d5-Girard reagent P (d5-GP), were synthesized and utilized to label steroid hormones in follicular fluid samples and steroid hormone standards, respectively. The heavy labeled standards were used as internal standards for quantification to minimize quantitation deviation in MS analysis due to the matrix and ion suppression effects. The ionization efficiencies of steroid hormones were greatly improved by 4- to 504-fold through the introduction of a permanently charged quaternary ammonium moiety from GP. Using the developed method, we successfully quantified steroid hormones in human follicular fluid. We found that the contents of testosterone and androstenedione exhibited a significant increase while the content of pregnenolone showed a significant decrease in follicular fluid of polycystic ovarian syndrome (PCOS) patients compared with healthy controls, indicating that these steroid hormones with significant change may contribute to the pathogenesis of PCOS. Taken together, the developed stable isotope labeling coupled LC-ESI-MS/MS analysis was demonstrated to be a promising method for the sensitive and accurate determination of steroid hormones, which may facilitate the in-depth investigation of steroid hormone-related diseases.
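Stable-isotope-dilution quantitation of this kind reduces to a peak-area ratio against the co-eluting heavy standard; a sketch with invented areas and spike level (not the follicular-fluid data):

```python
# Sketch: analyte concentration = (light/heavy peak-area ratio) * spiked
# heavy-standard concentration, since light and heavy forms co-elute and
# experience the same matrix and ion-suppression effects.
def conc_from_ratio(light_area, heavy_area, heavy_conc_ng_ml):
    return (light_area / heavy_area) * heavy_conc_ng_ml

# Hypothetical GP-labeled analyte vs d5-GP-labeled internal standard.
testosterone = conc_from_ratio(light_area=8.4e5, heavy_area=4.2e5,
                               heavy_conc_ng_ml=2.0)
```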

  4. Quantitative analysis for lumen and media-adventitia border detection in intravascular ultrasound images

    NASA Astrophysics Data System (ADS)

    Wang, Ling; Yu, Daoyin; Chen, Xiaodong; An, Zhiyong

    2008-12-01

    Atherosclerosis causes partial or total obstruction of human arteries. Early quantitative analysis and accurate assessment of plaque position and volume are essential for selecting the appropriate treatment. Several imaging techniques can be used to estimate the severity of the disease in vivo. Intravascular ultrasound (IVUS) is a commonly used diagnostic tool that provides real-time visualization of plaque morphology, detection of typical plaque components such as calcium, and quantification of plaque eccentricity and wall thickness. In this paper, we first applied a spatio-temporal filter to reduce the effect of speckle and enhance the image. We then recast the segmentation problem as the minimization of an energy function: using improved deformable models, we detected the lumen and media-adventitia borders in sequential intravascular ultrasound frames and optimized them by dynamic programming. Finally, through identification of the internal and external elastic laminae and the plaque-lumen interface, we computed parameters such as plaque burden and the maximal and minimal diameters of the internal and external elastic laminae. The obtained results demonstrate that our method is statistically accurate, reproducible, and capable of identifying the regions of interest in sequences of IVUS frames.
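
    The dynamic-programming border optimization mentioned in the abstract can be sketched as a minimum-cost path search through a per-pixel cost (energy) image, one row index per column with a smoothness constraint. This is a generic illustration of the technique, not the authors' implementation:

```python
import numpy as np

def min_cost_border(cost: np.ndarray, max_jump: int = 1) -> list[int]:
    """Find a left-to-right border (one row per column) of minimum total
    cost, allowing the row index to change by at most max_jump between
    adjacent columns -- the classic DP used for vessel-border tracking."""
    rows, cols = cost.shape
    acc = cost.astype(float).copy()          # accumulated cost table
    back = np.zeros((rows, cols), dtype=int)  # backpointers for traceback
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - max_jump), min(rows, r + max_jump + 1)
            prev = acc[lo:hi, c - 1]
            k = int(np.argmin(prev))
            acc[r, c] += prev[k]
            back[r, c] = lo + k
    # Trace back from the cheapest endpoint in the last column.
    path = [int(np.argmin(acc[:, -1]))]
    for c in range(cols - 1, 0, -1):
        path.append(int(back[path[-1], c]))
    return path[::-1]
```

In an IVUS application, the cost image would be derived from edge strength in the filtered frame (low cost on strong lumen or media-adventitia edges), so the cheapest path follows the border.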

  5. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
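
    As an illustration of the conversion step, the sketch below combines Muhleman's empirical scattering law with a placeholder linear DN-to-dB scaling. The Muhleman-law constants are the commonly quoted values for Venus, but the scaling constants a and b are hypothetical: the true DN encoding must be taken from the Magellan product documentation.

```python
import numpy as np

def muhleman(theta_rad):
    """Muhleman's empirical radar scattering law for Venus:
    sigma0(theta) = k*cos(theta) / (sin(theta) + c*cos(theta))**3,
    with the commonly quoted constants k = 0.0118, c = 0.111."""
    k, c = 0.0118, 0.111
    return k * np.cos(theta_rad) / (np.sin(theta_rad) + c * np.cos(theta_rad)) ** 3

def dn_to_sigma0(dn, theta_rad, a=0.2, b=-20.0):
    """Convert an image DN back to a backscatter coefficient, assuming the
    pixel encodes the deviation (in dB) from the Muhleman law at the
    pixel's incidence angle. The linear scaling (a, b) is a placeholder;
    use the constants from the actual Magellan data product labels."""
    deviation_db = a * np.asarray(dn, dtype=float) + b
    return muhleman(theta_rad) * 10.0 ** (deviation_db / 10.0)
```

In ArcGIS terms, the same arithmetic would be applied per-pixel with the raster calculator, using an incidence-angle raster derived from latitude.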

  6. Quantitative Analysis of Various Metalloprotein Compositional Stoichiometries with Simultaneous PIXE and NRA

    NASA Astrophysics Data System (ADS)

    McCubbin, Andrew; Deyoung, Paul; Peaslee, Graham; Sibley, Megan; Warner, Joshua

    2013-04-01

    Stoichiometric characterization has been carried out on multiple metalloproteins using a combination of Ion Beam Analysis methods and a newly modified preparation technique. Particle Induced X-ray Emission (PIXE) spectroscopy is a non-destructive ion beam analysis technique well suited to determining the concentrations of heavy elements. Nuclear Reaction Analysis (NRA) is a technique that measures the areal density of a thin target from scattering cross sections of 3.4 MeV protons. A combination of NRA and PIXE has been developed to provide a quantitative technique for the determination of stoichiometric metal ion ratios in metalloproteins. About one third of all proteins are metalloproteins, and most do not have well-determined stoichiometric compositions for the metals they contain. Current work focuses on establishing a standard method for preparing protein samples. The method involves placing drops of protein solutions on aluminized polyethylene terephthalate (Mylar) and allowing them to dry. This technique has been tested on several proteins of known stoichiometry to determine cofactor content and has proven to be a reliable analysis method, accurately determining metal stoichiometry in cytochrome c, superoxide dismutase, concanavalin A, vitamin B12, and hemoglobin.

  7. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people looking for information about vaccination often visit anti-vaccine movement websites, blogs by naturopathic physicians, or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. In total, 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific societies; 32.2% (n = 48) consisted of personal blogs; and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. Overall, 37.6% reported the webmaster's name, 67.8% a webmaster e-mail address, 28.6% the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, which cannot be managed by private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements. PMID:24607988

  8. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

    Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprised of invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade, including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8:1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescent microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade, including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  9. Evaluating Multiplexed Quantitative Phosphopeptide Analysis on a Hybrid Quadrupole Mass Filter/Linear Ion Trap/Orbitrap Mass Spectrometer

    PubMed Central

    2015-01-01

    As a driver for many biological processes, phosphorylation remains an area of intense research interest. Advances in multiplexed quantitation utilizing isobaric tags (e.g., TMT and iTRAQ) have the potential to create a new paradigm in quantitative proteomics. New instrumentation and software are propelling these multiplexed workflows forward, which results in more accurate, sensitive, and reproducible quantitation across tens of thousands of phosphopeptides. This study assesses the performance of multiplexed quantitative phosphoproteomics on the Orbitrap Fusion mass spectrometer. Utilizing a two-phosphoproteome model of precursor ion interference, we assessed the accuracy of phosphopeptide quantitation across a variety of experimental approaches. These methods included the use of synchronous precursor selection (SPS) to enhance TMT reporter ion intensity and accuracy. We found that (i) ratio distortion remained a problem for phosphopeptide analysis in multiplexed quantitative workflows, (ii) ratio distortion can be overcome by the use of an SPS-MS3 scan, (iii) interfering ions generally possessed a different charge state than the target precursor, and (iv) selecting only the phosphate neutral loss peak (single notch) for the MS3 scan still provided accurate ratio measurements. Remarkably, these data suggest that the underlying cause of interference may not be due to coeluting and cofragmented peptides but instead from consistent, low level background fragmentation. Finally, as a proof-of-concept 10-plex experiment, we compared phosphopeptide levels from five murine brains to five livers. In total, the SPS-MS3 method quantified 38 247 phosphopeptides, corresponding to 11 000 phosphorylation sites. With 10 measurements recorded for each phosphopeptide, this equates to more than 628 000 binary comparisons collected in less than 48 h. PMID:25521595

  10. Achieving accurate neutron-multiplicity analysis of metals and oxides with weighted point model equations.

    SciTech Connect

    Burward-Hoy, J. M.; Geist, W. H.; Krick, M. S.; Mayo, D. R.

    2004-01-01

    Neutron multiplicity counting is a technique for the rapid, nondestructive measurement of plutonium mass in pure and impure materials. This technique is very powerful because it uses the measured coincidence count rates to determine the sample mass without requiring a set of representative standards for calibration. Interpreting measured singles, doubles, and triples count rates using the three-parameter standard point model accurately determines plutonium mass, neutron multiplication, and the ratio of (α,n) to spontaneous-fission neutrons (α) for oxides of moderate mass. However, underlying standard point model assumptions - including constant neutron energy and constant multiplication throughout the sample - cause significant biases for the mass, multiplication, and α in measurements of metal and large, dense oxides.

  11. Accurate Structure Prediction and Conformational Analysis of Cyclic Peptides with Residue-Specific Force Fields.

    PubMed

    Geng, Hao; Jiang, Fan; Wu, Yun-Dong

    2016-05-19

    Cyclic peptides (CPs) are promising candidates for drugs, chemical biology tools, and self-assembling nanomaterials. However, the development of reliable and accurate computational methods for their structure prediction has been challenging. Here, 20 all-trans CPs of 5-12 residues selected from the Cambridge Structure Database have been simulated using replica-exchange molecular dynamics with four different force fields. Our recently developed residue-specific force fields RSFF1 and RSFF2 can correctly identify the crystal-like conformations of more than half of the CPs as the most populated conformation. RSFF2 performs best, consistently predicting the crystal structures of 17 out of 20 CPs with rmsd < 1.1 Å. We also compared the backbone (ϕ, ψ) sampling of residues in CPs with those in short linear peptides and in globular proteins. In general, unlike linear peptides, CPs have local conformational free energies and entropies quite similar to those of globular proteins. PMID:27128113

  12. Is the Meta-Analysis of Correlation Coefficients Accurate When Population Correlations Vary?

    ERIC Educational Resources Information Center

    Field, Andy P.

    2005-01-01

    One conceptualization of meta-analysis is that studies within the meta-analysis are sampled from populations with mean effect sizes that vary (random-effects models). The consequences of not applying such models and the comparison of different methods have been hotly debated. A Monte Carlo study compared the efficacy of Hedges and Vevea's…

  13. TOPICA: an accurate and efficient numerical tool for analysis and design of ICRF antennas

    NASA Astrophysics Data System (ADS)

    Lancellotti, V.; Milanesio, D.; Maggiora, R.; Vecchi, G.; Kyrytsya, V.

    2006-07-01

    The demand for a predictive tool to help in designing ion-cyclotron radio frequency (ICRF) antenna systems for today's fusion experiments has driven the development of codes such as ICANT, RANT3D, and the early development of the TOPICA (TOrino Polytechnic Ion Cyclotron Antenna) code. This paper describes the substantive evolution of the TOPICA formulation and implementation that presently allows it to handle the actual geometry of ICRF antennas (with curved, solid straps, a general-shape housing, Faraday screen, etc) as well as an accurate plasma description, accounting for density and temperature profiles and finite Larmor radius effects. The antenna is assumed to be housed in a recess-like enclosure. Both goals have been attained by formally separating the problem into two parts: the vacuum region around the antenna and the plasma region inside the toroidal chamber. Field continuity and boundary conditions allow the formulation of a set of two coupled integral equations for the unknown equivalent (current) sources; the equations are then reduced to a linear system by a method-of-moments solution scheme employing 2D finite elements defined over a 3D non-planar surface triangular-cell mesh. In the vacuum region calculations are done in the spatial (configuration) domain, whereas in the plasma region a spectral (wavenumber) representation of fields and currents is adopted, thus permitting a description of the plasma by a surface impedance matrix. Owing to this approach, any plasma model can be used in principle, and at present the FELICE code has been employed. The natural outcomes of TOPICA are the induced currents on the conductors (antenna, housing, etc) and the electric field in front of the plasma, whence the antenna circuit parameters (impedance/scattering matrices), the radiated power and the fields (at locations other than the chamber aperture) are then obtained. An accurate model of the feeding coaxial lines is also included. The theoretical model and its TOPICA

  14. The ALHAMBRA survey: accurate merger fractions derived by PDF analysis of photometrically close pairs

    NASA Astrophysics Data System (ADS)

    López-Sanjuan, C.; Cenarro, A. J.; Varela, J.; Viironen, K.; Molino, A.; Benítez, N.; Arnalte-Mur, P.; Ascaso, B.; Díaz-García, L. A.; Fernández-Soto, A.; Jiménez-Teja, Y.; Márquez, I.; Masegosa, J.; Moles, M.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Broadhurst, T.; Cabrera-Caño, J.; Castander, F. J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

    Aims: Our goal is to develop and test a novel methodology to compute accurate close-pair fractions with photometric redshifts. Methods: We improved the currently used methodologies to estimate the merger fraction fm from photometric redshifts by (i) using the full probability distribution functions (PDFs) of the sources in redshift space; (ii) including the variation in the luminosity of the sources with z in both the sample selection and the luminosity ratio constraint; and (iii) splitting individual PDFs into red and blue spectral templates to reliably work with colour selections. We tested the performance of our new methodology with the PDFs provided by the ALHAMBRA photometric survey. Results: The merger fractions and rates from the ALHAMBRA survey agree very well with those from spectroscopic work, for both the general population and red and blue galaxies. With the merger rate of bright (MB ≤ -20-1.1z) galaxies evolving as (1 + z)n, the power-law index n is higher for blue galaxies (n = 2.7 ± 0.5) than for red galaxies (n = 1.3 ± 0.4), confirming previous results. Integrating the merger rate over cosmic time, we find that the average number of mergers per galaxy since z = 1 is Nmred = 0.57 ± 0.05 for red galaxies and Nmblue = 0.26 ± 0.02 for blue galaxies. Conclusions: Our new methodology statistically exploits all the available information provided by photometric redshift codes and yields accurate measurements of the merger fraction by close pairs from photometric redshifts alone. Current and future photometric surveys will benefit from this new methodology. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC). The catalogues, probabilities, and figures of the ALHAMBRA close pairs detected in Sect. 5.1 are available at https://cloud.iaa.csic.es/alhambra/catalogues/ClosePairs
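
    The core idea of working with full photo-z PDFs can be illustrated with a redshift-overlap weight for a projected pair of galaxies. This is a schematic stand-in for the published methodology, which additionally folds in angular-separation and luminosity-ratio selections at each redshift:

```python
import numpy as np

def pdf_overlap(z_grid, pdf_i, pdf_j):
    """Redshift-overlap weight for a projected galaxy pair: the integral of
    the product of the two photometric-redshift PDFs, each first normalized
    to unit area on the (assumed uniform) grid. A pair whose PDFs do not
    overlap in z contributes essentially nothing to the merger fraction."""
    dz = z_grid[1] - z_grid[0]
    p_i = pdf_i / (pdf_i.sum() * dz)
    p_j = pdf_j / (pdf_j.sum() * dz)
    return float((p_i * p_j).sum() * dz)
```

Summing such weights over all photometrically close pairs, and normalizing by the number of primary galaxies, gives a PDF-based estimate of the merger fraction in the spirit of the method, without assigning any galaxy a single fixed redshift.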

  15. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350

  16. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, and hence of the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  17. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  18. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
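
    The Hilbert spectral analysis used above to build each mosquito's time-dependent wing-beat frequency signature can be sketched as follows; this is a generic analytic-signal construction, not the authors' pipeline, and the 600 Hz test tone is an arbitrary value in the wing-beat range:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT: the standard Hilbert-transform
    construction (equivalent to scipy.signal.hilbert)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

def instantaneous_frequency(signal, fs):
    """Time-varying frequency (Hz) of a narrow-band signal, e.g. one
    harmonic of a wing-beat tone: derivative of the unwrapped phase of
    the analytic signal."""
    phase = np.unwrap(np.angle(analytic_signal(signal)))
    return np.diff(phase) * fs / (2.0 * np.pi)

# Sanity check on a pure 600 Hz tone sampled at 44.1 kHz.
fs = 44100.0
t = np.arange(0, 0.1, 1.0 / fs)
f_inst = instantaneous_frequency(np.sin(2 * np.pi * 600.0 * t), fs)
```

Applied to band-pass-filtered recordings of each mosquito, such instantaneous-frequency traces are what convergence detection algorithms compare across the two audio channels.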

  19. Analysis of quantitative phase detection based on optical information processing

    NASA Astrophysics Data System (ADS)

    Tao, Wang; Tu, Jiang-Chen; Chun, Kuang-Tao; Yu, Han-Wang; Xin, Du

    2009-07-01

    Phase objects exist widely in nature: biological cells, optical components, atmospheric flow fields, and so on. The detection of phase objects has great significance in basic research, nondestructive testing, aerospace, military weapons, and other areas. The usual methods of phase-object detection include the interference method, grating method, schlieren method, and phase-contrast method. These methods have their own advantages, but they also have disadvantages in detection precision, environmental requirements, cost, detection rate, detection range, and detection linearity in various applications; even the most sophisticated, the phase-contrast method, mainly used on microscopic structures, lacks a quantitative analysis of the magnitude of the object's phase and of the relationship between image contrast and the optical system. In this paper, various phase-detection methods and the characteristics of different applications are analyzed from the standpoint of optical information processing, and a phase-detection system based on optical filtering is constructed. First, the frequency spectrum of the phase object is obtained with a Fourier-transform lens; then the spectrum is modified by the filter; finally, an image that represents the phase distribution through light intensity is obtained by the inverse Fourier transform. The advantages and disadvantages of commonly used filters such as the quarter-wavelength phase filter, the high-pass filter, and the edge filter are analyzed, their phase resolution is compared in the same optical information processing system, and the factors affecting phase resolution are pointed out. The paper concludes that for any application there exists an optimal filter that maximizes detection accuracy. Finally, we discuss how to design an optimal filter to best improve the phase-testing ability of an optical information processing system.
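
    The filtering chain described above (forward transform, spectral filter, inverse transform) can be illustrated numerically for the classic quarter-wavelength (Zernike) filter, which renders a weak phase distribution as intensity. The grid and phase pattern below are arbitrary illustrations:

```python
import numpy as np

def phase_contrast_image(phi):
    """Schematic 4f phase-contrast system: forward FFT of the unit-amplitude
    phase object exp(i*phi), a quarter-wave (pi/2) phase shift applied to
    the zero-order (DC) component only, then inverse FFT. For a weak,
    zero-mean phase object the output intensity is approximately 1 + 2*phi,
    i.e. phase rendered linearly as brightness."""
    field = np.exp(1j * phi)
    spectrum = np.fft.fft2(field)
    spectrum[0, 0] *= np.exp(1j * np.pi / 2)  # Zernike quarter-wave spot at DC
    out = np.fft.ifft2(spectrum)
    return np.abs(out) ** 2

# Weak sinusoidal phase pattern (zero mean) on a 64x64 grid.
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
phi = 0.05 * np.sin(x)[None, :] * np.ones((64, 1))
intensity = phase_contrast_image(phi)
```

Without the quarter-wave shift, the intensity of a pure phase object is uniform (|exp(i*phi)|² = 1); the shifted zero order is what converts phase into contrast, which is why the filter choice governs detection linearity and resolution.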

  20. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well established. The objectives of the present study were, first, to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed mean measurements of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed, but that the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiations in deformity progression were observed when the radiographic parameters were plotted against each other, which would have provided a quantitative basis for defining "normal" versus "abnormal" measurements. PMID:26002682

  1. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
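
    The thresholding-and-counting core of such an image analysis can be sketched as below. The simple mean-minus-k-sigma threshold is a stand-in for the authors' calibrated algorithm, and the synthetic image stands in for a photograph of a stained coupon:

```python
import numpy as np

def fouled_fraction(gray, k=1.0):
    """Fraction of pixels counted as biofilm in a grayscale photograph of a
    stained surface. Stained biomass is assumed darker than the background;
    a pixel is 'fouled' when it falls more than k standard deviations below
    the image mean. Illustrative rule only, not the published algorithm."""
    threshold = gray.mean() - k * gray.std()
    return float((gray < threshold).mean())

# Synthetic coupon: bright background with a dark stained patch on 25% of it.
img = np.full((20, 20), 200.0)
img[:10, :10] = 50.0
fraction = fouled_fraction(img)
```

In practice the threshold would be calibrated against independent cell-density measurements, which is what lets the photographic fraction track biofilm growth intensity quantitatively.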

  3. TOPLHA: an accurate and efficient numerical tool for analysis and design of LH antennas

    NASA Astrophysics Data System (ADS)

    Milanesio, D.; Lancellotti, V.; Meneghini, O.; Maggiora, R.; Vecchi, G.; Bilato, R.

    2007-09-01

    Auxiliary ICRF heating systems in tokamaks often involve large complex antennas, made up of several conducting straps hosted in distinct cavities that open towards the plasma. The same holds especially true in the LH regime, wherein the antennas are composed of arrays of many phased waveguides. Upon observing that the various cavities or waveguides couple to each other only through the EM fields existing over the plasma-facing apertures, we self-consistently formulated the EM problem by a convenient set of multiple coupled integral equations. Subsequent application of the Method of Moments yields a highly sparse algebraic system; formal inversion of the system matrix is therefore not especially memory demanding, even though the number of unknowns may be quite large (typically 10^5 or so). The overall strategy has been implemented in an enhanced version of TOPICA (Torino Polytechnic Ion Cyclotron Antenna) and in a newly developed code named TOPLHA (Torino Polytechnic Lower Hybrid Antenna). Both are simulation and prediction tools for plasma facing antennas that incorporate commercial-grade 3D graphic interfaces along with an accurate description of the plasma. In this work we present the new proposed formulation along with examples of application to real life large LH antenna systems.
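
    The practical payoff of a sparse system matrix is that iterative solvers only ever need sparse matrix-vector products, so memory scales with the nonzero count rather than with N^2. The sketch below is not the TOPICA/TOPLHA solver; it is a generic conjugate-gradient solve of a sparse symmetric system, with a small 1-D Laplacian standing in for the real MoM matrix (which is complex-valued and far larger).

```python
def sparse_matvec(rows, x):
    """y = A @ x for A stored as a list of {column: value} dicts, one per row."""
    return [sum(v * x[j] for j, v in row.items()) for row in rows]

def conjugate_gradient(rows, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite sparse matrix A."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual b - A x (x starts at zero)
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = sparse_matvec(rows, p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# 1-D Laplacian: tridiagonal, so each row stores at most 3 nonzeros.
n = 50
rows = [{i: 2.0} for i in range(n)]
for i in range(n - 1):
    rows[i][i + 1] = -1.0
    rows[i + 1][i] = -1.0
b = [1.0] * n
x = conjugate_gradient(rows, b)
residual = max(abs(bi - yi) for bi, yi in zip(b, sparse_matvec(rows, x)))
print("max residual:", residual)
```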

  4. Mechanical Analysis and Hierarchies of Multi-digit Synergies during Accurate Object Rotation

    PubMed Central

    Zhang, Wei; Olafsdottir, Halla B.; Zatsiorsky, Vladimir M.; Latash, Mark L.

    2009-01-01

    We studied the mechanical variables (the grip force and the total moment of force) and multi-digit synergies at two levels (the virtual finger-thumb level, VF-TH, and the individual finger level, IMRL) of a hypothetical control hierarchy during accurate rotation of a hand-held instrumented handle. Synergies were defined as co-varied changes in elemental variables (forces and moments of force) that stabilize the output at a particular level. Indices of multi-digit synergies showed higher values at the hierarchically higher level (VF-TH) for both normal and tangential forces. The moment of force was stabilized at both hierarchical levels during the steady-state phases but not during the movement. The results support the principles of superposition and of mechanical advantage. They also support an earlier hypothesis on an inherent trade-off between synergies at the two hierarchical levels, although the controller showed more subtle and versatile synergic control than the one hypothesized earlier. PMID:19799165
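
    A common way to quantify such synergies is to compare the summed variance of the elemental variables with the variance of their combined output across repeated trials. The sketch below applies that generic variance-partitioning index to synthetic two-"finger" force data; it illustrates the idea rather than the specific synergy indices computed in the paper, and all parameter values are hypothetical.

```python
import random
import statistics

def synergy_index(trials):
    """Variance-partitioning synergy index across repeated trials.

    `trials` holds per-trial tuples of elemental forces. The index compares
    the summed variance of the elements with the variance of their sum;
    values near 1 mean the elements co-vary to stabilize the total output.
    """
    n_elem = len(trials[0])
    elemental_var = sum(
        statistics.pvariance([t[i] for t in trials]) for i in range(n_elem)
    )
    total_var = statistics.pvariance([sum(t) for t in trials])
    return (elemental_var - total_var) / elemental_var

random.seed(1)
# Synthetic data: two "fingers" share a 10 N target, so their errors co-vary.
stabilized = []
for _ in range(200):
    f1 = random.gauss(5, 1)
    stabilized.append((f1, 10 - f1 + random.gauss(0, 0.1)))
print(round(synergy_index(stabilized), 2))  # close to 1: total is stabilized
```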

  5. Guided resonances on lithium niobate for extremely small electric field detection investigated by accurate sensitivity analysis.

    PubMed

    Qiu, Wentao; Ndao, Abdoulaye; Lu, Huihui; Bernal, Maria-Pilar; Baida, Fadi Issam

    2016-09-01

    We present a theoretical study of guided resonances (GR) on a thin film lithium niobate rectangular lattice photonic crystal by band diagram calculations and 3D Finite Difference Time Domain (FDTD) transmission investigations which cover a broad range of parameters. A photonic crystal with an active zone as small as 13μm×13μm×0.7μm can be easily designed to obtain a resonance Q value on the order of 1000. These resonances are then employed in electric field (E-field) sensing applications exploiting the electro-optic (EO) effect of lithium niobate. A local field factor that is calculated locally for each FDTD cell is proposed to accurately estimate the sensitivity of a GR based E-field sensor. The local field factor yields good agreement between simulations and reported experimental data, thereby providing a valuable method for optimizing the GR structure to obtain high sensitivities. When these resonances are combined with a sub-picometer optical spectrum analyzer and a high field enhancement antenna design, an E-field probe with a sensitivity of 50 μV/m could be achieved. The results of our simulations could also be exploited in other EO based applications, such as EEG (electroencephalography) or ECG (electrocardiography) probes and E-field frequency detectors whose probe is 'invisible' to the field being detected. PMID:27607627
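
    The sensing mechanism rests on the Pockels effect: an applied field changes the refractive index by roughly (1/2) n^3 r33 E, which shifts the guided resonance. The back-of-the-envelope sketch below uses approximate literature values for lithium niobate and a hypothetical unity overlap factor; it is not the paper's local-field-factor calculation.

```python
# Pockels-effect estimate for lithium niobate (approximate literature values).
n_e = 2.14       # extraordinary refractive index
r33 = 30.8e-12   # electro-optic coefficient, m/V

def index_change(e_field):
    """Magnitude of the EO index change, delta_n = (1/2) * n^3 * r33 * E."""
    return 0.5 * n_e ** 3 * r33 * e_field

def resonance_shift_pm(e_field, wavelength_nm=1550.0, overlap=1.0):
    """First-order resonance shift in picometers, assuming the resonance
    moves as d_lambda/lambda ~ (d_n/n) * overlap (overlap is hypothetical)."""
    return wavelength_nm * 1e3 * index_change(e_field) / n_e * overlap

print(round(resonance_shift_pm(1e6), 1))  # shift in pm for a 1 MV/m field
```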

  6. An integrative variant analysis pipeline for accurate genotype/haplotype inference in population NGS data.

    PubMed

    Wang, Yi; Lu, James; Yu, Jin; Gibbs, Richard A; Yu, Fuli

    2013-05-01

    Next-generation sequencing is a powerful approach for discovering genetic variation. Sensitive variant calling and haplotype inference from population sequencing data remain challenging. We describe methods for high-quality discovery, genotyping, and phasing of SNPs for low-coverage (approximately 5×) sequencing of populations, implemented in a pipeline called SNPTools. Our pipeline contains several innovations that specifically address challenges caused by low-coverage population sequencing: (1) effective base depth (EBD), a nonparametric statistic that enables more accurate statistical modeling of sequencing data; (2) variance ratio scoring, a variance-based statistic that discovers polymorphic loci with high sensitivity and specificity; and (3) BAM-specific binomial mixture modeling (BBMM), a clustering algorithm that generates robust genotype likelihoods from heterogeneous sequencing data. Last, we develop an imputation engine that refines raw genotype likelihoods to produce high-quality phased genotypes/haplotypes. Designed for large population studies, SNPTools' input/output (I/O)- and storage-aware architecture leads to improved computing performance on large sequencing data sets. We apply SNPTools to the International 1000 Genomes Project (1000G) Phase 1 low-coverage data set and obtain genotyping accuracy comparable to that of SNP microarray. PMID:23296920
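
    The genotype-likelihood idea at the heart of such pipelines can be illustrated with a plain binomial model: each diploid genotype predicts an expected alternate-allele fraction, and the observed read counts are scored against each. This is a simplified stand-in for SNPTools' BBMM; the error rate and the flat prior implied by the normalization are assumptions of this sketch.

```python
from math import comb

def genotype_likelihoods(ref_reads, alt_reads, error_rate=0.01):
    """Normalized likelihoods of the diploid genotypes RR, RA, AA.

    Each genotype predicts an expected alternate-allele fraction
    (error_rate, 0.5, 1 - error_rate); read counts are scored with a
    binomial model and normalized under a flat genotype prior.
    """
    n = ref_reads + alt_reads
    raw = []
    for alt_fraction in (error_rate, 0.5, 1.0 - error_rate):
        raw.append(comb(n, alt_reads)
                   * alt_fraction ** alt_reads
                   * (1.0 - alt_fraction) ** ref_reads)
    total = sum(raw)
    return [p / total for p in raw]

# At ~5x coverage, 2 reference + 3 alternate reads: heterozygote dominates.
lik = genotype_likelihoods(2, 3)
print([round(p, 3) for p in lik])
```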

  7. Fast and accurate single-cell RNA-seq analysis by clustering of transcript-compatibility counts.

    PubMed

    Ntranos, Vasilis; Kamath, Govinda M; Zhang, Jesse M; Pachter, Lior; Tse, David N

    2016-01-01

    Current approaches to single-cell transcriptomic analysis are computationally intensive and require assay-specific modeling, which limits their scope and generality. We propose a novel method that compares and clusters cells based on their transcript-compatibility read counts rather than on the transcript or gene quantifications used in standard analysis pipelines. In the reanalysis of two landmark yet disparate single-cell RNA-seq datasets, we show that our method is up to two orders of magnitude faster than previous approaches, provides accurate and in some cases improved results, and is directly applicable to data from a wide variety of assays. PMID:27230763
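
    The underlying data structure is simple: each cell becomes a vector of transcript-compatibility counts (TCCs) over equivalence classes, and cells are compared by a distance between the normalized vectors. A minimal sketch with hypothetical counts follows; the paper's actual distance measure and clustering procedure are not reproduced here.

```python
def normalize(counts):
    """Turn raw transcript-compatibility counts into a probability vector."""
    total = sum(counts)
    return [c / total for c in counts]

def l1_distance(p, q):
    """L1 distance between two normalized count distributions (range 0..2)."""
    return sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical TCCs over four equivalence classes for three cells.
cell_a = normalize([90, 5, 3, 2])
cell_b = normalize([85, 8, 4, 3])   # profile similar to cell_a
cell_c = normalize([5, 2, 50, 43])  # a distinct cell type
print(round(l1_distance(cell_a, cell_b), 2),
      round(l1_distance(cell_a, cell_c), 2))
```

Because the comparison never requires resolving reads to specific transcripts, this representation sidesteps the assay-specific quantification step the abstract describes as the bottleneck.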

  8. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    PubMed Central

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and
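
    An odds ratio like the 2.40 quoted above comes from cross-tabulating exposure (high bacterial burden) against outcome (moderate to severe periodontitis). The sketch below computes an OR with a standard Wald confidence interval from hypothetical counts chosen only for illustration; they are not the Parogene data, and the study's ORs were additionally adjusted for covariates via logistic regression.

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls, z=1.96):
    """Odds ratio with a Wald 95% confidence interval from a 2x2 table."""
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    se_log = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                       + 1 / unexposed_cases + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: high vs. low bacterial burden, cases vs. controls.
or_, lo, hi = odds_ratio_ci(80, 60, 100, 180)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```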

  9. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant to allow identification and quantify the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the C. officinalis extract composition from different parts of the plant. The results found HPTLC to be effective for qualitative analysis; HPLC, however, was found to be more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. PMID:24880991
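
    Quantification by either technique rests on an external calibration curve: peak area is regressed on standard concentration, and sample concentrations are back-calculated from the fitted line. The sketch below uses hypothetical HPLC calibration data, not values from this study.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical HPLC calibration: peak area vs. standard concentration (ug/mL).
conc = [5.0, 10.0, 20.0, 40.0, 80.0]
area = [51.0, 99.0, 203.0, 401.0, 797.0]
slope, intercept = linear_fit(conc, area)

def quantify(peak_area):
    """Back-calculate a sample concentration from its peak area."""
    return (peak_area - intercept) / slope

print(round(quantify(300), 1))  # roughly 30 ug/mL
```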

  10. High throughput quantitative phenotyping of plant resistance using chlorophyll fluorescence image analysis

    PubMed Central

    2013-01-01

    Background In order to select for quantitative plant resistance to pathogens, high throughput approaches that can precisely quantify disease severity are needed. Automation and use of calibrated image analysis should provide more accurate, objective and faster analyses than visual assessments. In contrast to conventional visible imaging, chlorophyll fluorescence imaging is not sensitive to environmental light variations and provides single-channel images amenable to segmentation by simple thresholding approaches. Among the various parameters used in chlorophyll fluorescence imaging, the maximum quantum yield of photosystem II photochemistry (Fv/Fm) is well adapted to phenotyping disease severity. Fv/Fm is an indicator of plant stress that displays a robust contrast between infected and healthy tissues. In the present paper, we aimed at the segmentation of Fv/Fm images to quantify disease severity. Results Based on the Fv/Fm values of each pixel of the image, a thresholding approach was developed to delimit diseased areas. A first step consisted in setting up thresholds to reproduce visual observations by trained raters of symptoms caused by Xanthomonas fuscans subsp. fuscans (Xff) CFBP4834-R on Phaseolus vulgaris cv. Flavert. In order to develop a thresholding approach applicable to any cultivar or species, a second step was based on modeling pixel-wise Fv/Fm-distributions as mixtures of Gaussian distributions. Such modeling can discriminate various stages of symptom development but over-weights artifacts that can occur on mock-inoculated samples. Therefore, we developed a thresholding approach based on the probability of misclassification of a healthy pixel. Then, a clustering step is performed on the diseased areas to discriminate between various stages of alteration of plant tissues. Notably, the use of chlorophyll fluorescence imaging could detect pre-symptomatic areas. The interest of this image analysis procedure for assessing the levels of
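
    The final thresholding idea, calling a pixel diseased only when a healthy pixel would rarely be misclassified, can be sketched by fitting a Gaussian to healthy-tissue Fv/Fm values and cutting at a chosen misclassification probability. The Fv/Fm values and the 1% rate below are hypothetical.

```python
import statistics
from statistics import NormalDist

def disease_threshold(healthy_fvfm, misclassification_rate=0.01):
    """Fv/Fm cutoff below which a pixel is called diseased.

    The cutoff is placed so that a healthy pixel, modeled as Gaussian,
    is misclassified with probability `misclassification_rate`.
    """
    mu = statistics.fmean(healthy_fvfm)
    sd = statistics.stdev(healthy_fvfm)
    return NormalDist(mu, sd).inv_cdf(misclassification_rate)

# Hypothetical Fv/Fm values from mock-inoculated (healthy) leaf pixels.
healthy = [0.80, 0.81, 0.79, 0.82, 0.80, 0.78, 0.81, 0.80, 0.79, 0.80]
cutoff = disease_threshold(healthy)

# Classify a few candidate pixels from an inoculated leaf.
pixels = [0.75, 0.80, 0.45, 0.30]
diseased_fraction = sum(1 for v in pixels if v < cutoff) / len(pixels)
print(round(cutoff, 3), diseased_fraction)
```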

  11. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Technical Reports Server (NTRS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; Mccleary, S. L.

    1991-01-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  12. Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach

    SciTech Connect

    Zimmer, Jennifer S.; Monroe, Matthew E.; Qian, Weijun; Smith, Richard D.

    2006-01-20

    Proteomics, and the larger field of systems biology, have recently demonstrated utility in both the understanding of cellular processes on the molecular level and the identification of potential biomarkers of various disease states. The large amount of data generated by utilizing high mass accuracy mass spectrometry for high-throughput proteomics analyses presents a challenge in data processing, analysis and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics analysis and the accompanying data processing tools that have been developed in order to interpret and display the large volumes of data produced.

  13. NREL's Field Data Repository Supports Accurate Home Energy Analysis (Fact Sheet)

    SciTech Connect

    None, None

    2012-02-01

    This fact sheet discusses NREL's work to develop a repository of research-level residential building characteristics and historical energy use data to support ongoing efforts to improve the accuracy of residential energy analysis tools and the efficiency of energy assessment processes. The objective of this project is to create a robust empirical data source to support the research goal of the Department of Energy's Building America program: improving the efficiency of existing U.S. homes by 30% to 50%. Researchers can use this data source to test the accuracy of building energy simulation software and energy audit procedures, ultimately leading to more credible and less expensive energy analysis.

  14. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  15. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    ERIC Educational Resources Information Center

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  16. How to Construct More Accurate Student Models: Comparing and Optimizing Knowledge Tracing and Performance Factor Analysis

    ERIC Educational Resources Information Center

    Gong, Yue; Beck, Joseph E.; Heffernan, Neil T.

    2011-01-01

    Student modeling is a fundamental concept applicable to a variety of intelligent tutoring systems (ITS). However, there is not a lot of practical guidance on how to construct and train such models. This paper compares two approaches for student modeling, Knowledge Tracing (KT) and Performance Factors Analysis (PFA), by evaluating their predictive…
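
    Knowledge Tracing, one of the two approaches compared, maintains a single probability that the student knows a skill and updates it after each response. The sketch below is a standard Bayesian Knowledge Tracing update with hypothetical slip/guess/learn parameters; it illustrates the model family rather than any parameters fitted in the paper.

```python
def kt_update(p_know, correct, slip=0.1, guess=0.2, learn=0.15):
    """One Bayesian Knowledge Tracing step.

    First applies Bayes' rule to the observed response (allowing slips and
    guesses), then applies the learning transition. Parameter values here
    are hypothetical.
    """
    if correct:
        posterior = (p_know * (1 - slip)
                     / (p_know * (1 - slip) + (1 - p_know) * guess))
    else:
        posterior = (p_know * slip
                     / (p_know * slip + (1 - p_know) * (1 - guess)))
    return posterior + (1 - posterior) * learn

p = 0.3  # prior probability that the skill is already known
for response in (True, True, False, True):
    p = kt_update(p, response)
print(round(p, 3))
```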

  17. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
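
    The arithmetic behind isotopic dilution is worth making explicit: because the isotopically labeled internal standard behaves chemically like the analyte, the analyte amount follows directly from the ratio of the two MS peak areas. The numbers below (a labeled-caffeine spike into a beverage sample) are hypothetical.

```python
def isotope_dilution_conc(area_analyte, area_labeled, spiked_amount_ug,
                          sample_volume_ml, response_factor=1.0):
    """Analyte concentration (ug/mL) by isotope dilution.

    A known amount of isotopically labeled internal standard is spiked in;
    because analyte and standard behave identically through workup and
    ionization, the analyte amount follows from the MS peak-area ratio.
    """
    amount = spiked_amount_ug * (area_analyte / area_labeled) / response_factor
    return amount / sample_volume_ml

# Hypothetical run: 50 ug of labeled caffeine spiked into 10 mL of beverage.
print(isotope_dilution_conc(area_analyte=24000, area_labeled=12000,
                            spiked_amount_ug=50, sample_volume_ml=10))  # 10.0
```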

  18. Accurate and noninvasive embryos screening during in vitro fertilization (IVF) assisted by Raman analysis of embryos culture medium

    NASA Astrophysics Data System (ADS)

    Shen, A. G.; Peng, J.; Zhao, Q. H.; Su, L.; Wang, X. H.; Hu, J. M.; Yang, J.

    2012-04-01

    In combination with morphological evaluation tests, we employ Raman spectroscopy to select higher potential reproductive embryos during in vitro fertilization (IVF) based on the chemical composition of embryos culture medium. In this study, 57 Raman spectra are acquired from both higher and lower quality embryos culture medium (ECM) from 10 patients, as preliminarily confirmed by clinical assay. Data are fit using a least-squares linear combination model in which 12 basis spectra represent the chemical features of ECM. The final fitting coefficients provide insight into the chemical compositions of culture medium samples and are subsequently used as a criterion to evaluate the quality of embryos. The fitting coefficient ratios of sodium pyruvate/albumin and phenylalanine/albumin appear to play key roles in embryo screening, attaining 85.7% accuracy in comparison with clinical pregnancy. These results demonstrate that Raman spectroscopy is a promising candidate for accurate and noninvasive screening of higher quality embryos, potentially reducing the time-consuming clinical trials during IVF.

  19. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were automatically computed. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues to achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
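
    Once the fibroglandular voxels are segmented, BPE reduces to counting which of them enhance beyond a noise-based threshold on the pre/post subtraction. A minimal sketch with hypothetical voxel values follows; the paper's actual segmentation steps (dynamic programming, fuzzy c-means) are not shown, and the k = 3 threshold is an assumption.

```python
def bpe_percent(pre, post, noise_sd, k=3.0):
    """Background parenchymal enhancement as a percentage of FGT voxels.

    A fibroglandular voxel counts as enhancing when the post-contrast
    signal exceeds the pre-contrast signal by more than k noise standard
    deviations (a simple noise-based threshold on the subtraction image).
    """
    enhancing = sum(1 for a, b in zip(pre, post) if b - a > k * noise_sd)
    return 100.0 * enhancing / len(pre)

# Hypothetical pre/post signals for eight fibroglandular voxels, noise SD = 5.
pre = [100, 102, 98, 101, 99, 100, 103, 97]
post = [130, 104, 140, 103, 125, 101, 150, 99]
print(bpe_percent(pre, post, noise_sd=5))  # percent of voxels enhancing
```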

  20. Chromatic distortion during angioscopy: assessment and correction by quantitative colorimetric angioscopic analysis.

    PubMed

    Lehmann, K G; Oomen, J A; Slager, C J; deFeyter, P J; Serruys, P W

    1998-10-01

    Angioscopy represents a diagnostic tool with the unique ability of assessing the true color of intravascular structures. Current angioscopic interpretation is entirely subjective, however, and the visual interpretation of color has been shown to be marginal at best. The quantitative colorimetric angioscopic analysis system permits the full characterization of angioscopic color using two parameters (C1 and C2), derived from a custom color coordinate system, that are independent of illuminating light intensity. Measurement variability was found to be low (coefficient of variation = 0.06-0.64%), and relatively stable colorimetric values were obtained even at the extremes of illumination power. Agreement between different angioscopic catheters was good (maximum difference for C1, 0.022; for C2, 0.015). Catheter flexion did not significantly distort color transmission. Although the fiber optic illumination bundle was found to impart a slight yellow tint to objects in view (ΔC1 = 0.020, ΔC2 = 0.024, P < 0.0001) and the imaging bundle in isolation imparted a slight red tint (ΔC1 = 0.043, ΔC2 = -0.027, P < 0.0001), both of these artifacts could be corrected by proper white balancing. Finally, evaluation of regional chromatic characteristics revealed a radially symmetric and progressive blue shift in measured color when moving from the periphery to the center of an angioscopic image. An algorithm was developed that could automatically correct 93.0-94.3% of this error and provide accurate colorimetric measurements independent of spatial location within the angioscopic field. In summary, quantitative colorimetric angioscopic analysis provides objective and highly reproducible measurements of angioscopic color. This technique can correct for important chromatic distortions present in modern angioscopic systems. It can also help overcome current limitations in angioscopy research and clinical use imposed by the reliance on visual perception of color. PMID
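
    The essential trick of illumination-independent color measurement can be shown with simple chromaticity coordinates: normalizing each channel by the total intensity cancels the illumination power. This is only analogous in spirit to the C1/C2 parameters above, whose custom coordinate-system definitions are not reproduced here.

```python
def chromaticity(r, g, b):
    """Intensity-independent color coordinates from RGB pixel values.

    Dividing by the total intensity cancels illumination power, so the
    same hue measured under different lighting maps to the same point.
    """
    total = r + g + b
    return r / total, g / total

bright = chromaticity(200, 100, 50)  # a reddish pixel, strong illumination
dim = chromaticity(100, 50, 25)      # the same hue at half the power
print(bright == dim)  # the coordinates agree despite the intensity change
```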

  1. Quantitative hopanoid analysis enables robust pattern detection and comparison between laboratories.

    PubMed

    Wu, C-H; Kong, L; Bialecka-Fornal, M; Park, S; Thompson, A L; Kulkarni, G; Conway, S J; Newman, D K

    2015-07-01

    Hopanoids are steroid-like lipids from the isoprenoid family that are produced primarily by bacteria. Hopanes, molecular fossils of hopanoids, offer the potential to provide insight into environmental transitions on the early Earth, if their sources and biological functions can be constrained. Semiquantitative methods for mass spectrometric analysis of hopanoids from cultures and environmental samples have been developed in the last two decades. However, the structural diversity of hopanoids, and possible variability in their ionization efficiencies on different instruments, have thus far precluded robust quantification and hindered comparison of results between laboratories. These ionization inconsistencies give rise to the need to calibrate individual instruments with purified hopanoids to reliably quantify hopanoids. Here, we present new approaches to obtain both purified and synthetic quantification standards. We optimized 2-methylhopanoid production in Rhodopseudomonas palustris TIE-1 and purified 2Me-diplopterol, 2Me-bacteriohopanetetrol (2Me-BHT), and their unmethylated species (diplopterol and BHT). We found that 2-methylation decreases the signal intensity of diplopterol between 2 and 34% depending on the instrument used to detect it, but decreases the BHT signal less than 5%. In addition, 2Me-diplopterol produces 10× higher ion counts than equivalent quantities of 2Me-BHT. Similar deviations were also observed using a flame ionization detector for signal quantification in GC. In LC-MS, however, 2Me-BHT produces 11× higher ion counts than 2Me-diplopterol but only 1.2× higher ion counts than the sterol standard pregnane acetate. To further improve quantification, we synthesized tetradeuterated (D4) diplopterol, a precursor for a variety of hopanoids. LC-MS analysis on a mixture of (D4)-diplopterol and phospholipids showed that under the influence of co-eluted phospholipids, the D4-diplopterol internal standard quantifies diplopterol more accurately than
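
    Instrument-specific calibration of the kind argued for here boils down to measuring a response factor (signal per amount) for each purified standard on each instrument, then dividing sample signals by it. The peak areas and amounts below are hypothetical.

```python
def response_factor(peak_area, amount_ng):
    """Compound- and instrument-specific response factor (counts per ng),
    measured by injecting a known amount of a purified standard."""
    return peak_area / amount_ng

def quantify(peak_area, rf):
    """Amount (ng) in a sample extract via the calibrated response factor."""
    return peak_area / rf

# Hypothetical areas: on this instrument, 10 ng of purified 2Me-diplopterol
# ionizes less efficiently than 10 ng of diplopterol, as noted above.
rf_diplopterol = response_factor(6000.0, 10.0)
rf_2me = response_factor(5000.0, 10.0)
print(quantify(12500.0, rf_2me))  # -> 25.0 ng of 2Me-diplopterol
```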

  3. Influence of storage time on DNA of Chlamydia trachomatis, Ureaplasma urealyticum, and Neisseria gonorrhoeae for accurate detection by quantitative real-time polymerase chain reaction.

    PubMed

    Lu, Y; Rong, C Z; Zhao, J Y; Lao, X J; Xie, L; Li, S; Qin, X

    2016-01-01

    The shipment and storage conditions of clinical samples pose a major challenge to the detection accuracy of Chlamydia trachomatis (CT), Neisseria gonorrhoeae (NG), and Ureaplasma urealyticum (UU) when using quantitative real-time polymerase chain reaction (qRT-PCR). The aim of the present study was to explore the influence of storage time at 4°C on the DNA of these pathogens and its effect on their detection by qRT-PCR. CT, NG, and UU positive genital swabs from 70 patients were collected, and DNA of all samples were extracted and divided into eight aliquots. One aliquot was immediately analyzed with qRT-PCR to assess the initial pathogen load, whereas the remaining samples were stored at 4°C and analyzed after 1, 2, 3, 7, 14, 21, and 28 days. No significant differences in CT, NG, and UU DNA loads were observed between baseline (day 0) and the subsequent time points (days 1, 2, 3, 7, 14, 21, and 28) in any of the 70 samples. Although a slight increase in DNA levels was observed at day 28 compared to day 0, paired sample t-test results revealed no significant differences between the mean DNA levels at different time points following storage at 4°C (all P>0.05). Overall, the CT, UU, and NG DNA loads from all genital swab samples were stable at 4°C over a 28-day period. PMID:27580005
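
    The stability analysis is a paired comparison: each swab's DNA load at a later time point is compared with its own baseline. The sketch below computes the paired t statistic on hypothetical log10 loads for six swabs; with df = 5, |t| < 2.57 corresponds to P > 0.05, matching the study's conclusion of stable loads.

```python
import math
import statistics

def paired_t(baseline, followup):
    """Paired-sample t statistic for repeated measurements on the same swabs."""
    diffs = [b - a for a, b in zip(baseline, followup)]
    mean_d = statistics.fmean(diffs)
    sd_d = statistics.stdev(diffs)
    return mean_d / (sd_d / math.sqrt(len(diffs)))

# Hypothetical log10 DNA loads for six swabs at day 0 and after 28 days at 4°C.
day0 = [5.1, 4.8, 6.0, 5.5, 4.9, 5.7]
day28 = [5.2, 4.8, 6.1, 5.4, 5.0, 5.7]
t = paired_t(day0, day28)
print(round(t, 2))  # compare |t| with the df=5 critical value of 2.57
```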

  5. Accurate abundance analysis of late-type stars: advances in atomic physics

    NASA Astrophysics Data System (ADS)

    Barklem, Paul S.

    2016-05-01

    The measurement of stellar properties such as chemical compositions, masses and ages, through stellar spectra, is a fundamental problem in astrophysics. Progress in the understanding, calculation and measurement of atomic properties and processes relevant to the high-accuracy analysis of F-, G-, and K-type stellar spectra is reviewed, with particular emphasis on abundance analysis. This includes fundamental atomic data such as energy levels, wavelengths, and transition probabilities, as well as processes of photoionisation, collisional broadening and inelastic collisions. A recurring theme throughout the review is the interplay between theoretical atomic physics, laboratory measurements, and astrophysical modelling, all of which contribute to our understanding of atoms and atomic processes, as well as to modelling stellar spectra.

  6. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications. PMID:20687536

  7. Quantitatively estimating defects in graphene devices using discharge current analysis method

    PubMed Central

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects in graphene are the most important concern for successful applications of graphene, since they significantly affect device performance. However, once the graphene is integrated in a device structure, the quality of the graphene and its surrounding environment can only be assessed using indirect information such as hysteresis, mobility, and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field effect transistor structure by analyzing the discharge current, and examine its validity using various device structures. The density of charging sites affecting the performance of the graphene field effect transistor obtained using the discharge current analysis method was on the order of 10^14/cm^2, which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on SiO2/Si substrates, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of graphene devices, as it provides an accurate and quantitative way to assess the quality of graphene after device fabrication. PMID:24811431

  8. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used for diarrhea and digestive disorders, or in antiseptics, in Korea. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or QC of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on these results, the HPLC determination of eugenol is sufficiently accurate to evaluate the quality and safety of clove. PMID:21313806
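
    The linearity and LOD/LOQ figures quoted above follow standard calibration practice; a sketch with hypothetical peak areas, using the ICH-style estimates LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration fit and S its slope:

```python
import numpy as np

# Hypothetical eugenol calibration standards (ng/mL) and detector peak areas
conc = np.array([12.5, 25.0, 50.0, 100.0, 250.0, 500.0, 1000.0])
area = np.array([1.02e3, 2.05e3, 4.10e3, 8.20e3, 2.04e4, 4.09e4, 8.18e4])

# Least-squares line and coefficient of determination
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

sigma = np.std(area - pred, ddof=2)  # residual SD (n - 2 degrees of freedom)
lod = 3.3 * sigma / slope            # detection limit
loq = 10.0 * sigma / slope           # quantification limit
print(f"r^2 = {r2:.5f}, LOD = {lod:.2f} ng/mL, LOQ = {loq:.2f} ng/mL")
```
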

  9. Quantitative analysis of the central-chest lymph nodes based on 3D MDCT image data

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Bascom, Rebecca; Mahraj, Rickhesvar P. M.; Higgins, William E.

    2009-02-01

    Lung cancer is the leading cause of cancer death in the United States. In lung-cancer staging, central-chest lymph nodes and associated nodal stations, as observed in three-dimensional (3D) multidetector CT (MDCT) scans, play a vital role. However, little work has been done in relation to lymph nodes, based on MDCT data, due to the complicated phenomena that give rise to them. Using our custom computer-based system for 3D MDCT-based pulmonary lymph-node analysis, we conduct a detailed study of lymph nodes as depicted in 3D MDCT scans. In this work, the Mountain lymph-node stations are automatically defined by the system. These defined stations, in conjunction with our system's image processing and visualization tools, facilitate lymph-node detection, classification, and segmentation. An expert pulmonologist, chest radiologist, and trained technician verified the accuracy of the automatically defined stations and indicated observable lymph nodes. Next, using semi-automatic tools in our system, we defined all indicated nodes. Finally, we performed a global quantitative analysis of the characteristics of the observed nodes and stations. This study drew upon a database of 32 human MDCT chest scans. 320 Mountain-based stations (10 per scan) and 852 pulmonary lymph nodes were defined overall from this database. Based on the numerical results, over 90% of the automatically defined stations were deemed accurate. This paper also presents a detailed summary of central-chest lymph-node characteristics for the first time.

  10. Validation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in strawberry fruits using different cultivars and osmotic stresses.

    PubMed

    Galli, Vanessa; Borowski, Joyce Moura; Perin, Ellen Cristina; Messias, Rafael da Silva; Labonde, Julia; Pereira, Ivan dos Santos; Silva, Sérgio Delmar Dos Anjos; Rombaldi, Cesar Valmor

    2015-01-10

    The increasing demand for strawberry (Fragaria×ananassa Duch) fruits is associated mainly with their sensorial characteristics and their content of antioxidant compounds. Nevertheless, strawberry production has been hampered by the crop's sensitivity to abiotic stresses. Understanding the molecular mechanisms underlying the stress response is therefore of great importance for enabling genetic engineering approaches aimed at improving strawberry tolerance. However, the study of gene expression in strawberry requires the use of suitable reference genes. In the present study, seven traditional and novel candidate reference genes were evaluated for transcript normalization in fruits of ten strawberry cultivars and under two abiotic stresses, using RefFinder, which integrates the four major currently available software programs: geNorm, NormFinder, BestKeeper, and the comparative delta-Ct method. The results indicate that expression stability is dependent on the experimental conditions. The candidate reference gene DBP (DNA binding protein) was considered the most suitable for normalizing expression data in samples of strawberry cultivars and under drought stress, and the candidate reference gene HISTH4 (histone H4) was the most stable under osmotic and salt stresses. The traditional genes GAPDH (glyceraldehyde-3-phosphate dehydrogenase) and 18S (18S ribosomal RNA) were the most unstable genes under all conditions. The expression of the phenylalanine ammonia lyase (PAL) and 9-cis-epoxycarotenoid dioxygenase (NCED1) genes was used to further confirm the validated candidate reference genes, showing that the use of an inappropriate reference gene may produce erroneous results. This study is the first survey of the stability of reference genes across strawberry cultivars and osmotic stresses and provides guidelines for obtaining more accurate RT-qPCR results in future breeding efforts. PMID:25445290
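
    One of the four integrated methods, the comparative delta-Ct approach, ranks a candidate reference gene as stable when its Ct differences against every other candidate vary little across samples. A sketch with invented Ct values (the gene names follow the abstract; the numbers and stability assignments do not come from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
genes = ["DBP", "HISTH4", "GAPDH", "18S"]
# Rows: 30 hypothetical samples; columns: raw Ct values per candidate gene
ct = np.column_stack([
    20 + rng.normal(0, 0.2, 30),   # DBP: invented as stable
    22 + rng.normal(0, 0.3, 30),   # HISTH4: invented as stable
    18 + rng.normal(0, 1.5, 30),   # GAPDH: invented as unstable
    12 + rng.normal(0, 1.8, 30),   # 18S: invented as unstable
])

def stability(ct):
    """Mean SD of pairwise delta-Ct against all other genes: lower = more stable."""
    n = ct.shape[1]
    return [np.mean([np.std(ct[:, i] - ct[:, j]) for j in range(n) if j != i])
            for i in range(n)]

for gene, score in sorted(zip(genes, stability(ct)), key=lambda t: t[1]):
    print(f"{gene}: {score:.2f}")
```
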

  11. Apparatus for use in rapid and accurate controlled-potential coulometric analysis

    DOEpatents

    Frazzini, Thomas L.; Holland, Michael K.; Pietri, Charles E.; Weiss, Jon R.

    1981-01-01

    An apparatus for controlled-potential coulometric analysis of a solution includes a cell to contain the solution to be analyzed and a plurality of electrodes to contact the solution in the cell. Means are provided to stir the solution and to control the atmosphere above it. A potentiostat connected to the electrodes controls potential differences among the electrodes. An electronic circuit connected to the potentiostat provides analog-to-digital conversion and displays a precise count of charge transfer during a desired chemical process. This count provides a measure of the amount of an unknown substance in the solution.
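
    The displayed charge count translates into an amount of analyte through Faraday's law, n = Q/(zF); the charge value and one-electron process below are illustrative, not from the patent:

```python
F = 96485.332  # C/mol, Faraday constant

def moles_from_charge(charge_c: float, electrons_per_ion: int) -> float:
    """Faraday's law for controlled-potential coulometry: n = Q / (z * F)."""
    return charge_c / (electrons_per_ion * F)

# e.g. 19.3 C of charge passed during a one-electron process
print(f"{moles_from_charge(19.3, 1) * 1e6:.1f} micromol")  # -> 200.0 micromol
```
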

  12. TOPLHA: an accurate and efficient numerical tool for analysis and design of LH antennas

    NASA Astrophysics Data System (ADS)

    Milanesio, D.; Meneghini, O.; Maggiora, R.; Guadamuz, S.; Hillairet, J.; Lancellotti, V.; Vecchi, G.

    2012-01-01

    This paper presents a self-consistent, integral-equation approach for the analysis of plasma-facing lower hybrid (LH) launchers; the geometry of the waveguide grill structure can be completely arbitrary, including the non-planar mouth of the grill. This work is based on the theoretical approach and code implementation of the TOPICA code, of which it shares the modular structure and constitutes the extension into the LH range. Code results are validated against the literature results and simulations from similar codes.

  13. Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach

    PubMed Central

    Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.

    2007-01-01

    Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408

  14. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles analyzed all contained Pb, Ba, and Sb. The precision of the method is assessed, and the grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here with a confidence >99%, irrespective of the firearm or the population of particles selected. PMID:23775063

  15. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  16. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative or Real time PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi. But there are actually about 100 typically found indoors. Some pose a threat to human...

  17. Features of the Quantitative Analysis in Moessbauer Spectroscopy

    SciTech Connect

    Semenov, V. G.; Panchuk, V. V.; Irkaev, S. M.

    2010-07-13

    The results describing the effect of different factors on the errors in the quantitative determination of the phase composition of studied substances by Moessbauer absorption spectroscopy are presented, and ways of taking these factors into account are suggested. The effectiveness of the suggested methods is verified by an example of analyzing standard and unknown compositions.

  18. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  19. Quantitative Analysis of Radionuclides in Process and Environmental Samples

    SciTech Connect

    Boni, A.L.

    2003-02-21

    An analytical method was developed for the radiochemical separation and quantitative recovery of ruthenium, zirconium, niobium, neptunium, cobalt, iron, zinc, strontium, rare earths, chromium and cesium from a wide variety of natural materials. This paper discusses this analytical method, based on the anion exchange properties of the various radionuclides, although both ion exchange and precipitation techniques are incorporated.

  20. Automated system for fast and accurate analysis of SF6 injected in the surface ocean.

    PubMed

    Koo, Chul-Min; Lee, Kitack; Kim, Miok; Kim, Dae-Ok

    2005-11-01

    This paper describes an automated sampling and analysis system for the shipboard measurement of dissolved sulfur hexafluoride (SF6) in surface marine environments into which SF6 has been deliberately released. This underway system includes a gas chromatograph equipped with an electron capture detector, a fast and highly efficient SF6-extraction device, a global positioning system, and a data acquisition system based on Visual Basic 6.0/C 6.0. This work is distinct from previous studies in that it quantifies the efficiency of the SF6-extraction device and its carryover effect and examines the effect of surfactant on the SF6-extraction efficiency. Measurements can be continuously performed on seawater samples taken from a seawater line installed onboard a research vessel. The system runs on an hourly cycle during which one set of four SF6 standards is measured and SF6 derived from the seawater stream is subsequently analyzed for the rest of each 1 h period. This state-of-the-art system was successfully used to trace a water mass carrying Cochlodinium polykrikoides, which causes harmful algal blooms (HAB) in the coastal waters of southern Korea. The successful application of this analysis system in tracing the HAB-infected water mass suggests that the SF6 detection method described in this paper will improve the quality of future studies of biogeochemical processes in the marine environment. PMID:16294883

  1. Combining multiple regression and principal component analysis for accurate predictions for column ozone in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Rajab, Jasim M.; MatJafri, M. Z.; Lim, H. S.

    2013-06-01

    This study encompasses columnar ozone modelling in Peninsular Malaysia. A data set of eight atmospheric parameters [air surface temperature (AST), carbon monoxide (CO), methane (CH4), water vapour (H2Ovapour), skin surface temperature (SSKT), atmosphere temperature (AT), relative humidity (RH), and mean surface pressure (MSP)], retrieved from NASA's Atmospheric Infrared Sounder (AIRS) for the entire period 2003-2008, was employed to develop models to predict the value of columnar ozone (O3) in the study area. A combined method, based on multiple regression together with principal component analysis (PCA), was used to improve the accuracy of the columnar ozone predictions. Separate analyses were carried out for the north east monsoon (NEM) and south west monsoon (SWM) seasons. O3 was negatively correlated with CH4, H2Ovapour, RH, and MSP, and positively correlated with CO, AST, SSKT, and AT, during both the NEM and SWM seasons. Multiple regression analysis was used to fit the columnar ozone data using the atmospheric parameters as predictors. A variable selection method based on high loadings of varimax-rotated principal components was used to acquire subsets of the predictor variables to be included in the linear regression model. It was found that an increase in columnar O3 is associated with an increase in AST, SSKT, AT, and CO and with a drop in CH4, H2Ovapour, RH, and MSP. Fitting the best models for columnar O3 using eight of the independent variables gave about the same values of R (≈0.93) and R2 (≈0.86) for both the NEM and SWM seasons. The common variables that appeared in both regression equations were SSKT, CH4, and RH, and the principal predictor of columnar O3 in both the NEM and SWM seasons was SSKT.
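
    The combined PCA/regression scheme can be sketched as follows; the varimax rotation step is omitted for brevity, and the data are synthetic stand-ins for the eight AIRS parameters, so only the workflow (standardize, extract components, select high-loading variables, regress) mirrors the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Hypothetical stand-ins for the eight AIRS predictors (AST, CO, CH4, ...):
# three latent "weather regimes" mixed into eight correlated variables
latent = rng.normal(size=(n, 3))
X = latent @ rng.normal(size=(3, 8)) + 0.2 * rng.normal(size=(n, 8))
ozone = 2.0 * latent[:, 0] - 1.5 * latent[:, 1] + rng.normal(scale=0.3, size=n)

# Variable selection via high loadings of the leading principal components
Xs = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
selected = np.unique(np.argmax(np.abs(Vt[:3]), axis=1))  # top loader per PC

# Multiple regression on the selected predictors only (with intercept)
A = np.column_stack([Xs[:, selected], np.ones(n)])
coef, *_ = np.linalg.lstsq(A, ozone, rcond=None)
resid = ozone - A @ coef
r2 = 1 - resid @ resid / ((ozone - ozone.mean()) @ (ozone - ozone.mean()))
print(f"selected predictor columns: {selected}, R^2 = {r2:.2f}")
```
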

  2. Spectrally-accurate algorithm for the analysis of flows in two-dimensional vibrating channels

    NASA Astrophysics Data System (ADS)

    Zandi, S.; Mohammadi, A.; Floryan, J. M.

    2015-11-01

    A spectral algorithm based on the immersed boundary conditions (IBC) concept has been developed for the analysis of flows in channels bounded by vibrating walls. The vibrations take the form of travelling waves of arbitrary profile. The algorithm uses a fixed computational domain with the flow domain immersed in its interior. Boundary conditions enter the algorithm in the form of constraints. The spatial discretization uses a Fourier expansion in the stream-wise direction and a Chebyshev expansion in the wall-normal direction. Use of the Galileo transformation converts the unsteady problem into a steady one. An efficient solver which takes advantage of the structure of the coefficient matrix has been used. It is demonstrated that the method can be extended to more extreme geometries using the overdetermined formulation. Various tests confirm the spectral accuracy of the algorithm.

  3. Simulating Expert Clinical Comprehension: Adapting Latent Semantic Analysis to Accurately Extract Clinical Concepts from Psychiatric Narrative

    PubMed Central

    Cohen, Trevor; Blatter, Brett; Patel, Vimla

    2008-01-01

    Cognitive studies reveal that less-than-expert clinicians are less able to recognize meaningful patterns of data in clinical narratives. Accordingly, psychiatric residents early in training fail to attend to information that is relevant to diagnosis and the assessment of dangerousness. This manuscript presents a cognitively motivated methodology for simulating the expert ability to organize relevant findings supporting intermediate diagnostic hypotheses. Latent Semantic Analysis is used to generate a semantic space from which meaningful associations between psychiatric terms are derived. Diagnostically meaningful clusters are modeled as geometric structures within this space and compared to elements of psychiatric narrative text using semantic distance measures. A learning algorithm is defined that alters components of these geometric structures in response to labeled training data. Extraction and classification of relevant text segments is evaluated against expert annotation, with system-rater agreement approximating rater-rater agreement. A range of biomedical informatics applications for these methods is suggested. PMID:18455483
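
    At its core, Latent Semantic Analysis derives term associations from a truncated SVD of a term-document matrix. A toy sketch with a hypothetical five-term vocabulary (the terms and counts are invented for illustration, not taken from the study's corpus):

```python
import numpy as np

# Toy term-document matrix: rows are terms, columns are note fragments
terms = ["suicidal", "ideation", "insomnia", "mania", "hopeless"]
docs = np.array([
    [2.0, 0.0, 1.0],   # suicidal
    [1.0, 0.0, 1.0],   # ideation
    [0.0, 2.0, 0.0],   # insomnia
    [0.0, 1.0, 0.0],   # mania
    [1.0, 0.0, 2.0],   # hopeless
])

# LSA: a truncated SVD projects terms into a low-rank semantic space
U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Co-occurring terms end up close together in the reduced space
print("suicidal~ideation:", round(cosine(term_vecs[0], term_vecs[1]), 3))
print("suicidal~insomnia:", round(cosine(term_vecs[0], term_vecs[2]), 3))
```
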

  4. Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization

    NASA Technical Reports Server (NTRS)

    Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)

    2008-01-01

    A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.

  5. Alternative Filament Loading Solution for Accurate Analysis of Boron Isotopes by Negative Thermal Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Dwyer, G. S.; Vengosh, A.

    2008-12-01

    The negative thermal ionization mass spectrometry technique has become the major tool for investigating boron isotopes in the environment. The high sensitivity of BO2- ionization enables measurements of ng levels of boron. However, B isotope measurement by this technique suffers from two fundamental problems (1) fractionation induced by selective ionization of B isotopes in the mass spectrometer; and (2) CNO- interference on mass 42 that is often present in some load solutions (such as B-free seawater processed through ion-exchange resin). Here we report a potentially improved methodology using an alternative filament loading solution with a recently-installed Thermo Scientific TRITON thermal ionization mass spectrometer. Our initial results suggest that this solution -- prepared by combining high-purity single- element standard solutions of Ca, Mg, Na, and K in proportions similar to those in seawater in a 5% HCl matrix -- may offer significant improvement over some other commonly used load solutions. Total loading blank is around 15pg as determined by isotope dilution (NIST952). Replicate analyses of NIST SRM951 and modern seawater thus far have yielded 11B/10B ratios of 4.0057 (±0.0008, n=14) and 4.1645 (±0.0017, n=7; δ11B=39.6 permil), respectively. Replicate analyses of samples and SRM951 yield an average standard deviation (1 σ) of approximately 0.001 (0.25 permil). Fractionation during analysis (60-90 minutes) has thus far typically been less than 0.002 (0.5 permil). The load solution delivers ionization efficiency similar to directly-loaded seawater and has negligible signal at mass 26 (CN-), a proxy for the common interfering molecular ion (CNO-) on mass 42. Standards and samples loaded with the solution behave fairly predictably during filament heating and analysis, thus allowing for the possibility of fully automated data collection.
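
    The reported ratios convert to the conventional delta notation via δ11B = (Rsample/Rstandard − 1) × 1000, referenced to NIST SRM951; checking the seawater value quoted in the abstract:

```python
R_STANDARD = 4.0057  # mean 11B/10B of NIST SRM951 reported in the abstract

def delta11B(r_sample: float, r_standard: float = R_STANDARD) -> float:
    """delta-11B in permil relative to SRM951."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Seawater 11B/10B from the abstract
print(f"{delta11B(4.1645):.1f} permil")  # -> 39.6 permil, matching the quoted value
```
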

  6. Wind effect on PV module temperature: Analysis of different techniques for an accurate estimation.

    NASA Astrophysics Data System (ADS)

    Schwingshackl, Clemens; Petitta, Marcello; Ernst Wagner, Jochen; Belluardo, Giorgio; Moser, David; Castelli, Mariapina; Zebisch, Marc; Tetzlaff, Anke

    2013-04-01

    This abstract presents a study of the influence of wind on modelling the PV module temperature. The study is carried out in the framework of the PV-Alps INTERREG project, in which the potential of different photovoltaic technologies is analysed for alpine regions. The PV module temperature depends on several parameters, such as ambient temperature, irradiance, wind speed, and PV technology [1]. In most models a very simple approach is used, in which the PV module temperature is calculated from the NOCT (nominal operating cell temperature), ambient temperature, and irradiance alone [2]. In this study the influence of wind speed on the PV module temperature was investigated. First, different approaches suggested by various authors were tested [1], [2], [3], [4], [5]. For our analysis, temperature, irradiance, and wind data from a PV test facility of the EURAC Institute of Renewable Energies at Bolzano airport (South Tyrol, Italy) were used. The PV module temperature was calculated with the different models and compared to the measured PV module temperature at the individual panels. The best results were achieved with the approach suggested by Skoplaki et al. [1]. Preliminary results indicate that for all PV technologies tested (monocrystalline, amorphous, microcrystalline, and polycrystalline silicon, and cadmium telluride), modelled and measured PV module temperatures agree more closely (RMSE of about 3-4 K) than with standard approaches in which wind is not considered. For further investigation, the in-situ measured wind velocities were replaced with wind data from numerical weather forecast models (ECMWF reanalysis fields). Our results show that the PV module temperature calculated with wind data from ECMWF is still in very good agreement with the measured one (R² > 0.9 for all technologies). Compared to the previous analysis, we find comparable mean values and an increased standard deviation. These results open a promising approach for PV module
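
    The simple NOCT-only approach mentioned above is commonly written as Tmod = Tamb + (NOCT − 20)/800 · G, with irradiance G in W/m² and NOCT rated at 800 W/m² and 20 °C ambient. A minimal sketch (the 45 °C NOCT is an assumed, typical value, not from the paper):

```python
def module_temp_noct(t_ambient_c: float, irradiance_w_m2: float,
                     noct_c: float = 45.0) -> float:
    """NOCT estimate of PV module temperature; ignores wind entirely."""
    return t_ambient_c + (noct_c - 20.0) / 800.0 * irradiance_w_m2

print(module_temp_noct(25.0, 800.0))  # -> 50.0 (C) at full rated irradiance
```

Wind-aware models such as the one by Skoplaki et al. cited in the abstract replace the constant (NOCT − 20)/800 factor with a term that decreases with wind speed, which is what brings the modelled temperatures closer to the measurements.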

  7. Flow injection combined with ICP-MS for accurate high throughput analysis of elemental impurities in pharmaceutical products according to USP <232>/<233>.

    PubMed

    Fischer, Lisa; Zipfel, Barbara; Koellensperger, Gunda; Kovac, Jessica; Bilz, Susanne; Kunkel, Andrea; Venzago, Cornel; Hann, Stephan

    2014-07-01

    New guidelines of the United States Pharmacopeia (USP), European Pharmacopeia (EP), and the International Conference on Harmonisation (ICH) regulating elemental impurity limits in pharmaceuticals seal the end of the unspecific analysis of metal(oid)s outlined in USP <231> and EP 2.4.8. Chapters USP <232> and EP 5.20, as well as drafts of ICH Q3D, specify both daily doses and concentration limits of metallic impurities in pharmaceutical final products and in active pharmaceutical ingredients (API) and excipients. Chapters USP <233> and EP 2.4.20 describe method implementation, validation, and quality control during the analytical process. In contrast to the methods applied so far, substance-specific quantitative analysis imposes new basic requirements; furthermore, the significantly lower detection limits necessitate a general changeover of the methodology toward sensitive multi-element analysis by ICP-AES or ICP-MS. A novel methodological approach based on flow injection analysis and ICP-SFMS/ICP-QMS for the quick and accurate analysis of Cd, Pb, As, Hg, Ir, Os, Pd, Pt, Rh, Ru, Cr, Mo, Ni, V, Cu, Mn, Fe, and Zn in drug products, after dilution, dissolution, or microwave-assisted closed-vessel digestion according to the regulations, is presented. In comparison to the acquisition of continuous signals, this method is advantageous with respect to its unprecedentedly high sample throughput, with a total analysis time of approximately 30 s and a sample consumption below 50 μL, while meeting the strict USP demands on detection/quantification limits, precision, and accuracy. PMID:24667566

  8. Integrative subcellular proteomic analysis allows accurate prediction of human disease-causing genes.

    PubMed

    Zhao, Li; Chen, Yiyun; Bajaj, Amol Onkar; Eblimit, Aiden; Xu, Mingchu; Soens, Zachry T; Wang, Feng; Ge, Zhongqi; Jung, Sung Yun; He, Feng; Li, Yumei; Wensel, Theodore G; Qin, Jun; Chen, Rui

    2016-05-01

    Proteomic profiling on subcellular fractions provides invaluable information regarding both protein abundance and subcellular localization. When integrated with other data sets, it can greatly enhance our ability to predict gene function genome-wide. In this study, we performed a comprehensive proteomic analysis on the light-sensing compartment of photoreceptors called the outer segment (OS). By comparing with the protein profile obtained from the retina tissue depleted of OS, an enrichment score for each protein is calculated to quantify protein subcellular localization, and 84% accuracy is achieved compared with experimental data. By integrating the protein OS enrichment score, the protein abundance, and the retina transcriptome, the probability of a gene playing an essential function in photoreceptor cells is derived with high specificity and sensitivity. As a result, a list of genes that will likely result in human retinal disease when mutated was identified and validated by previous literature and/or animal model studies. Therefore, this new methodology demonstrates the synergy of combining subcellular fractionation proteomics with other omics data sets and is generally applicable to other tissues and diseases. PMID:26912414

  9. Borehole flowmeter logging for the accurate design and analysis of tracer tests.

    PubMed

    Basiricò, Stefano; Crosta, Giovanni B; Frattini, Paolo; Villa, Alberto; Godio, Alberto

    2015-04-01

    Tracer tests often give ambiguous interpretations that may be due to the erroneous location of sampling points and/or the lack of flow rate measurements through the sampler. To obtain more reliable tracer test results, we propose a methodology that optimizes the design and analysis of tracer tests in a cross borehole mode by using vertical borehole flow rate measurements. Experiments using this approach, herein defined as the Bh-flow tracer test, have been performed by implementing three sequential steps: (1) single-hole flowmeter test, (2) cross-hole flowmeter test, and (3) tracer test. At the experimental site, core logging, pumping tests, and static water-level measurements were previously carried out to determine stratigraphy, fracture characteristics, and bulk hydraulic conductivity. Single-hole flowmeter testing makes it possible to detect the presence of vertical flows as well as inflow and outflow zones, whereas cross-hole flowmeter testing detects the presence of connections along sets of flow conduits or discontinuities intercepted by boreholes. Finally, the specific pathways and rates of groundwater flow through selected flowpaths are determined by tracer testing. We conclude that the combined use of single and cross-borehole flowmeter tests is fundamental to the formulation of the tracer test strategy and interpretation of the tracer test results. PMID:25417730

  10. SMRT Sequencing for Parallel Analysis of Multiple Targets and Accurate SNP Phasing.

    PubMed

    Guo, Xiaoge; Lehner, Kevin; O'Connell, Karen; Zhang, Jenny; Dave, Sandeep S; Jinks-Robertson, Sue

    2015-12-01

    Single-molecule real-time (SMRT) sequencing generates much longer reads than other widely used next-generation (next-gen) sequencing methods, but its application to whole genome/exome analysis has been limited. Here, we describe the use of SMRT sequencing coupled with barcoding to simultaneously analyze one or a small number of genomic targets derived from multiple sources. In the budding yeast system, SMRT sequencing was used to analyze strand-exchange intermediates generated during mitotic recombination and to analyze genetic changes in a forward mutation assay. The general barcoding-SMRT approach was then extended to diffuse large B-cell lymphoma primary tumors and cell lines, where detected changes agreed with prior Illumina exome sequencing. A distinct advantage afforded by SMRT sequencing over other next-gen methods is that it immediately provides the linkage relationships between SNPs in the target segment sequenced. The strength of our approach for mutation/recombination studies (as well as linkage identification) derives from its inherent computational simplicity coupled with a lack of reliance on sophisticated statistical analyses. PMID:26497143

  11. Accurate determination of the diffusion coefficient of proteins by Fourier analysis with whole column imaging detection.

    PubMed

    Zarabadi, Atefeh S; Pawliszyn, Janusz

    2015-02-17

    Analysis in the frequency domain is considered a powerful tool to elicit precise information from spectroscopic signals. In this study, the Fourier transformation technique is employed to determine the diffusion coefficient (D) of a number of proteins in the frequency domain. Analytical approaches are investigated for determination of D from both experimental and data-treatment viewpoints. The diffusion process is modeled to calculate diffusion coefficients based on the Fourier transformation solution to Fick's law equation, and its results are compared to time-domain results. The simulations characterize optimum spatial and temporal conditions and demonstrate the noise tolerance of the method. The proposed model is validated by applying it to electropherograms from the diffusion path of a set of proteins. Real-time dynamic scanning is conducted to monitor dispersion by employing whole column imaging detection technology in combination with capillary isoelectric focusing (CIEF) and the imaging plug flow (iPF) experiment. These experimental techniques provide different peak shapes, which are used to demonstrate the ability of the Fourier transformation to extract diffusion coefficients from irregularly shaped signals. Experimental results confirmed that the Fourier transformation procedure substantially enhanced the accuracy of the determined values compared to those obtained in the time domain. PMID:25607375
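    The frequency-domain route to D rests on the Fourier solution of Fick's second law: each spatial-frequency component of the concentration profile decays as exp(-D k² t), so the decay of a single frequency bin between two scans yields D regardless of peak shape. A minimal sketch on a simulated Gaussian diffusion profile (grid size, times, and the D value are illustrative, not from the paper):

    ```python
    import cmath
    import math

    def gaussian_profile(xs, D, t):
        # Unit-mass point-source solution of Fick's second law at time t.
        return [math.exp(-x * x / (4 * D * t)) / math.sqrt(4 * math.pi * D * t)
                for x in xs]

    def dft_mag(signal, m):
        # Magnitude of the m-th discrete Fourier coefficient (naive DFT).
        n = len(signal)
        return abs(sum(s * cmath.exp(-2j * math.pi * m * j / n)
                       for j, s in enumerate(signal)))

    # Simulated concentration profiles at two scan times for a known D
    D_true, t1, t2 = 2.0e-6, 100.0, 400.0   # cm^2/s and seconds (illustrative)
    L, N = 1.0, 512                          # column length (cm), grid points
    xs = [(j - N / 2) * L / N for j in range(N)]
    c1 = gaussian_profile(xs, D_true, t1)
    c2 = gaussian_profile(xs, D_true, t2)

    # One frequency bin suffices: D = ln(|C1|/|C2|) / (k^2 * (t2 - t1))
    m = 8
    k = 2 * math.pi * m / L
    D_est = math.log(dft_mag(c1, m) / dft_mag(c2, m)) / (k * k * (t2 - t1))
    ```

    Because only the magnitude ratio of one Fourier coefficient is used, the same estimator applies to irregular peak shapes, which is the property the study exploits.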

  12. Parents were accurate proxy reporters of urgent pediatric asthma health services—a retrospective agreement analysis

    PubMed Central

    Ungar, Wendy J.; Davidson-Grimwood, Sara R.; Cousins, Martha

    2016-01-01

    Objective To assess agreement between parents’ proxy reports of children’s respiratory-related health service use and administrative data. Study Design and Setting A retrospective analysis of statistical agreement between clinical and claims data for reports of physician visits, emergency department (ED) visits, and hospitalizations in 545 asthmatic children recruited from sites in the greater Toronto area was conducted. Health services use data were extracted from the Ontario Health Insurance Plan and Canadian Institute for Health Information databases for each child for the interval coinciding with the proxy report for each health service type. Results Agreement between administrative data and respondent reports (n =545) was substantial for hospitalizations in the past year (κ =0.80 [0.74, 0.86]), moderate for ED visits in the past year (κ =0.60 [0.53, 0.67]), and slight for physician visits (κ =0.13 [0.00, 0.27]) in the past 6 months. Income, parent’s education, and child quality-of-life symptom scores did not affect agreement. Agreement for ED visits was significantly higher (P <0.05) for children who had an asthma attack in the past 6 months (κ =0.61 [0.54, 0.68]) compared to children who did not (κ =0.25 [0.00, 0.59]). Conclusion Parents of asthmatic children are reliable reporters of their child’s respiratory-related urgent health services utilization. PMID:17938060
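    The kappa values quoted above are chance-corrected agreement rates; for a 2x2 yes/no table (parent report vs. administrative record) the computation is short. A sketch with hypothetical counts, not the study's data:

    ```python
    def cohens_kappa(tp, fp, fn, tn):
        # Chance-corrected agreement for a 2x2 table: tp/tn are cells where
        # parent report and administrative record agree (yes/yes, no/no).
        n = tp + fp + fn + tn
        po = (tp + tn) / n                                   # observed agreement
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)  # chance
        return (po - pe) / (1 - pe)

    # Hypothetical counts for 545 children, illustrating substantial agreement
    kappa = cohens_kappa(tp=75, fp=20, fn=15, tn=435)
    ```

    Perfect agreement gives kappa = 1 and chance-level agreement gives 0, which is why the hospitalization result (0.80) reads as substantial while physician visits (0.13) read as slight.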

  13. [An Accurate Diagnosis is Possible with a Systematic Analysis of Routine Laboratory Data].

    PubMed

    Yonekawa, Osamu

    2015-09-01

    Routine laboratory tests are ordered for almost all in- and outpatients. A systematic analysis of routine laboratory data can give doctors valuable clinical information about patients. In some cases, a correct diagnosis can be made using laboratory data alone. In our laboratory, we use five processes to evaluate routine laboratory data. Firstly, we estimate the patient's general condition based on A/G, Hb, TP, Alb, ChE, and platelet (PLT) levels. Secondly, we look for inflammation and malignancy based on WBC, CRP, PLT, fibrinogen, and ESR levels and the protein electrophoresis pattern. Thirdly, we examine the major organs, especially the liver and kidney. We check the liver for hepatocyte damage, obstruction, hepatic synthetic function, infection, and malignancy. We estimate GFR and check the kidney for any localized damage. We then check the chemistry, hematology, and immunology. Finally, we form a conclusion after a comprehensive interpretation of the above four processes. With this systematic approach, any member of the laboratory unit can easily estimate the exact pathological status of the patient. In this case study, a marked change in TP indicated non-selective loss from the skin; namely, a burn. Tissue injury and infections at different foci were the most likely causes of severe inflammation. Neither the liver nor the kidney was severely damaged. Continual bleeding and hemolysis through the clinical course probably caused anemia. Hypoxemic respiratory failure and metabolic alkalosis were confirmed by blood gases. Multiple organ failure was suggested. PMID:26731896

  14. Accurate analysis of taurine, anserine, carnosine and free amino acids in a cattle muscle biopsy sample.

    PubMed

    Imanari, Mai; Higuchi, Mikito; Shiba, Nobuya; Watanabe, Akira

    2010-06-01

    We have established an analysis method for some free amino acids (FAAs), as well as taurine (Tau), anserine (Ans) and carnosine (Car), in a fresh biopsy sample from cattle muscle. A series of model biopsy samples, corresponding to mixtures of lean meat, fat and connective tissue, was prepared and showed high correlation coefficients between the compound concentration and the 3-methylhistidine (3-MeHis) content derived from hydrolysis of the biopsy sample (r = 0.74-0.95, P < 0.01). Interference from blood contamination could not be neglected, because the concentration of some FAAs in blood was comparable to that in muscle. However, it was possible to control the contamination of Tau, Ans, Car, glutamic acid, glutamine, aspartic acid and alanine to less than 5.0% when the blood contamination was controlled to less than 23%. These results suggest the necessity of measuring 3-MeHis as an index of lean meat and hemoglobin as an index of blood contamination when compounds in muscle biopsy samples are evaluated. We carried out a series of these analyses using one biopsy sample and revealed differences in Tau, Ans, Car and some FAAs in beef muscle after different feeding regimes. PMID:20597895

  15. SMRT Sequencing for Parallel Analysis of Multiple Targets and Accurate SNP Phasing

    PubMed Central

    Guo, Xiaoge; Lehner, Kevin; O’Connell, Karen; Zhang, Jenny; Dave, Sandeep S.; Jinks-Robertson, Sue

    2015-01-01

    Single-molecule real-time (SMRT) sequencing generates much longer reads than other widely used next-generation (next-gen) sequencing methods, but its application to whole genome/exome analysis has been limited. Here, we describe the use of SMRT sequencing coupled with barcoding to simultaneously analyze one or a small number of genomic targets derived from multiple sources. In the budding yeast system, SMRT sequencing was used to analyze strand-exchange intermediates generated during mitotic recombination and to analyze genetic changes in a forward mutation assay. The general barcoding-SMRT approach was then extended to diffuse large B-cell lymphoma primary tumors and cell lines, where detected changes agreed with prior Illumina exome sequencing. A distinct advantage afforded by SMRT sequencing over other next-gen methods is that it immediately provides the linkage relationships between SNPs in the target segment sequenced. The strength of our approach for mutation/recombination studies (as well as linkage identification) derives from its inherent computational simplicity coupled with a lack of reliance on sophisticated statistical analyses. PMID:26497143

  16. iTRAQ-Based Quantitative Proteomic Analysis Identified HSC71 as a Novel Serum Biomarker for Renal Cell Carcinoma

    PubMed Central

    Zhang, Yushi; Cai, Yi; Yu, Hongyan; Li, Hanzhong

    2015-01-01

    Renal cell carcinoma (RCC) is one of the most lethal urologic cancers and about 80% of RCC are of the clear-cell type (ccRCC). However, there are no serum biomarkers for the accurate diagnosis of RCC. In this study, we performed a quantitative proteomic analysis on serum samples from ccRCC patients and a control group by using isobaric tag for relative and absolute quantitation (iTRAQ) labeling and LC-MS/MS analysis to assess differentially expressed proteins. Overall, 16 proteins were significantly upregulated (ratio > 1.5) and 14 proteins were significantly downregulated (ratio < 0.67) in early-stage ccRCC compared to the control group. HSC71 was selected and subsequently validated by Western blot in six independent sets of patients. ELISA subsequently confirmed HSC71 as a potential serum biomarker for distinguishing RCC from benign urologic disease with a receiver operating characteristic (ROC) curve area under the curve (AUC) of 0.86 (95% confidence interval (CI), 0.76~0.96), achieving sensitivity of 87% (95% CI 69%~96%) at a specificity of 80% (95% CI 61%~92%) with a threshold of 15 ng/mL. iTRAQ-based quantitative proteomic analysis led to identification of serum HSC71 as a novel serum biomarker of RCC, particularly useful in early diagnosis of ccRCC. PMID:26425554
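    The reported AUC, sensitivity, and specificity can all be computed directly from raw marker values; AUC in particular has a rank-based (Mann-Whitney) form that needs no curve fitting. A sketch using hypothetical serum HSC71 concentrations, not the study's data:

    ```python
    def roc_auc(cases, controls):
        # AUC as P(case marker > control marker), ties counted as half
        # (equivalent to the normalized Mann-Whitney U statistic).
        wins = sum(1.0 if c > d else 0.5 if c == d else 0.0
                   for c in cases for d in controls)
        return wins / (len(cases) * len(controls))

    def sens_spec(cases, controls, threshold):
        # Sensitivity and specificity at a fixed decision threshold.
        sens = sum(c >= threshold for c in cases) / len(cases)
        spec = sum(d < threshold for d in controls) / len(controls)
        return sens, spec

    # Hypothetical HSC71 values (ng/mL) for illustration only
    rcc = [18, 22, 25, 16, 30, 14, 21, 19]
    benign = [9, 12, 14, 8, 16, 11, 10, 13]
    auc = roc_auc(rcc, benign)
    sens, spec = sens_spec(rcc, benign, threshold=15)
    ```

    Sweeping the threshold trades sensitivity against specificity; the study's choice of 15 ng/mL is one operating point on that curve.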

  17. Whole-Genome Sequencing Analysis Accurately Predicts Antimicrobial Resistance Phenotypes in Campylobacter spp.

    PubMed

    Zhao, S; Tyson, G H; Chen, Y; Li, C; Mukherjee, S; Young, S; Lam, C; Folster, J P; Whichard, J M; McDermott, P F

    2016-01-01

    The objectives of this study were to identify antimicrobial resistance genotypes for Campylobacter and to evaluate the correlation between resistance phenotypes and genotypes using in vitro antimicrobial susceptibility testing and whole-genome sequencing (WGS). A total of 114 Campylobacter species isolates (82 C. coli and 32 C. jejuni) obtained from 2000 to 2013 from humans, retail meats, and cecal samples from food production animals in the United States as part of the National Antimicrobial Resistance Monitoring System were selected for study. Resistance phenotypes were determined using broth microdilution of nine antimicrobials. Genomic DNA was sequenced using the Illumina MiSeq platform, and resistance genotypes were identified using assembled WGS sequences through blastx analysis. Eighteen resistance genes, including tet(O), blaOXA-61, catA, lnu(C), aph(2″)-Ib, aph(2″)-Ic, aph(2')-If, aph(2″)-Ig, aph(2″)-Ih, aac(6')-Ie-aph(2″)-Ia, aac(6')-Ie-aph(2″)-If, aac(6')-Im, aadE, sat4, ant(6'), aad9, aph(3')-Ic, and aph(3')-IIIa, and mutations in two housekeeping genes (gyrA and 23S rRNA) were identified. There was a high degree of correlation between phenotypic resistance to a given drug and the presence of one or more corresponding resistance genes. Phenotypic and genotypic correlation was 100% for tetracycline, ciprofloxacin/nalidixic acid, and erythromycin, and correlations ranged from 95.4% to 98.7% for gentamicin, azithromycin, clindamycin, and telithromycin. All isolates were susceptible to florfenicol, and no genes associated with florfenicol resistance were detected. There was a strong correlation (99.2%) between resistance genotypes and phenotypes, suggesting that WGS is a reliable indicator of resistance to the nine antimicrobial agents assayed in this study. WGS has the potential to be a powerful tool for antimicrobial resistance surveillance programs. PMID:26519386

  18. Quantitative analysis of HSV gene expression during lytic infection

    PubMed Central

    Turner, Anne-Marie W.; Arbuckle, Jesse H.; Kristie, Thomas M.

    2014-01-01

    Herpes Simplex Virus (HSV) is a human pathogen that establishes latency and undergoes periodic reactivation, resulting in chronic recurrent lytic infection. HSV lytic infection is characterized by an organized cascade of three gene classes; however, successful transcription and expression of the first, the immediate early class, is critical to the overall success of viral infection. This initial event of lytic infection is also highly dependent on host cell factors. This unit uses RNA interference and small molecule inhibitors to examine the role of host and viral proteins in HSV lytic infection. Methods detailing isolation of viral and host RNA and genomic DNA, followed by quantitative real-time PCR, allow characterization of impacts on viral transcription and replication, respectively. Western blot can be used to confirm quantitative PCR results. This combination of protocols represents a starting point for researchers interested in virus-host interactions during HSV lytic infection. PMID:25367270

  19. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    PubMed

    Schaneberg, B T; Khan, I A

    2004-11-01

    A qualitative and quantitative HPLC method for seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. After 45 min, the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 microm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578
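    The calibration-curve correlation coefficients and recovery rates reported above come from an ordinary least-squares fit of detector response against concentration. A sketch with hypothetical caffeine calibration points (values are illustrative, not the paper's data):

    ```python
    import math

    def fit_calibration(conc, response):
        # Ordinary least-squares line through (concentration, peak area)
        # points; returns slope, intercept, and correlation coefficient r.
        n = len(conc)
        mx, my = sum(conc) / n, sum(response) / n
        sxx = sum((x - mx) ** 2 for x in conc)
        sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
        syy = sum((y - my) ** 2 for y in response)
        slope = sxy / sxx
        return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

    def recovery_pct(measured, spiked):
        # Percent recovery of a known spiked amount.
        return 100.0 * measured / spiked

    # Hypothetical caffeine calibration (µg/mL vs. peak area)
    conc = [1, 5, 10, 50, 100]
    area = [10.2, 49.8, 101.0, 498.5, 1000.3]
    slope, intercept, r = fit_calibration(conc, area)
    ```

    An r near 1 over the working range, together with recoveries near 100%, is what supports a claim of method linearity and accuracy.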

  20. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, many modifications of the fluorescent microscopy method have been established depending on research targets. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community across different soil horizons under aerobic and anaerobic conditions with added nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method for inhibiting DNA division of gram-negative bacteria was modified so that this bacterial group could be quantified by fluorescent microscopy. The established approach allowed the detection of 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in soil polymer destruction is traditionally considered dominant in comparison to the gram-negative bacterial group. However, quantification of gram-negative bacteria in chernozem and peatland suggests that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative bacterial population density in chernozem, but this nutrient did produce fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions. This confirms the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for soil gram-negative bacteria quantification was compared to fluorescent in situ

  1. Comprehensive objective maps of macromolecular conformations by quantitative SAXS analysis

    PubMed Central

    Hura, Greg L.; Budworth, Helen; Dyer, Kevin N.; Rambo, Robert P.; Hammel, Michal

    2013-01-01

    Comprehensive perspectives of macromolecular conformations are required to connect structure to biology. Here we present a small angle X-ray scattering (SAXS) Structural Similarity Map (SSM) and Volatility of Ratio (VR) metric providing comprehensive, quantitative and objective (superposition-independent) perspectives on solution state conformations. We validate VR and SSM utility on human MutSβ, a key ABC ATPase and chemotherapeutic target, by revealing MutSβ DNA sculpting and identifying multiple conformational states for biological activity. PMID:23624664

  2. Quantitative compositional analysis of sedimentary materials using thermal emission spectroscopy: 1. Application to sedimentary rocks

    NASA Astrophysics Data System (ADS)

    Thorpe, Michael T.; Rogers, A. Deanne; Bristow, Thomas F.; Pan, Cong

    2015-11-01

    Thermal emission spectroscopy is used to determine the mineralogy of sandstone and mudstone rocks as part of an investigation of linear spectral mixing between sedimentary constituent phases. With widespread occurrences of sedimentary rocks on the surface of Mars, critical examination of the accuracy associated with quantitative models of mineral abundances derived from thermal emission spectra of sedimentary materials is necessary. Although thermal emission spectroscopy has previously been proven to be a viable technique to obtain quantitative mineralogy from igneous and metamorphic materials, sedimentary rocks, with natural variation of composition, compaction, and grain size, have yet to be examined. In this work, we present an analysis of the thermal emission spectral (~270-1650 cm-1) characteristics of a suite of 13 sandstones and 14 mudstones. X-ray diffraction and traditional point counting procedures were both evaluated in comparison with thermal emission spectroscopy. Results from this work are consistent with previous thermal emission spectroscopy studies and indicate that bulk rock mineral abundances can be estimated within 11.2% for detrital grains (i.e., quartz and feldspars) and 14.8% for all other mineral phases present in both sandstones and mudstones, in comparison to common in situ techniques used for determining bulk rock composition. Identification of clay- to fine-silt-sized phases is less accurate, with differences from the known ranging from ~5 to 24% on average. Nevertheless, linear least squares modeling of thermal emission spectra is an advantageous technique for determining abundances of detrital grains and sedimentary matrix and for providing a rapid classification of clastic rocks.
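    The linear least squares modeling mentioned above treats a measured spectrum as a weighted sum of endmember spectra and solves for the weights (abundances). A toy two-endmember sketch via the normal equations; the spectra values are illustrative, not laboratory data:

    ```python
    def unmix_two(mixed, e1, e2):
        # Linear least squares for the two-endmember mixing model
        # mixed ≈ a1*e1 + a2*e2, solved via the 2x2 normal equations.
        a11 = sum(v * v for v in e1)
        a22 = sum(v * v for v in e2)
        a12 = sum(u * v for u, v in zip(e1, e2))
        b1 = sum(u * v for u, v in zip(e1, mixed))
        b2 = sum(u * v for u, v in zip(e2, mixed))
        det = a11 * a22 - a12 * a12
        return (b1 * a22 - b2 * a12) / det, (b2 * a11 - b1 * a12) / det

    # Toy emissivity "spectra" for two endmembers (e.g. quartz vs. clay)
    quartz = [0.95, 0.60, 0.80, 0.90]
    clay = [0.85, 0.90, 0.70, 0.75]
    mixed = [0.7 * q + 0.3 * c for q, c in zip(quartz, clay)]
    a_quartz, a_clay = unmix_two(mixed, quartz, clay)
    ```

    Real spectral libraries use many endmembers and hundreds of wavenumber channels, so production codes solve the same problem with a constrained least-squares solver rather than a hand-written 2x2 system.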

  3. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  4. Predicting outbreak detection in public health surveillance: quantitative analysis to enable evidence-based method selection.

    PubMed

    Buckeridge, David L; Okhmatovskaia, Anna; Tu, Samson; O'Connor, Martin; Nyulas, Csongor; Musen, Mark A

    2008-01-01

    Public health surveillance is critical for accurate and timely outbreak detection and effective epidemic control. A wide range of statistical algorithms is used for surveillance, and important differences have been noted in the ability of these algorithms to detect outbreaks. The evidence about the relative performance of these algorithms, however, remains limited and mainly qualitative. Using simulated outbreak data, we developed and validated quantitative models for predicting the ability of commonly used surveillance algorithms to detect different types of outbreaks. The developed models accurately predict the ability of different algorithms to detect different types of outbreaks. These models enable evidence-based algorithm selection and can guide research into algorithm development. PMID:18999264
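    Surveillance detectors of the kind evaluated here flag days when observed counts exceed a baseline expectation. As one generic example (not one of the specific algorithms studied in the paper), an EWMA detector over simulated daily counts:

    ```python
    def ewma_alarms(counts, lam=0.4, z_threshold=2.0, baseline=7):
        # Exponentially weighted moving average detector: alarm when the
        # EWMA of daily counts exceeds the baseline mean by z_threshold
        # baseline standard deviations. A generic surveillance detector
        # shown for illustration only.
        mean = sum(counts[:baseline]) / baseline
        var = sum((c - mean) ** 2 for c in counts[:baseline]) / (baseline - 1)
        sd = var ** 0.5 or 1.0
        alarms, s = [], mean
        for day, c in enumerate(counts[baseline:], start=baseline):
            s = lam * c + (1 - lam) * s
            if (s - mean) / sd > z_threshold:
                alarms.append(day)
        return alarms

    # Simulated daily ED visit counts: a flat baseline, then an outbreak
    daily = [10, 9, 11, 10, 10, 9, 11, 10, 10, 30, 35, 40]
    alarms = ewma_alarms(daily)
    ```

    Running many such simulated outbreaks through a detector and recording detection delay and false-alarm rate is exactly the kind of evaluation the predictive models in the paper are built on.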

  5. Predicting Outbreak Detection in Public Health Surveillance: Quantitative Analysis to Enable Evidence-Based Method Selection

    PubMed Central

    Buckeridge, David L.; Okhmatovskaia, Anna; Tu, Samson; O’Connor, Martin; Nyulas, Csongor; Musen, Mark A.

    2008-01-01

    Public health surveillance is critical for accurate and timely outbreak detection and effective epidemic control. A wide range of statistical algorithms is used for surveillance, and important differences have been noted in the ability of these algorithms to detect outbreaks. The evidence about the relative performance of these algorithms, however, remains limited and mainly qualitative. Using simulated outbreak data, we developed and validated quantitative models for predicting the ability of commonly used surveillance algorithms to detect different types of outbreaks. The developed models accurately predict the ability of different algorithms to detect different types of outbreaks. These models enable evidence-based algorithm selection and can guide research into algorithm development. PMID:18999264

  6. Quantitative analysis of single amino acid variant peptides associated with pancreatic cancer in serum by an isobaric labeling quantitative method.

    PubMed

    Nie, Song; Yin, Haidi; Tan, Zhijing; Anderson, Michelle A; Ruffin, Mack T; Simeone, Diane M; Lubman, David M

    2014-12-01

    Single amino acid variations are highly associated with many human diseases. The direct detection of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) in serum can provide unique opportunities for SAAV associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples of pancreatic cancer patients and other benign controls. The largest number of SAAV peptides to date in serum including 96 unique variant peptides were quantified in this quantitative analysis, of which five variant peptides showed a statistically significant difference between pancreatic cancer and other controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, which was further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), Thrombospondin-1 (THBS1) and this variant peptide showed an excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  7. Self-awareness in Mild Cognitive Impairment: Quantitative evidence from systematic review and meta-analysis.

    PubMed

    Piras, Federica; Piras, Fabrizio; Orfei, Maria Donata; Caltagirone, Carlo; Spalletta, Gianfranco

    2016-02-01

    Here we quantitatively summarized evidence of impaired awareness in Mild Cognitive Impairment (MCI) and meta-analytically explored the relationship between Subjective Cognitive Complaints (SCC) and actual cognitive impairment. Twenty-three studies were included, 14 comparing awareness measures in MCI and healthy elderly subjects, and 16 also exploring the neuropsychological underpinnings of impaired awareness. Moderator analyses were conducted to determine whether self-awareness varied according to patient group, the particular state in relation to which insight was assessed, or the approach to measuring awareness. The meta-analysis shows that MCI patients have knowledge of their neuropsychological deficits and that level of awareness varies according to cognitive status, language and memory abilities. The assessment technique employed affected the measured level of insight. Specifically, MCI patients seem particularly accurate in evaluating the current state of their performance during an ongoing task and this could be essential in regulating their behavior so that compensative strategies are practiced and greater cognitive independence is achieved. Thus, assessment technique and cognitive status are crucial factors that influence level of awareness and should be taken into consideration in awareness evaluation and rehabilitation. PMID:26639655

  8. Absolute quantitative analysis for sorbic acid in processed foods using proton nuclear magnetic resonance spectroscopy.

    PubMed

    Ohtsuki, Takashi; Sato, Kyoko; Sugimoto, Naoki; Akiyama, Hiroshi; Kawamura, Yoko

    2012-07-13

    An analytical method using solvent extraction and quantitative proton nuclear magnetic resonance (qHNMR) spectroscopy was applied and validated for the absolute quantification of sorbic acid (SA) in processed foods. The proposed method showed good linearity. The recoveries for samples spiked at the maximum usage level specified for food in Japan and at 0.13 g kg(-1) (beverage: 0.013 g kg(-1)) were greater than 80%, whereas those for samples spiked at 0.063 g kg(-1) (beverage: 0.0063 g kg(-1)) were between 56.9 and 83.5%. The limit of quantification was 0.063 g kg(-1) for foods (and 0.0063 g kg(-1) for beverages containing Lactobacillus species). Analysis of the SA content of commercial processed foods revealed quantities equal to or greater than those measured using conventional steam-distillation extraction and high-performance liquid chromatography quantification. The proposed method was rapid, simple, accurate, and precise, and provided International System of Units traceability without the need for authentic analyte standards. It could therefore be used as an alternative to the conventional method for quantification of SA in processed foods. PMID:22704472
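    qHNMR can quantify without an authentic analyte standard because signal integrals are proportional to proton-weighted molar amounts. A sketch of the generic relation, using maleic acid as a hypothetical internal standard (the abstract does not name the standard actually used):

    ```python
    def qhnmr_mg(i_analyte, n_analyte, m_analyte,
                 i_std, n_std, m_std, mg_std, purity_std=1.0):
        # Mass of analyte from qHNMR integrals against an internal standard:
        # mole ratio = (I_a / N_a) / (I_s / N_s), where I is the integral
        # and N the number of protons producing it; then convert to mass.
        # This is the generic qHNMR relation, not the paper's exact protocol.
        mole_ratio = (i_analyte / n_analyte) / (i_std / n_std)
        moles_std = mg_std * purity_std / m_std          # mmol if mg and g/mol
        return mole_ratio * moles_std * m_analyte        # mg of analyte

    # Sorbic acid (C6H8O2, M = 112.13) CH3 doublet (3H) against a
    # hypothetical maleic acid standard (2H singlet, M = 116.07)
    sorbic_mg = qhnmr_mg(i_analyte=3.05, n_analyte=3, m_analyte=112.13,
                         i_std=2.0, n_std=2, m_std=116.07, mg_std=11.607)
    ```

    Traceability to SI units comes through the weighed mass and certified purity of the internal standard, which is why no sorbic acid reference material is needed.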

  9. A field instrument for quantitative determination of beryllium by activation analysis

    USGS Publications Warehouse

    Vaughn, William W.; Wilson, E.E.; Ohm, J.M.

    1960-01-01

    A low-cost instrument has been developed for quantitative determinations of beryllium in the field by activation analysis. The instrument makes use of the gamma-neutron reaction between gammas emitted by an artificially radioactive source (Sb124) and beryllium as it occurs in nature. The instrument and power source are mounted in a panel-type vehicle. Samples are prepared by hand-crushing the rock to approximately ?-inch mesh size and smaller. Sample volumes are kept constant by means of a standard measuring cup. Instrument calibration, made by using standards of known BeO content, indicates the analyses are reproducible and accurate to within ±0.25 percent BeO in the range from 1 to 20 percent BeO with a sample counting time of 5 minutes. Sensitivity of the instrument may be increased somewhat by increasing the source size, the sample size, or by enlarging the cross-sectional area of the neutron-sensitive phosphor normal to the neutron flux.

  10. Rapid quantitative analysis of lipids using a colorimetric method in a microplate format.

    PubMed

    Cheng, Yu-Shen; Zheng, Yi; VanderGheynst, Jean S

    2011-01-01

    A colorimetric sulfo-phospho-vanillin (SPV) method was developed for high throughput analysis of total lipids. The developed method uses a reaction mixture that is maintained in a 96-well microplate throughout the entire assay. The new assay provides the following advantages over other methods of lipid measurement: (1) background absorbance can be easily corrected for each well, (2) there is less risk of handling and transferring sulfuric acid contained in reaction mixtures, (3) color develops more consistently providing more accurate measurement of absorbance, and (4) the assay can be used for quantitative measurement of lipids extracted from a wide variety of sources. Unlike other spectrophotometric approaches that use fluorescent dyes, the optimal spectra and reaction conditions for the developed assay do not vary with the sample source. The developed method was used to measure lipids in extracts from four strains of microalgae. No significant difference was found in lipid determination when lipid content was measured using the new method and compared to results obtained using a macro-gravimetric method. PMID:21069472
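    Advantage (1) above, per-well background correction, combines with a standard curve to turn absorbance into a lipid amount. A sketch of that microplate workflow with hypothetical canola-oil standards (a simplified illustration, not the published protocol):

    ```python
    def spv_lipid_ug(sample_abs, sample_background, standards):
        # Background-corrected SPV quantification: subtract the well's own
        # pre-development absorbance, then invert a linear standard curve
        # fit to (lipid µg, corrected absorbance) pairs.
        n = len(standards)
        mx = sum(c for c, _ in standards) / n
        my = sum(a for _, a in standards) / n
        sxx = sum((c - mx) ** 2 for c, _ in standards)
        sxy = sum((c - mx) * (a - my) for c, a in standards)
        slope = sxy / sxx
        intercept = my - slope * mx
        corrected = sample_abs - sample_background
        return (corrected - intercept) / slope

    # Hypothetical standards: (µg lipid, corrected absorbance)
    curve = [(0, 0.02), (25, 0.27), (50, 0.52), (100, 1.02)]
    lipid = spv_lipid_ug(sample_abs=0.60, sample_background=0.08,
                         standards=curve)
    ```

    Reading each well's background before color development is what lets a single plate carry both blanks and samples, one of the throughput advantages the abstract lists.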

  11. A quantitative analysis of hydraulic interaction processes in stream-aquifer systems

    DOE PAGES

    Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; Li, Junting; Duan, Lei; Wang, Zhoufeng; Zhu, Lin

    2016-01-28

    The hydraulic relationship between the stream and aquifer can be altered from hydraulic connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study proposes to quantitatively analyze the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both the theoretical analysis and laboratory tests have demonstrated that the hydraulic connectedness of the stream-aquifer system can reach a critical disconnection state when the horizontal hydraulic gradient at the free water surface is equal to zero and the vertical is equal to 1. A boundary-value problem for movement of the critical point of disconnection is established for an analytical solution of the inverted water table movement beneath the stream. The result indicates that the maximum distance or thickness of the inverted water table is equal to the water depth in the stream, and at a steady state of disconnection, the maximum hydraulic gradient at the streambed center is 2. In conclusion, this study helps us to understand the hydraulic phenomena of water flow near streams and accurately assess surface water and groundwater resources.

  12. A quantitative analysis of hydraulic interaction processes in stream-aquifer systems

    PubMed Central

    Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; Li, Junting; Duan, Lei; Wang, Zhoufeng; Zhu, Lin

    2016-01-01

    The hydraulic relationship between the stream and aquifer can be altered from hydraulic connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study proposes to quantitatively analyze the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both the theoretical analysis and laboratory tests have demonstrated that the hydraulic connectedness of the stream-aquifer system can reach a critical disconnection state when the horizontal hydraulic gradient at the free water surface is equal to zero and the vertical is equal to 1. A boundary-value problem for movement of the critical point of disconnection is established for an analytical solution of the inverted water table movement beneath the stream. The result indicates that the maximum distance or thickness of the inverted water table is equal to the water depth in the stream, and at a steady state of disconnection, the maximum hydraulic gradient at the streambed center is 2. This study helps us to understand the hydraulic phenomena of water flow near streams and accurately assess surface water and groundwater resources. PMID:26818442
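    The critical-gradient result can be illustrated with a 1-D Darcy sketch of vertical seepage through a streambed layer (a simplification of the paper's free-water-table formulation; the parameter values and the "stage minus head over thickness" gradient are illustrative assumptions):

    ```python
    def streambed_state(stream_stage, water_table_head, bed_thickness, K):
        # 1-D Darcy sketch of stream-aquifer exchange: downward flux
        # q = K * i with vertical gradient i = (stage - head) / thickness.
        # Per the paper's analysis, i = 1 marks critical disconnection
        # (and i reaches 2 at the streambed center in the steady
        # disconnected state). Simplified illustration only.
        i = (stream_stage - water_table_head) / bed_thickness
        q = K * i
        state = "connected" if i < 1 else "critical/disconnected"
        return q, i, state

    # Pumping has drawn the water table down to the base of the streambed
    q, i, state = streambed_state(stream_stage=2.0, water_table_head=1.0,
                                  bed_thickness=1.0, K=1e-5)
    ```

    In this picture, once the water table detaches, further drawdown no longer increases the gradient without bound, which is why seepage flux reaches a maximum and the system disconnects.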

  13. A quantitative analysis of hydraulic interaction processes in stream-aquifer systems

    NASA Astrophysics Data System (ADS)

    Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; Li, Junting; Duan, Lei; Wang, Zhoufeng; Zhu, Lin

    2016-01-01

    The hydraulic relationship between the stream and aquifer can be altered from hydraulic connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study proposes to quantitatively analyze the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both the theoretical analysis and laboratory tests have demonstrated that the hydraulic connectedness of the stream-aquifer system can reach a critical disconnection state when the horizontal hydraulic gradient at the free water surface is equal to zero and the vertical is equal to 1. A boundary-value problem for movement of the critical point of disconnection is established for an analytical solution of the inverted water table movement beneath the stream. The result indicates that the maximum distance or thickness of the inverted water table is equal to the water depth in the stream, and at a steady state of disconnection, the maximum hydraulic gradient at the streambed center is 2. This study helps us to understand the hydraulic phenomena of water flow near streams and accurately assess surface water and groundwater resources.

  14. A quantitative analysis of hydraulic interaction processes in stream-aquifer systems.

    PubMed

    Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; Li, Junting; Duan, Lei; Wang, Zhoufeng; Zhu, Lin

    2016-01-01

    The hydraulic relationship between the stream and aquifer can be altered from hydraulic connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study proposes to quantitatively analyze the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both the theoretical analysis and laboratory tests have demonstrated that the hydraulic connectedness of the stream-aquifer system can reach a critical disconnection state when the horizontal hydraulic gradient at the free water surface is equal to zero and the vertical is equal to 1. A boundary-value problem for movement of the critical point of disconnection is established for an analytical solution of the inverted water table movement beneath the stream. The result indicates that the maximum distance or thickness of the inverted water table is equal to the water depth in the stream, and at a steady state of disconnection, the maximum hydraulic gradient at the streambed center is 2. This study helps us to understand the hydraulic phenomena of water flow near streams and accurately assess surface water and groundwater resources. PMID:26818442

  15. Quantitative analysis of a wind energy conversion model

    NASA Astrophysics Data System (ADS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-03-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel at air velocities up to 15 m s⁻¹. The maximum power is 3.4 W; the power conversion factor from kinetic to electric energy is cp = 0.15. The v³ power law is confirmed. The model quantitatively illustrates several technically important features of industrial wind turbines.
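    The reported conversion factor can be checked against the standard wind-power relation P = ½ρAv³, assuming an air density of about 1.2 kg/m³ (the density is not stated in the abstract):

```python
import math

rho = 1.2          # kg/m^3, assumed air density (not given in the abstract)
d = 0.12           # rotor diameter, m
v = 15.0           # air velocity, m/s
P_electric = 3.4   # measured electric output, W

A = math.pi * (d / 2) ** 2          # swept rotor area
P_wind = 0.5 * rho * A * v ** 3     # kinetic power in the air stream, ~22.9 W
cp = P_electric / P_wind
print(round(cp, 2))                 # 0.15, matching the reported factor
```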

  16. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. PMID:23523366
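    The "dose iterator pattern" named in the abstract can be illustrated with a hypothetical sketch: analysis algorithms consume a plain iterator of dose values, staying decoupled from how the dose distribution is actually stored (the function and variable names below are invented for illustration, not RTToolbox API):

```python
# Hypothetical illustration of the "dose iterator" idea: analysis code iterates
# over dose values without knowing the underlying dose-grid representation.

def mean_dose(dose_iter):
    """Mean of all dose values yielded by any iterator."""
    total, n = 0.0, 0
    for d in dose_iter:
        total += d
        n += 1
    return total / n if n else 0.0

def max_dose(dose_iter):
    """Maximum dose value yielded by any iterator."""
    return max(dose_iter)

# Any dose representation works, as long as it yields dose values:
grid_doses = [0.0, 1.2, 2.5, 2.5, 1.8]            # e.g. a flattened voxel grid
print(round(mean_dose(iter(grid_doses)), 2))       # mean over all voxels
print(max_dose(d for d in grid_doses if d > 0))    # masked to nonzero voxels
```

The point of the pattern is that `mean_dose` and `max_dose` never see the grid class itself, so the same analysis code runs over full grids, masked regions, or streamed data.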

  17. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively and the odors qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di- and trisulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short chain fatty acids such as butyric acid (13 mg/g) explained the strong rancid, manure and farmyard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts in addition to three single stools from India and a pit latrine model system. PMID:23829328

  18. Quantitative analysis of radiation-induced changes in sperm morphology.

    PubMed

    Young, I T; Gledhill, B L; Lake, S; Wyrobek, A J

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure. PMID:6184000
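    The doubling-dose calculation described above can be sketched with made-up numbers: fit a line p(D) = p0 + b·D to the percentage of abnormal sperm versus dose, then solve p(D2) = 2·p0. The percentages below are hypothetical, chosen only to land near the reported ~39 rad figure:

```python
# Hedged sketch with hypothetical data: estimate a "doubling dose" from a
# linear dose-response fit p(D) = p0 + b*D; the doubling dose solves p(D2) = 2*p0.

def linear_fit(xs, ys):
    """Ordinary least-squares line; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

doses = [0, 30, 60, 90, 120]                  # rad, as in the study design
pct_abnormal = [4.0, 7.1, 10.2, 13.0, 16.1]   # hypothetical percentages

p0, b = linear_fit(doses, pct_abnormal)
doubling_dose = p0 / b                        # dose where p(D) = 2 * p0
print(round(doubling_dose))                   # ~40 rad with these numbers
```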

  19. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    Zygosaccharomyces rouxii complex comprises three yeasts clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliations. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and change in ploidy, but functional diversity in stress responses has not been explored yet. Here, we quantitatively analysed the stress response variation in seven strains of the Z. rouxii complex by modelling growth variables via model and model-free fitting methods. Based on the spline fit as most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration even in the presence of copper. μ-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The obtained phenotypic profiling contributes to provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmo-tolerant Zygosaccharomyces yeasts. PMID:24533625

  20. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    PubMed

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 × 10² to 1 × 10⁸ gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
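    Absolute quantitation over such a copy-number range typically runs through a standard curve of the form Ct = intercept + slope·log10(copies). A sketch with illustrative curve parameters (not values from the paper):

```python
# Hedged sketch: absolute qPCR quantitation from a standard curve,
# Ct = intercept + slope * log10(copies). Parameters below are illustrative.

slope = -3.32      # a slope of ~ -3.32 corresponds to ~100% PCR efficiency
intercept = 38.0   # hypothetical Ct of a single template copy

def copies_from_ct(ct):
    """Invert the standard curve to get the starting copy number."""
    return 10 ** ((ct - intercept) / slope)

# Amplification efficiency implied by the slope: E = 10^(-1/slope) - 1
efficiency = 10 ** (-1.0 / slope) - 1

print(round(copies_from_ct(38.0)))    # 1 copy at the intercept Ct
print(round(copies_from_ct(18.08)))   # ~1e6 copies
print(round(efficiency, 2))           # ~1.0, i.e. ~100% efficiency
```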

  1. Quantitative Analysis of the Enhanced Permeation and Retention (EPR) Effect

    PubMed Central

    Ulmschneider, Martin B.; Searson, Peter C.

    2015-01-01

    Tumor vasculature is characterized by a variety of abnormalities including irregular architecture, poor lymphatic drainage, and the upregulation of factors that increase the paracellular permeability. The increased permeability is important in mediating the uptake of an intravenously administered drug in a solid tumor and is known as the enhanced permeation and retention (EPR) effect. Studies in animal models have demonstrated a cut-off size of 500 nm - 1 µm for molecules or nanoparticles to extravasate into a tumor, however, surprisingly little is known about the kinetics of the EPR effect. Here we present a pharmacokinetic model to quantitatively assess the influence of the EPR effect on the uptake of a drug into a solid tumor. We use pharmacokinetic data for Doxil and doxorubicin from human clinical trials to illustrate how the EPR effect influences tumor uptake. This model provides a quantitative framework to guide preclinical trials of new chemotherapies and ultimately to develop design rules that can increase targeting efficiency and decrease unwanted side effects in normal tissue. PMID:25938565
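    The kind of model described can be caricatured in a few lines: plasma concentration decays exponentially while the tumor accumulates drug through leaky vasculature with no washout (the "retention" half of EPR). All rate constants below are illustrative, not fitted values from the paper:

```python
import math

# Hedged caricature of an EPR pharmacokinetic model: exponential plasma decay,
# tumor uptake proportional to plasma concentration, no lymphatic clearance.
# All parameters are illustrative.

k_elim = 0.03   # 1/h, plasma elimination rate (long-circulating, Doxil-like)
k_perm = 0.01   # 1/h, effective permeation rate into the tumor
C0 = 1.0        # normalized initial plasma concentration

dt = 0.1        # h, forward-Euler step
c_tumor = 0.0
for i in range(480):                         # integrate over 48 h
    t = i * dt
    c_plasma = C0 * math.exp(-k_elim * t)    # plasma compartment
    c_tumor += k_perm * c_plasma * dt        # uptake only, no back-flux

print(round(c_tumor, 3))  # fraction of the normalized dose retained in tumor
```

The closed form of this toy model is (k_perm/k_elim)·(1 − e^(−k_elim·t)), so the Euler result can be checked analytically.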

  2. Quantitative analysis of Babesia ovis infection in sheep and ticks.

    PubMed

    Erster, Oran; Roth, Asael; Wollkomirsky, Ricardo; Leibovich, Benjamin; Savitzky, Igor; Zamir, Shmuel; Molad, Thea; Shkap, Varda

    2016-05-15

    A quantitative PCR, based on the gene encoding Babesia ovis Surface Protein D (BoSPD) was developed and applied to investigate the presence of Babesia ovis (B. ovis) in its principal vector, the tick Rhipicephalus bursa (R. bursa), and in the ovine host. Quantification of B. ovis in experimentally-infected lambs showed a sharp increase in parasitemia 10-11 days in blood-inoculated and adult tick-infested lambs, and 24 days in a larvae-infested lamb. A gradual decrease of parasitemia was observed in the following months, with parasites detectable 6-12 months post-infection. Examination of the parasite load in adult R. bursa during the post-molting period using the quantitative PCR assay revealed a low parasite load during days 2-7 post-molting, followed by a sharp increase, until day 11, which corresponded to the completion of the pre-feeding period. The assay was then used to detect B. ovis in naturally-infected sheep and ticks. Examination of samples from 8 sheep and 2 goats from infected flocks detected B. ovis in both goats and in 7 out of the 8 sheep. Additionally, B. ovis was detected in 9 tick pools (5 ticks in each pool) and two individual ticks removed from sheep in infected flocks. PMID:27084469

  3. Quantitative analysis of thermal spray deposits using stereology

    SciTech Connect

    Leigh, S.H.; Sampath, S.; Herman, H.; Berndt, C.C.; Montavon, G.; Coddet, C.

    1995-12-31

    Stereology deals with protocols for describing a 3-D space, when only 2-D sections through solid bodies are available. This paper describes a stereological characterization of the microstructure of a thermal spray deposit. The aim of this work is to present results on the stereological characterization of a thermal spray deposit, using two approaches known as DeHoff's and Cruz-Orive's protocols. The individual splats are assumed to have an oblate spheroidal shape. The splat size distribution and elongation ratio distribution of splats are calculated using quantitative information from 2-D plane sections. The stereological methods are implemented to investigate the microstructure of a water stabilized plasma spray-formed Al₂O₃-13 wt.% TiO₂. Results are obtained with both protocols. The splat sizes range from 0 to 60 µm and shape factors from 0.4 to 1.0. The splats within the deposit seem to be much smaller and thicker (i.e., lower spreading) than those of the first layer deposited onto the substrate. The approach described in this work provides helpful quantitative information on the 3-D microstructure of thermal spray deposit.
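    Unfolding protocols like DeHoff's and Cruz-Orive's build on basic stereological identities. The simplest of these, Delesse's principle (the volume fraction of a phase equals its area fraction on a random section, V_V = A_A), can be sketched by point counting on a synthetic section; this is background illustration, not the paper's spheroid-unfolding method:

```python
import random

# Hedged sketch of Delesse's principle, the most basic stereological identity:
# estimate a phase's volume fraction from its area fraction on a 2-D section,
# here via point counting on a synthetic binary "micrograph".

random.seed(0)
true_porosity = 0.12

# Synthetic section: each pixel is pore (1) or matrix (0).
section = [[1 if random.random() < true_porosity else 0 for _ in range(200)]
           for _ in range(200)]

# Point-count estimate on a regular grid of test points (every 5th pixel).
hits = sum(section[i][j] for i in range(0, 200, 5) for j in range(0, 200, 5))
points = (200 // 5) ** 2
print(round(hits / points, 2))  # close to true_porosity: A_A estimates V_V
```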

  4. Quantitative Sulfur Analysis using Stand-off Laser-Induced Breakdown Spectroscopy

    NASA Astrophysics Data System (ADS)

    Dyar, M. D.; Tucker, J. M.; Clegg, S. M.; Barefield, J. E.; Wiens, R. C.

    2008-12-01

    The laser-induced breakdown spectrometer (LIBS) in the ChemCam instrument on Mars Science Laboratory has the capability to produce robust, quantitative analyses not only for major elements, but also for a large range of light elements and trace elements that are of great interest to geochemists. However, sulfur presents a particular challenge because it reacts easily with oxygen in the plasma and because the brightest S emission lines lie outside ChemCam's spectral range. This work was undertaken within the context of our larger effort to identify and compensate for matrix effects, which are chemical properties of the material that influence the ratio of a given emission line to the abundance of the element producing that line. Samples for this study include two suites of rocks: a suite of 12 samples that are mixtures of sulfate minerals and host rocks, generally with high S contents (0.1-26.0 wt% S), and a large suite of 118 igneous rocks from varying parageneses with S contents in the 0-2 wt% range. These compositions provide several different types of matrices to challenge our calibration procedures. Samples were analyzed under ChemCam-like conditions: a Nd:YAG laser producing 17 mJ per 10 ns pulse was directed onto samples positioned 5-9 m away from the laser and telescope. The samples were placed in a vacuum chamber filled with 7 Torr CO2 to replicate the Martian surface pressure as the atmospheric pressure influences the LIBS plasma. Some of the LIBS plasma emission is collected with a telescope and transmitted through a 1 m, 300 µm, 0.22NA optical fiber connected to a commercial Ocean Optics spectrometer. We are testing and comparing three different strategies to evaluate sulfur contents. 1) We have calculated regression lines comparing the intensity at each channel to the S content. This analysis shows that there are dozens of S emission lines in the ChemCam wavelength range that are suitable for use in quantitative analysis, even in the presence of Fe.
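    Strategy (1), regressing the intensity at each spectrometer channel against the known S contents and ranking channels, can be sketched on synthetic data (the spectra below are invented; only one channel carries an S-sensitive line):

```python
import random

# Hedged sketch of strategy (1): correlate the intensity in every channel with
# the known S contents of the standards to find usable emission lines.
# The "spectra" here are synthetic, not ChemCam data.

random.seed(1)
s_wt = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 26.0]   # wt% S in the standards

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Five-channel toy spectra: channel 2 responds linearly to S, the rest is noise.
spectra = []
for s in s_wt:
    row = [random.gauss(100, 5) for _ in range(5)]
    row[2] = 50 + 30 * s + random.gauss(0, 5)
    spectra.append(row)

r_by_channel = [pearson(s_wt, [row[ch] for row in spectra]) for ch in range(5)]
best = max(range(5), key=lambda ch: abs(r_by_channel[ch]))
print(best)  # channel 2, the S-sensitive line
```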

  5. On-site semi-quantitative analysis for ammonium nitrate detection using digital image colourimetry.

    PubMed

    Choodum, Aree; Boonsamran, Pichapat; NicDaeid, Niamh; Wongniramaikul, Worawit

    2015-12-01

    Digital image colourimetry was successfully applied in the semi-quantitative analysis of ammonium nitrate using Griess's test with zinc reduction. A custom-built detection box was developed to enable reproducible lighting of samples, and was used with the built-in webcams of a netbook and an ultrabook for on-site detection. The webcams were used for colour imaging of chemical reaction products in the samples, while the netbook was used for on-site colour analysis. The analytical performance was compared to a commercial external webcam and a digital single-lens reflex (DSLR) camera. The relationship between Red-Green-Blue intensities and ammonium nitrate concentration was investigated. The green channel intensity (IG) was the most sensitive for the pink-violet products from ammonium nitrate that revealed a spectrometric absorption peak at 546 nm. A wide linear range (5 to 250 mg L⁻¹) with a high sensitivity was obtained with the built-in webcam of the ultrabook. A considerably lower detection limit (1.34 ± 0.05 mg L⁻¹) was also obtained using the ultrabook, in comparison with the netbook (2.6 ± 0.2 mg L⁻¹), the external webcam (3.4 ± 0.1 mg L⁻¹) and the DSLR (8.0 ± 0.5 mg L⁻¹). The best inter-day precision (over 3 days) was obtained with the external webcam (0.40 to 1.34% RSD), while the netbook and the ultrabook had 0.52 to 3.62% and 1.25 to 4.99% RSDs, respectively. The relative errors were +3.6, +5.6 and -7.1%, on analysing standard ammonium nitrate solutions of known concentration using IG, for the ultrabook, the external webcam, and the netbook, respectively, while the DSLR gave -4.4% relative error. However, the IG of the pink-violet reaction product suffers from interference by soil, so that blank subtraction (|IG-IGblank| or |AG-AGblank|) is recommended for soil sample analysis. This method also gave very good accuracies of -0.11 to -5.61% for spiked soil samples and the results presented for five seized samples showed good correlations between
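    A plausible reading of this workflow can be sketched as a Beer-Lambert-style calibration on the green channel: convert IG to an absorbance-like value against a blank, fit a line versus concentration, and invert it for unknowns. All intensities below are hypothetical:

```python
import math

# Hedged sketch of the colourimetric workflow: green-channel "absorbance"
# against a blank, linear calibration, and inverse prediction of an unknown.
# Intensities are hypothetical, not the paper's measurements.

def green_absorbance(i_g, i_g_blank):
    return -math.log10(i_g / i_g_blank)

blank = 240.0
standards = [(5, 235.0), (50, 196.0), (100, 160.0), (250, 88.0)]  # (mg/L, IG)

xs = [c for c, _ in standards]
ys = [green_absorbance(i, blank) for _, i in standards]

# Ordinary least-squares calibration line: A_G = intercept + slope * conc
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def predict_conc(i_g):
    """Invert the calibration line for an unknown sample's green intensity."""
    return (green_absorbance(i_g, blank) - intercept) / slope

print(round(predict_conc(160.0)))
```

Feeding back the 100 mg/L standard's intensity recovers its concentration to within a few mg/L, which is the self-consistency check one would run before field use.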

  6. Fourier Transform Mass Spectrometry and Nuclear Magnetic Resonance Analysis for the Rapid and Accurate Characterization of Hexacosanoylceramide.

    PubMed

    Ross, Charles W; Simonsick, William J; Bogusky, Michael J; Celikay, Recep W; Guare, James P; Newton, Randall C

    2016-01-01

    Ceramides are a central unit of all sphingolipids which have been identified as sites of biological recognition on cellular membranes mediating cell growth and differentiation. Several glycosphingolipids have been isolated, displaying immunomodulatory and anti-tumor activities. These molecules have generated considerable interest as potential vaccine adjuvants in humans. Accurate analyses of these and related sphingosine analogues are important for the characterization of structure, biological function, and metabolism. We report the complementary use of direct laser desorption ionization (DLDI), sheath flow electrospray ionization (ESI) Fourier transform ion cyclotron resonance mass spectrometry (FTICR MS) and high-field nuclear magnetic resonance (NMR) analysis for the rapid, accurate identification of hexacosanoylceramide and starting materials. DLDI does not require stringent sample preparation and yields representative ions. Sheath-flow ESI yields ions of the product and byproducts and was significantly better than monospray ESI due to improved compound solubility. Negative ion sheath flow ESI provided data of starting materials and products all in one acquisition as hexacosanoic acid does not ionize efficiently when ceramides are present. NMR provided characterization of these lipid molecules complementing the results obtained from MS analyses. NMR data were able to differentiate straight-chain versus branched-chain alkyl groups, a distinction not easily obtained from mass spectrometry. PMID:27367671

  7. Fourier Transform Mass Spectrometry and Nuclear Magnetic Resonance Analysis for the Rapid and Accurate Characterization of Hexacosanoylceramide

    PubMed Central

    Ross, Charles W.; Simonsick, William J.; Bogusky, Michael J.; Celikay, Recep W.; Guare, James P.; Newton, Randall C.

    2016-01-01

    Ceramides are a central unit of all sphingolipids which have been identified as sites of biological recognition on cellular membranes mediating cell growth and differentiation. Several glycosphingolipids have been isolated, displaying immunomodulatory and anti-tumor activities. These molecules have generated considerable interest as potential vaccine adjuvants in humans. Accurate analyses of these and related sphingosine analogues are important for the characterization of structure, biological function, and metabolism. We report the complementary use of direct laser desorption ionization (DLDI), sheath flow electrospray ionization (ESI) Fourier transform ion cyclotron resonance mass spectrometry (FTICR MS) and high-field nuclear magnetic resonance (NMR) analysis for the rapid, accurate identification of hexacosanoylceramide and starting materials. DLDI does not require stringent sample preparation and yields representative ions. Sheath-flow ESI yields ions of the product and byproducts and was significantly better than monospray ESI due to improved compound solubility. Negative ion sheath flow ESI provided data of starting materials and products all in one acquisition as hexacosanoic acid does not ionize efficiently when ceramides are present. NMR provided characterization of these lipid molecules complementing the results obtained from MS analyses. NMR data were able to differentiate straight-chain versus branched-chain alkyl groups, a distinction not easily obtained from mass spectrometry. PMID:27367671

  8. Degradation of reactive dyes in wastewater from the textile industry by ozone: analysis of the products by accurate masses.

    PubMed

    Constapel, Marc; Schellenträger, Marc; Marzinkowski, Joachim Michael; Gäb, Siegmar

    2009-02-01

    The present work describes the use of ozone to degrade selected reactive dyes from the textile industry and the analysis of the resulting complex mixture by liquid chromatography/mass spectrometry (LC-MS). To allow certain identification of the substances detected in the wastewater, the original dyes were also investigated either separately or in a synthetic mixture of three dyes (trichromie). Since the reactive dyes are hydrolyzed during the dyeing process, procedures for the hydrolysis were worked out first for the individual dyes. The ozonated solutions were concentrated by solid-phase extraction, which separated very polar or ionic substances from moderately polar degradation products. The latter, which are the primary degradation products, were investigated by liquid chromatography/mass spectrometry with a tandem quadrupole time-of-flight mass analyzer. On the basis of the accurate masses, the UV-vis spectra and, of course, knowledge of the structures of the original dyes, plausible structures could be proposed for most of the components of the moderately polar fraction. These structures were confirmed by 1H NMR in cases where it was practical to isolate the degradation products by preparative HPLC. PMID:19110293

  9. An accurate retrieval of leaf water content from mid to thermal infrared spectra using continuous wavelet analysis.

    PubMed

    Ullah, Saleem; Skidmore, Andrew K; Naeem, Mohammad; Schlerf, Martin

    2012-10-15

    Leaf water content determines plant health, vitality, photosynthetic efficiency and is an important indicator of drought assessment. The retrieval of leaf water content from the visible to shortwave infrared spectra is well known. Here for the first time, we estimated leaf water content from the mid to thermal infrared (2.5-14.0 μm) spectra, based on continuous wavelet analysis. The dataset comprised 394 spectra from nine plant species, with different water contents achieved through progressive drying. To identify the spectral feature most sensitive to the variations in leaf water content, first the Directional Hemispherical Reflectance (DHR) spectra were transformed into a wavelet power scalogram, and then linear relations were established between the wavelet power scalogram and leaf water content. The six individual wavelet features identified in the mid infrared yielded high correlations with leaf water content (R² = 0.86 maximum, 0.83 minimum), as well as low RMSE (minimum 8.56%, maximum 9.27%). The combination of four wavelet features produced the most accurate model (R² = 0.88, RMSE = 8.00%). The models were consistent in terms of accuracy estimation for both calibration and validation datasets, indicating that leaf water content can be accurately retrieved from the mid to thermal infrared domain of the electromagnetic radiation. PMID:22940042
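    A minimal, pure-Python version of the idea can be sketched: transform each spectrum with a Mexican-hat (Ricker) wavelet at a single scale, then correlate the wavelet power at each band with leaf water content to locate the most sensitive feature. The spectra below are synthetic, not the paper's DHR data:

```python
import math
import random

# Hedged, minimal sketch of the wavelet-feature approach on synthetic spectra:
# one Ricker-wavelet scale, power per band, correlation with water content.

def ricker(points, a):
    """Discrete Mexican-hat wavelet of given width (in samples) and scale a."""
    out = []
    for i in range(points):
        x = (i - (points - 1) / 2) / a
        out.append((1 - x * x) * math.exp(-x * x / 2))
    return out

def cwt_power(spectrum, a=8, width=33):
    """Wavelet power at one scale for every band (truncated at the edges)."""
    w = ricker(width, a)
    half = width // 2
    power = []
    for c in range(len(spectrum)):
        s = sum(w[k] * spectrum[c + k - half]
                for k in range(width) if 0 <= c + k - half < len(spectrum))
        power.append(s * s)
    return power

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / (sum((x - mx) ** 2 for x in xs) *
                  sum((y - my) ** 2 for y in ys)) ** 0.5

random.seed(2)
n_bands = 120
water = [10 + 5 * i for i in range(9)]   # % water content of 9 synthetic "leaves"
spectra = []
for wc in water:
    # Absorption dip around band 60 whose depth scales with water content.
    spectra.append([1.0 - 0.004 * wc * math.exp(-((b - 60) / 6) ** 2)
                    + random.gauss(0, 0.002) for b in range(n_bands)])

powers = [cwt_power(s) for s in spectra]
best_band = max(range(n_bands),
                key=lambda b: abs(corr(water, [p[b] for p in powers])))
print(best_band)  # lands near the synthetic water feature at band 60
```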

  10. A new method of accurate broken rotor bar diagnosis based on modulation signal bispectrum analysis of motor current signals

    NASA Astrophysics Data System (ADS)

    Gu, F.; Wang, T.; Alwodai, A.; Tian, X.; Shao, Y.; Ball, A. D.

    2015-01-01

    Motor current signature analysis (MCSA) has been an effective way of monitoring electrical machines for many years. However, inadequate accuracy in diagnosing incipient broken rotor bars (BRB) has motivated many studies into improving this method. In this paper a modulation signal bispectrum (MSB) analysis is applied to motor currents from different broken bar cases and a new MSB based sideband estimator (MSB-SE) and sideband amplitude estimator are introduced for obtaining the amplitude at (1 ± 2 s)fs (s is the rotor slip and fs is the fundamental supply frequency) with high accuracy. As the MSB-SE has a good performance of noise suppression, the new estimator produces more accurate results in predicting the number of BRB, compared with conventional power spectrum analysis. Moreover, the paper has also developed an improved model for motor current signals under rotor fault conditions and an effective method to decouple the BRB current which interferes with that of speed oscillations associated with BRB. These provide theoretical supports for the new estimators and clarify the issues in using conventional bispectrum analysis.
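    The sideband amplitudes at (1 ± 2s)fs that the MSB-SE estimates can, in the noise-free case, be read directly off a single-bin DFT of the current. The sketch below is a plain spectral estimate on a synthesized signal, not the paper's bispectrum estimator, and all parameters are illustrative:

```python
import cmath
import math

# Hedged sketch (plain single-bin DFT, not MSB): synthesize a stator current
# with broken-rotor-bar sidebands at (1 +/- 2s)*fs and read their amplitudes.

fs_supply = 50.0   # Hz, fundamental supply frequency
slip = 0.02        # rotor slip -> sidebands at 48 Hz and 52 Hz
rate = 1000.0      # samples/s
N = 1000           # 1 s of data -> 1 Hz bin spacing, so all tones sit on bins

signal = [math.cos(2 * math.pi * fs_supply * t)
          + 0.05 * math.cos(2 * math.pi * (1 - 2 * slip) * fs_supply * t)
          + 0.04 * math.cos(2 * math.pi * (1 + 2 * slip) * fs_supply * t)
          for t in (n / rate for n in range(N))]

def amplitude(freq):
    """Amplitude of the tone at `freq` via a single DFT bin."""
    k = round(freq * N / rate)
    s = sum(x * cmath.exp(-2j * math.pi * k * n / N)
            for n, x in enumerate(signal))
    return 2 * abs(s) / N

print(round(amplitude(48.0), 3))  # lower sideband, 0.05
print(round(amplitude(52.0), 3))  # upper sideband, 0.04
```

In real measurements these sidebands sit close to the dominant 50 Hz line and are buried in noise, which is exactly the situation the MSB-based estimator is designed to handle better than this plain spectrum.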

  11. How accurate are polymer models in the analysis of Förster resonance energy transfer experiments on proteins?

    NASA Astrophysics Data System (ADS)

    O'Brien, Edward P.; Morrison, Greg; Brooks, Bernard R.; Thirumalai, D.

    2009-03-01

    Single molecule Förster resonance energy transfer (FRET) experiments are used to infer the properties of the denatured state ensemble (DSE) of proteins. From the measured average FRET efficiency, ⟨E⟩, the distance distribution P(R) is inferred by assuming that the DSE can be described as a polymer. The single parameter in the appropriate polymer model (Gaussian chain, wormlike chain, or self-avoiding walk) for P(R) is determined by equating the calculated and measured ⟨E⟩. In order to assess the accuracy of this "standard procedure," we consider the generalized Rouse model (GRM), whose properties [⟨E⟩ and P(R)] can be analytically computed, and the Molecular Transfer Model for protein L for which accurate simulations can be carried out as a function of guanidinium hydrochloride (GdmCl) concentration. Using the precisely computed ⟨E⟩ for the GRM and protein L, we infer P(R) using the standard procedure. We find that the mean end-to-end distance can be accurately inferred (less than 10% relative error) using ⟨E⟩ and polymer models for P(R). However, the value extracted for the radius of gyration (Rg) and the persistence length (lp) are less accurate. For protein L, the errors in the inferred properties increase as the GdmCl concentration increases for all polymer models. The relative error in the inferred Rg and lp, with respect to the exact values, can be as large as 25% at the highest GdmCl concentration. We propose a self-consistency test, requiring measurements of ⟨E⟩ by attaching dyes to different residues in the protein, to assess the validity of describing DSE using the Gaussian model. Application of the self-consistency test to the GRM shows that even for this simple model, which exhibits an order→disorder transition, the Gaussian P(R) is inadequate. Analysis of experimental data of FRET efficiencies with dyes at several locations for the cold shock protein, and simulation results for protein L, for which accurate FRET
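    The "standard procedure" the authors test can be sketched for the Gaussian-chain case: compute ⟨E⟩ = ∫ P(R) E(R) dR with the Förster efficiency E(R) = 1/(1 + (R/R0)⁶) and a Gaussian-chain P(R) ∝ R² exp(−3R²/2⟨R²⟩), then solve for the chain dimension that reproduces a measured ⟨E⟩. R0 and the measured efficiency below are illustrative:

```python
import math

# Hedged sketch of the "standard procedure" for a Gaussian-chain P(R):
# numerically compute <E> and bisect on <R^2> to match a measured efficiency.
# R0 and E_measured are illustrative, not values from the paper.

R0 = 5.0  # nm, Forster radius (illustrative)

def mean_E(r2):
    """<E> for a Gaussian chain with mean-square end-to-end distance r2."""
    a = 1.5 / r2
    norm = acc = 0.0
    dr = 0.01
    for i in range(1, 5000):                  # integrate R from 0 to 50 nm
        R = i * dr
        p = R * R * math.exp(-a * R * R)      # unnormalized Gaussian-chain P(R)
        norm += p * dr
        acc += p / (1 + (R / R0) ** 6) * dr
    return acc / norm

def infer_r2(E_measured):
    """Bisection on <R^2>; <E> decreases monotonically as the chain swells."""
    lo, hi = 0.1, 400.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if mean_E(mid) > E_measured:          # chain too compact -> grow it
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

r2 = infer_r2(0.5)
print(round(math.sqrt(r2), 2))  # inferred root-mean-square end-to-end distance
```

For ⟨E⟩ = 0.5 the inferred RMS distance comes out somewhat above R0, as expected since P(R) puts substantial weight at distances below the mean where transfer is efficient.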

  12. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  13. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    SciTech Connect

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

    algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  14. Anniversary paper: History and status of CAD and quantitative image analysis: the role of Medical Physics and AAPM.

    PubMed

    Giger, Maryellen L; Chan, Heang-Ping; Boone, John

    2008-12-01

    using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more-from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  15. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment, driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots, with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with an increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots, respectively. These challenges are explored quantitatively in this paper as global sustainable development (SD), in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD, with very many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus are higher on their hierarchy of needs, than nations such as Nigeria, Vietnam, Mexico and other
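    The aggregate-and-rank step described in this record can be illustrated with a toy computation; the nations, scores, and equal-weight averaging below are entirely hypothetical and do not reproduce the paper's variables or weighting:

    ```python
    import numpy as np

    # hypothetical normalized scores (0-1) on the three dimensions:
    # social, environmental, technological sustainability
    nations = ["A", "B", "C"]
    scores = np.array([
        [0.9, 0.7, 0.8],
        [0.4, 0.6, 0.3],
        [0.7, 0.5, 0.9],
    ])

    # one plausible aggregate SD parameter: the mean across dimensions
    sd_capacity = scores.mean(axis=1)

    # rank nations from highest to lowest aggregate capacity
    ranking = [nations[i] for i in np.argsort(-sd_capacity)]
    ```

    Real indices of this kind differ mainly in how the raw variables are normalized and weighted before aggregation, which is where most of the methodological debate lies.
    
    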

  16. A method to accurately quantitate intensities of (32)P-DNA bands when multiple bands appear in a single lane of a gel is used to study dNTP insertion opposite a benzo[a]pyrene-dG adduct by Sulfolobus DNA polymerases Dpo4 and Dbh.

    PubMed

    Sholder, Gabriel; Loechler, Edward L

    2015-01-01

    Quantitating relative (32)P-band intensity in gels is desired, e.g., to study primer-extension kinetics of DNA polymerases (DNAPs). Following imaging, multiple (32)P-bands are often present in lanes. Though individual bands appear by eye to be simple and well-resolved, scanning reveals they are actually skewed-Gaussian in shape and neighboring bands are overlapping, which complicates quantitation, because slower migrating bands often have considerable contributions from the trailing edges of faster migrating bands. A method is described to accurately quantitate adjacent (32)P-bands, which relies on having a standard: a simple skewed-Gaussian curve from an analogous pure, single-component band (e.g., primer alone). This single-component scan/curve is superimposed on its corresponding band in an experimentally determined scan/curve containing multiple bands (e.g., generated in a primer-extension reaction); intensity exceeding the single-component scan/curve is attributed to other components (e.g., insertion products). Relative areas/intensities are determined via pixel analysis, from which relative molarity of components is computed. Common software is used. Commonly used alternative methods (e.g., drawing boxes around bands) are shown to be less accurate. Our method was used to study kinetics of dNTP primer-extension opposite a benzo[a]pyrene-N(2)-dG-adduct with four DNAPs, including Sulfolobus solfataricus Dpo4 and Sulfolobus acidocaldarius Dbh. Vmax/Km is similar for correct dCTP insertion with Dpo4 and Dbh. Compared to Dpo4, Dbh misinsertion is slower for dATP (∼20-fold), dGTP (∼110-fold) and dTTP (∼6-fold), due to decreases in Vmax. These findings provide support that Dbh is in the same Y-Family DNAP class as eukaryotic DNAP κ and bacterial DNAP IV, which accurately bypass N(2)-dG adducts, as well as establish the scan-method described herein as an accurate method to quantitate relative intensity of overlapping bands in a single lane, whether generated
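    The subtraction scheme this record describes (superimpose a single-component standard scan on the multi-band lane scan and attribute the excess intensity to other components) can be sketched as follows; the skew-normal band profile and all numbers are illustrative stand-ins, not the authors' calibration:

    ```python
    import numpy as np
    from scipy.special import erf

    def band(x, amp, mu, sigma, alpha):
        # skewed-Gaussian profile of a single (32)P band
        t = (x - mu) / sigma
        return amp * np.exp(-0.5 * t**2) * (1.0 + erf(alpha * t / np.sqrt(2.0)))

    x = np.linspace(0.0, 100.0, 2001)
    standard = band(x, 1.0, 40.0, 3.0, 2.0)               # pure primer-alone scan
    lane = 0.6 * standard + band(x, 0.5, 55.0, 3.0, 2.0)  # primer + overlapping product

    # scale the standard to the primer peak of the mixed lane and subtract;
    # the residual intensity is attributed to the slower-migrating product
    peak = int(np.argmax(standard))
    scale = lane[peak] / standard[peak]
    residual = lane - scale * standard

    primer_area = float(np.sum(scale * standard))
    product_area = float(np.sum(residual))
    fraction_extended = product_area / (primer_area + product_area)
    ```

    With the synthetic amplitudes above (0.6 primer, 0.5 product, identical band shapes) the recovered extended fraction is close to 0.5/1.1, even though the two bands overlap; simply drawing boxes around each band would mix the trailing edge of the faster band into the slower one.
    
    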

  17. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    PubMed

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper both the partial least squares (PLS) method and the calibration curve (CC) method are used to quantitatively analyze the laser induced breakdown spectroscopy data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. By comparing the results of the two calibration methods, some useful conclusions were obtained: for major elements, the PLS method is better than the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results due to the extremely weak characteristic spectral lines, but the PLS method still has a good ability of quantitative analysis. The regression coefficients of the PLS method are also compared with the original spectral data, including its background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. Results proved that the PLS method used in laser induced breakdown spectroscopy is suitable for quantitative analysis of trace elements such as C in the metallurgical industry. PMID:24822436
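    The advantage of full-spectrum PLS over a single-line calibration curve for weak lines can be illustrated on synthetic spectra; the line position, concentrations, and noise levels below are fabricated for illustration and are not the paper's data:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_samples, n_channels = 40, 200
    conc = rng.uniform(0.01, 0.5, n_samples)            # trace-element concentration
    channels = np.arange(n_channels)
    line = np.exp(-0.5 * ((channels - 80) / 2.0)**2)    # weak characteristic line

    # spectra = concentration-scaled line + varying broad background + shot noise
    background = 0.1 * rng.uniform(0.5, 1.5, n_samples)[:, None]
    X = np.outer(conc, line) + background + 0.01 * rng.standard_normal((n_samples, n_channels))

    # PLS uses all channels at once, so it can separate the weak line
    # from the shot-to-shot background variation
    pls = PLSRegression(n_components=3)
    pls.fit(X[:30], conc[:30])
    pred = pls.predict(X[30:]).ravel()
    ```

    A calibration curve built from the single peak channel would fold the background fluctuation directly into the calibration, which is the failure mode the paper reports for trace elements.
    
    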

  18. Quantitative analysis on electric dipole energy in Rashba band splitting

    NASA Astrophysics Data System (ADS)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba-split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution and the external electric field, and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit, starting from the tight-binding calculation, also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.
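    For context, the Rashba splitting being quantified has the textbook free-electron form E±(k) = ħ²k²/2m ± α|k|, with the splitting at a given k equal to 2α|k| and the lower-band minimum displaced from Γ. A minimal numerical sketch (the Rashba parameter α is an arbitrary illustrative value, not the Bi/Sb result):

    ```python
    import numpy as np

    HBAR2_OVER_2M = 3.81  # eV*Angstrom^2, free-electron value of hbar^2/2m
    alpha = 1.0           # eV*Angstrom, illustrative Rashba parameter

    def rashba_bands(k):
        # Rashba-split free-electron dispersion E+-(k) = hbar^2 k^2/2m +- alpha|k|
        parabola = HBAR2_OVER_2M * k**2
        return parabola - alpha * np.abs(k), parabola + alpha * np.abs(k)

    k = np.linspace(-0.3, 0.3, 601)   # 1/Angstrom
    lower, upper = rashba_bands(k)
    splitting = upper - lower          # equals 2*alpha*|k|

    # the lower-branch minimum sits at k_R = alpha / (2 * hbar^2/2m), offset from Gamma
    k_r = alpha / (2.0 * HBAR2_OVER_2M)
    ```

    The paper's claim is about where the energy 2α|k| comes from physically (the electric dipole energy), not about this dispersion form itself.
    
    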

  19. Quantitative analysis of CT scans of ceramic candle filters

    SciTech Connect

    Ferer, M.V.; Smith, D.H.

    1996-12-31

    Candle filters are being developed to remove coal ash and other fine particles (<15 μm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.
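    The pixel-level conversion described here (scanned intensity to density via linear regression, then porosity from the solid-SiC density) can be sketched as follows; the calibration points are invented for illustration:

    ```python
    import numpy as np

    RHO_SIC = 3.21  # g/cm^3, density of fully dense SiC

    # calibration: scanned pixel intensities vs. known filter densities
    intensity = np.array([50.0, 100.0, 150.0, 200.0])
    density = np.array([0.8, 1.6, 2.4, 3.2])   # g/cm^3
    slope, intercept = np.polyfit(intensity, density, 1)

    def pixel_porosity(pixel):
        # convert a scanned intensity to local density, then to porosity
        rho = slope * pixel + intercept
        return 1.0 - rho / RHO_SIC
    ```

    Applied pixel-by-pixel over an image, this yields the porosity maps on which the radial averages and density-density correlations were computed.
    
    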

  20. Quantitation of phosphorus excretion in sheep by compartmental analysis

    SciTech Connect

    Schneider, K.M.; Boston, R.C.; Leaver, D.D.

    1987-04-01

    The control of phosphorus excretion in sheep has been examined by constructing a kinetic model that contains a mechanistic set of connections between blood and gastrointestinal tract. The model was developed using experimental data from chaff-fed sheep and gives an accurate description of the absorption and excretion of (32)P phosphorus in feces and urine of the ruminating sheep. These results indicated the main control site for phosphorus excretion in the ruminating sheep was the gastrointestinal tract, whereas for the non-ruminating sheep fed the liquid diet, control was exerted by the kidney. A critical factor in the induction of adaptation of phosphorus reabsorption by the kidney was the reduction in salivation, and since this response occurred independently of marked changes in the delivery of phosphorus to the kidney, a humoral factor may be involved in this communication between salivary gland and kidney.
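    A blood-gastrointestinal two-pool model with fecal and urinary outflows, in the spirit of the kinetic model described here, can be sketched as below; all rate constants are hypothetical and are not those fitted in the study:

    ```python
    from scipy.integrate import solve_ivp

    # hypothetical fractional rate constants (per hour)
    K_BLOOD_GI = 0.05   # blood -> GI tract (e.g., salivary secretion)
    K_GI_BLOOD = 0.03   # GI tract -> blood (absorption)
    K_FECES = 0.02      # GI tract -> feces
    K_URINE = 0.005     # blood -> urine

    def model(t, y):
        # compartments: blood, GI tract, cumulative feces, cumulative urine
        blood, gi, feces, urine = y
        d_blood = K_GI_BLOOD * gi - (K_BLOOD_GI + K_URINE) * blood
        d_gi = K_BLOOD_GI * blood - (K_GI_BLOOD + K_FECES) * gi
        return [d_blood, d_gi, K_FECES * gi, K_URINE * blood]

    # trace a unit bolus of (32)P injected into blood
    sol = solve_ivp(model, (0.0, 500.0), [1.0, 0.0, 0.0, 0.0],
                    rtol=1e-8, atol=1e-10)
    blood, gi, feces, urine = sol.y[:, -1]
    ```

    With these illustrative constants most of the tracer leaves via feces rather than urine, mimicking the GI-dominated control reported for ruminating sheep; fitting such constants to tracer data is the essence of compartmental analysis.
    
    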

  1. Quantitative Analysis of Cancer Metastasis using an Avian Embryo Model

    PubMed Central

    Palmer, Trenis D.; Lewis, John; Zijlstra, Andries

    2011-01-01

    During metastasis cancer cells disseminate from the primary tumor, invade into surrounding tissues, and spread to distant organs. Metastasis is a complex process that can involve many tissue types, span variable time periods, and often occur deep within organs, making it difficult to investigate and quantify. In addition, the efficacy of the metastatic process is influenced by multiple steps in the metastatic cascade making it difficult to evaluate the contribution of a single aspect of tumor cell behavior. As a consequence, metastasis assays are frequently performed in experimental animals to provide a necessarily realistic context in which to study metastasis. Unfortunately, these models are further complicated by their complex physiology. The chick embryo is a unique in vivo model that overcomes many limitations to studying metastasis, due to the accessibility of the chorioallantoic membrane (CAM), a well-vascularized extra-embryonic tissue located underneath the eggshell that is receptive to the xenografting of tumor cells (figure 1). Moreover, since the chick embryo is naturally immunodeficient, the CAM readily supports the engraftment of both normal and tumor tissues. Most importantly, the avian CAM successfully supports most cancer cell characteristics including growth, invasion, angiogenesis, and remodeling of the microenvironment. This makes the model exceptionally useful for the investigation of the pathways that lead to cancer metastasis and to predict the response of metastatic cancer to new potential therapeutics. The detection of disseminated cells by species-specific Alu PCR makes it possible to quantitatively assess metastasis in organs that are colonized by as few as 25 cells. Using the Human Epidermoid Carcinoma cell line (HEp3) we use this model to analyze spontaneous metastasis of cancer cells to distant organs, including the chick liver and lung. 
Furthermore, using the Alu-PCR protocol we demonstrate the sensitivity and reproducibility of the

  2. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, at which patient features are correlated to some preoperative imagery. The operator-induced uncertainty in localizing patient features – the User Localization Error (ULE) – is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for ULE with a research navigation system. Methods Active optical navigation was done in CT-images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square, RMS; probe tip calibration was 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm², 0.32 mm², and 0.42 mm². ULE was calculated from predicted application accuracy with isotropic and anisotropic models and from experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data of application accuracy could be tested against prediction models with iso- and anisotropic noise models and revealed some discrepancies. This could potentially be due to the facts that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials
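    One way to read the variance-based estimate: if the operator's localization error and the tracker's intrinsic error are independent, their variances add, so the user component can be recovered by subtraction in quadrature. This is a hedged sketch of that assumption, not the paper's exact model:

    ```python
    import math

    def user_localization_error(total_rms, tracking_rms):
        # assumes independent error sources adding in quadrature:
        # sigma_total^2 = sigma_tracking^2 + sigma_user^2
        if total_rms < tracking_rms:
            raise ValueError("total error cannot be below the tracker noise floor")
        return math.sqrt(total_rms**2 - tracking_rms**2)
    ```

    For example, a measured total localization error of 0.5 mm RMS over a 0.3 mm RMS tracker floor would leave a 0.4 mm user component under this assumption.
    
    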

  3. Identification of fidgety movements and prediction of CP by the use of computer-based video analysis is more accurate when based on two video recordings.

    PubMed

    Adde, Lars; Helbostad, Jorunn; Jensenius, Alexander R; Langaas, Mette; Støen, Ragnhild

    2013-08-01

    This study evaluates the role of postterm age at assessment and the use of one or two video recordings for the detection of fidgety movements (FMs) and prediction of cerebral palsy (CP) using computer vision software. Recordings between 9 and 17 weeks postterm age from 52 preterm and term infants (24 boys, 28 girls; 26 born preterm) were used. Recordings were analyzed using computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analysis. Sensitivities, specificities, and areas under the curve were estimated for the first and second recordings, or a mean of both. FMs were classified based on the Prechtl approach of general movement assessment. CP status was reported at 2 years. Nine children developed CP; all of their recordings had absent FMs. Using the mean variability of the centroid of motion (CSD) from two recordings was more accurate than using only one recording, and identified all children who were diagnosed with CP at 2 years. Age at assessment did not influence the detection of FMs or prediction of CP. The accuracy of computer vision techniques in identifying FMs and predicting CP based on two recordings should be confirmed in future studies. PMID:23343036
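    Movement variables "derived from differences between subsequent video frames" can be sketched as below: threshold the inter-frame difference, track the centroid of the changed pixels, and summarize its variability (a CSD-like quantity). The synthetic moving blob stands in for a real recording; the threshold and summary statistic are illustrative choices, not the software's exact definitions:

    ```python
    import numpy as np

    def centroid_of_motion(frames, threshold=10):
        # per frame pair: centroid of pixels whose intensity changed
        centroids = []
        for prev, nxt in zip(frames[:-1], frames[1:]):
            diff = np.abs(nxt.astype(int) - prev.astype(int)) > threshold
            ys, xs = np.nonzero(diff)
            if len(xs):
                centroids.append((xs.mean(), ys.mean()))
        return np.array(centroids)

    def centroid_variability(frames):
        # CSD-like variable: spread of the motion centroid over the recording
        c = centroid_of_motion(frames)
        return float(c.std(axis=0).mean())

    # synthetic recording: a bright blob moving across a dark background
    frames = np.zeros((20, 64, 64), dtype=np.uint8)
    for t in range(20):
        frames[t, 30:34, 2 * t:2 * t + 4] = 255
    ```

    A nearly still infant would give a small variability value, a vigorously moving one a large value; averaging this quantity over two recordings is what improved prediction in the study.
    
    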

  4. Deep vein thrombosis is accurately predicted by comprehensive analysis of the levels of microRNA-96 and plasma D-dimer

    PubMed Central

    Xie, Xuesheng; Liu, Changpeng; Lin, Wei; Zhan, Baoming; Dong, Changjun; Song, Zhen; Wang, Shilei; Qi, Yingguo; Wang, Jiali; Gu, Zengquan

    2016-01-01

    The aim of the present study was to investigate the association between platelet microRNA-96 (miR-96) expression levels and the occurrence of deep vein thrombosis (DVT) in orthopedic patients. A total of 69 consecutive orthopedic patients with DVT and 30 healthy individuals were enrolled. Ultrasonic color Doppler imaging was performed on lower limb veins after orthopedic surgery to determine the occurrence of DVT. An enzyme-linked fluorescent assay was performed to detect the levels of D-dimer in plasma. A quantitative polymerase chain reaction assay was performed to determine the expression levels of miR-96. Expression levels of platelet miR-96 were significantly increased in patients after orthopedic surgery. miR-96 expression levels in patients with DVT at days 1, 3 and 7 after orthopedic surgery were significantly increased when compared with those in the control group. The increased miR-96 expression levels were correlated with plasma D-dimer levels in patients with DVT. However, for the patients in the non-DVT group following surgery, miR-96 expression levels were correlated with plasma D-dimer levels. In summary, the present results suggest that the expression levels of miR-96 may be associated with the occurrence of DVT. The occurrence of DVT may be accurately predicted by comprehensive analysis of the levels of miR-96 and plasma D-dimer. PMID:27588107
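    Relative miRNA expression from qPCR Cq values is conventionally computed with the 2^(-ΔΔCq) method; a minimal sketch (the Cq values below are invented, 100% amplification efficiency is assumed, and the paper does not state which quantification scheme was used):

    ```python
    def fold_change(cq_target_case, cq_ref_case, cq_target_ctrl, cq_ref_ctrl):
        # 2^(-ddCq) relative quantification, efficiency assumed to be 100%
        ddcq = (cq_target_case - cq_ref_case) - (cq_target_ctrl - cq_ref_ctrl)
        return 2.0 ** (-ddcq)

    # e.g. miR-96 vs. a small-RNA reference, DVT patient vs. healthy control
    fc = fold_change(24.0, 18.0, 26.0, 18.0)
    ```

    A lower target Cq relative to the reference in the patient sample translates directly into a fold increase; here the hypothetical patient shows a 4-fold elevation.
    
    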

  5. Quantitative analysis of pungent and anti-inflammatory phenolic compounds in olive oil by capillary electrophoresis.

    PubMed

    Vulcano, Isabella; Halabalaki, Maria; Skaltsounis, Leandros; Ganzera, Markus

    2015-02-15

    The first CE procedure for the quantitative determination of pharmacologically relevant secoiridoids in olive oil, oleocanthal and oleacein, is described. Together with their precursors tyrosol and hydroxytyrosol, they could be baseline separated in less than 15 min using a borax buffer at pH 9.5, at 25 kV and 30°C. Method validation confirmed that the procedure is selective, accurate (recovery rates from 94.0 to 104.6%), reproducible (σmax ⩽ 6.8%) and precise (inter-day precision ⩽ 6.4%), and that the compounds do not degrade quickly if non-aqueous acetonitrile is used as the solvent. Quantitative results indicated a low occurrence of oleocanthal (0.004-0.021%) and oleacein (0.002-0.048%) in the olive oil samples, which is in agreement with published HPLC data. The CE method impresses with its simple instrumental and methodological design, combined with reproducible and valid quantitative results. PMID:25236241

  6. Scanning-electron-microscope image processing for accurate analysis of line-edge and line-width roughness

    NASA Astrophysics Data System (ADS)

    Hiraiwa, Atsushi; Nishida, Akio

    2012-03-01

    The control of line-edge or line-width roughness (LER/LWR) is a challenge, especially for future devices that are fabricated using extreme-ultraviolet lithography. Accurate analysis of LER/LWR plays an essential role in this challenge and requires the noise in scanning-electron-microscope (SEM) images to be reduced by appropriate image processing prior to analysis. To achieve this, the authors simulated SEM images using the Monte-Carlo method and detected line edges in experimental and theoretical images after noise filtering, using new image-analysis software. The validity of the simulation and software was confirmed by a good agreement between the experimental and theoretical results. When the image pixels aligned perpendicular (crosswise) to line edges were averaged, the variance var(φ) that was additionally induced by the image noise decreased with the number NPIX,X of averaged pixels but turned to increase for relatively large NPIX,X. The real LER/LWR, however, remained unaffected. On the other hand, averaging image pixels aligned parallel (longitudinal) to line edges not only reduced var(φ) but also smoothed the real LER/LWR. As a result, the nominal variance of the real LWR, obtained using simple arithmetic, monotonically decreased with the number NPIX,L of averaged pixels. Artifactual oscillations were additionally observed in power spectral densities. var(φ) in this case decreased in inverse proportion to the square root of NPIX,L according to the statistical mechanism clarified here. In this way, image processing has a marked effect on LER/LWR analysis and needs to be applied with much more care. All the aforementioned results not only constitute a solid basis for, but also improve, previous empirical instructions for accurate analyses. The most important instruction is to avoid longitudinal averaging and to crosswise-average an optimized number of image pixels, consulting the equation derived in this

  7. A versatile pipeline for the multi-scale digital reconstruction and quantitative analysis of 3D tissue architecture

    PubMed Central

    Morales-Navarrete, Hernán; Segovia-Miranda, Fabián; Klukowski, Piotr; Meyer, Kirstin; Nonaka, Hidenori; Marsico, Giovanni; Chernykh, Mikhail; Kalaidzidis, Alexander; Zerial, Marino; Kalaidzidis, Yannis

    2015-01-01

    A prerequisite for the systems biology analysis of tissues is an accurate digital three-dimensional reconstruction of tissue structure based on images of markers covering multiple scales. Here, we designed a flexible pipeline for the multi-scale reconstruction and quantitative morphological analysis of tissue architecture from microscopy images. Our pipeline includes newly developed algorithms that address specific challenges of thick dense tissue reconstruction. Our implementation allows for a flexible workflow, scalable to high-throughput analysis and applicable to various mammalian tissues. We applied it to the analysis of liver tissue and extracted quantitative parameters of sinusoids, bile canaliculi and cell shapes, recognizing different liver cell types with high accuracy. Using our platform, we uncovered an unexpected zonation pattern of hepatocytes with different size, nuclei and DNA content, thus revealing new features of liver tissue organization. The pipeline also proved effective to analyse lung and kidney tissue, demonstrating its generality and robustness. DOI: http://dx.doi.org/10.7554/eLife.11214.001 PMID:26673893

  8. Reference Gene Identification for Reverse Transcription-Quantitative Polymerase Chain Reaction Analysis in an Ischemic Wound-Healing Model

    PubMed Central

    Ruedrich, Elizabeth D.; Henzel, Mary K.; Hausman, Bryan S.; Bogie, Kath M.

    2013-01-01

    Reference genes are often used in RT-quantitative PCR (qPCR) analysis to normalize gene expression levels to a gene that is expressed stably across study groups. They ultimately serve as a control in RT-qPCR analysis, producing more accurate interpretation of results. Whereas many reference genes have been used in various wound-healing studies, the most stable reference gene for ischemic wound-healing analysis has yet to be identified. The goal of this study was to determine systematically the most stable reference gene for studying gene expression in a rat ischemic wound-healing model using RT-qPCR. Twelve commonly used reference genes were analyzed using RT-qPCR and geNorm data analysis to determine stability across normal and ischemic skin tissue. It was ultimately determined that Ubiquitin C (UBC) and β-2 Microglobulin (B2M) are the most stably conserved reference genes across normal and ischemic skin tissue. UBC and B2M represent reliable reference genes for RT-qPCR studies in the rat ischemic wound model and are unaffected by sustained tissue ischemia. The geometric mean of these two stable genes provides an accurate normalization factor. These results provide insight on dependence of reference-gene stability on experimental parameters and the importance of such reference-gene investigations. PMID:24294111
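    The normalization factor described here (the geometric mean of the two stable reference genes) can be sketched from Cq values as follows; the Cq numbers are illustrative and 100% amplification efficiency is assumed:

    ```python
    import numpy as np

    def relative_quantity(cq):
        # highest expression (lowest Cq) set to 1; efficiency assumed to be 2.0
        cq = np.asarray(cq, dtype=float)
        return 2.0 ** (cq.min() - cq)

    def normalization_factor(cq_ubc, cq_b2m):
        # geNorm-style factor: geometric mean of the two stable reference genes
        return np.sqrt(relative_quantity(cq_ubc) * relative_quantity(cq_b2m))

    def normalized_expression(cq_target, cq_ubc, cq_b2m):
        # target expression divided by the per-sample normalization factor
        return relative_quantity(cq_target) / normalization_factor(cq_ubc, cq_b2m)

    # two samples, e.g. normal vs. ischemic skin (invented Cq values)
    expr = normalized_expression([25.0, 25.0], [20.0, 21.0], [18.0, 19.0])
    ```

    In this toy example the target gene has identical raw Cq in both samples, but normalization reveals a 2-fold difference once the per-sample reference-gene signal is accounted for, which is exactly what the normalization factor is for.
    
    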

  9. Reference gene identification for reverse transcription-quantitative polymerase chain reaction analysis in an ischemic wound-healing model.

    PubMed

    Ruedrich, Elizabeth D; Henzel, Mary K; Hausman, Bryan S; Bogie, Kath M

    2013-12-01

    Reference genes are often used in RT-quantitative PCR (qPCR) analysis to normalize gene expression levels to a gene that is expressed stably across study groups. They ultimately serve as a control in RT-qPCR analysis, producing more accurate interpretation of results. Whereas many reference genes have been used in various wound-healing studies, the most stable reference gene for ischemic wound-healing analysis has yet to be identified. The goal of this study was to determine systematically the most stable reference gene for studying gene expression in a rat ischemic wound-healing model using RT-qPCR. Twelve commonly used reference genes were analyzed using RT-qPCR and geNorm data analysis to determine stability across normal and ischemic skin tissue. It was ultimately determined that Ubiquitin C (UBC) and β-2 Microglobulin (B2M) are the most stably conserved reference genes across normal and ischemic skin tissue. UBC and B2M represent reliable reference genes for RT-qPCR studies in the rat ischemic wound model and are unaffected by sustained tissue ischemia. The geometric mean of these two stable genes provides an accurate normalization factor. These results provide insight on dependence of reference-gene stability on experimental parameters and the importance of such reference-gene investigations. PMID:24294111

  10. A superconducting quantum interference device magnetometer system for quantitative analysis and imaging of hidden corrosion activity in aircraft aluminum structures

    NASA Astrophysics Data System (ADS)

    Abedi, A.; Fellenstein, J. J.; Lucas, A. J.; Wikswo, J. P.

    1999-12-01

    We have designed and built a magnetic imaging system for quantitative analysis of the rate of ongoing hidden corrosion of aircraft aluminum alloys in planar structures such as intact aircraft lap joints. The system utilizes a superconducting quantum interference device (SQUID) magnetometer that measures the magnetic field associated with corrosion currents. It consists of a three-axis (vector) SQUID differential magnetometer; magnetic and rf shielding; a computer-controlled x-y stage; sample registration and positioning mechanisms; and data acquisition and analysis software. The system is capable of scanning planar samples with dimensions of up to 28 cm square, with a spatial resolution of 2 mm and a sensitivity of 0.3 pT/√Hz (at 10 Hz). In this article we report the design and technical issues related to this system, outline important data acquisition techniques and criteria for accurate measurements of the rate of corrosion, especially for weakly corroding samples, and present preliminary measurements.

  11. Analysis and Quantitation of Glycated Hemoglobin by Matrix Assisted Laser Desorption/Ionization Time of Flight Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Hattan, Stephen J.; Parker, Kenneth C.; Vestal, Marvin L.; Yang, Jane Y.; Herold, David A.; Duncan, Mark W.

    2016-03-01

    Measurement of glycated hemoglobin is widely used for the diagnosis and monitoring of diabetes mellitus. Matrix assisted laser desorption/ionization (MALDI) time of flight (TOF) mass spectrometry (MS) analysis of patient samples is used to demonstrate a method for quantitation of total glycation on the β-subunit of hemoglobin. The approach is accurate and calibrated with commercially available reference materials. Measurements were linear (R2 > 0.99) across the clinically relevant range of 4% to 20% glycation with coefficients of variation of ≤ 2.5%. Additional and independent measurements of glycation of the α-subunit of hemoglobin are used to validate β-subunit glycation measurements and distinguish hemoglobin variants. Results obtained by MALDI-TOF MS were compared with those obtained in a clinical laboratory using validated HPLC methodology. MALDI-TOF MS sample preparation was minimal and analysis times were rapid making the method an attractive alternative to methodologies currently in practice.
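    The calibration against commercial reference materials described here reduces to a linear fit of the measured peak-area ratio against certified % glycation; a sketch with invented calibrator values:

    ```python
    import numpy as np

    # calibrators: certified % glycation of the beta-subunit vs. measured
    # MALDI-TOF peak-area ratio, glycated / (glycated + unglycated)
    known_pct = np.array([4.0, 8.0, 12.0, 16.0, 20.0])
    area_ratio = np.array([0.045, 0.083, 0.122, 0.160, 0.199])

    slope, intercept = np.polyfit(area_ratio, known_pct, 1)
    r2 = float(np.corrcoef(area_ratio, known_pct)[0, 1]) ** 2

    def percent_glycation(ratio):
        # convert a measured area ratio of an unknown sample to % glycation
        return slope * ratio + intercept
    ```

    The reported linearity (R² > 0.99 over 4-20% glycation) corresponds to this calibration line holding across the clinical range, after which each patient spectrum needs only its area ratio.
    
    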

  12. Response Neighborhoods in Online Learning Networks: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2005-01-01

    The theoretical foundations of response mechanisms in networks of online learners are revealed by statistical analysis of p* Markov models for the networks. Our comparative analysis of two networks shows that the minimal-effort hunt-for-social-capital mechanism controls a major behavior of both networks: a negative tendency to respond. Differences in…

  13. [Research progress of quantitative analysis for respiratory sinus arrhythmia].

    PubMed

    Sun, Congcong; Zhang, Zhengbo; Wang, Buqing; Liu, Hongyun; Ang, Qing; Wang, Weidong

    2011-12-01

    Respiratory sinus arrhythmia (RSA) refers to fluctuations of heart rate associated with breathing. It has recently been used increasingly as a noninvasive index of cardiac vagal tone in psychophysiological research. Its analysis is often influenced or distorted by respiratory parameters, posture, and movement. This paper reviews five methods of quantification, including the root mean square of successive differences (RMSSD), peak-valley RSA (pvRSA), cosinor fitting, spectral analysis, and joint time-frequency analysis (JTFA). Paced breathing, analysis of covariance, the residual method, and msRSA per liter tidal volume are also discussed as adjustment strategies for the measurement and analysis of RSA. Finally, some prospects for solutions to the problems of RSA research are given. PMID:22295719
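    Of the five quantification methods listed, RMSSD is the simplest to state precisely: the root mean square of successive RR-interval differences. A minimal sketch:

    ```python
    import numpy as np

    def rmssd(rr_ms):
        # root mean square of successive differences of RR intervals (ms)
        d = np.diff(np.asarray(rr_ms, dtype=float))
        return float(np.sqrt(np.mean(d**2)))
    ```

    For the interval series 800, 810, 790, 805 ms the successive differences are 10, -20, 15 ms, so RMSSD = sqrt((100 + 400 + 225) / 3) ≈ 15.5 ms; larger values indicate stronger beat-to-beat (vagally mediated) variability.
    
    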

  14. Quantitative analysis of polydisperse systems via solvent-free matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Kulkarni, Sourabh U; Thies, Mark C

    2012-02-15

    Quantitative analysis of partially soluble and insoluble polydisperse materials is challenging due to the lack of both appropriate standards and reliable analytical techniques. To this end, matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) incorporating a solvent-free sample preparation technique was investigated for the quantitative analysis of partially soluble, polydisperse, polycyclic aromatic hydrocarbon (PAH) oligomers. Molecular weight standards consisting of narrow molecular weight dimer and trimer oligomers of the starting M-50 petroleum pitch were produced using both dense-gas/supercritical extraction (DGE/SCE) and preparative-scale, gel permeation chromatography (GPC). The validity of a MALDI-based, quantitative analysis technique using solvent-free sample preparation was first demonstrated by applying the method of standard addition to a pitch of known composition. The standard addition method was then applied to the quantitative analysis of two insoluble petroleum pitch fractions of unknown oligomeric compositions, with both the dimer and trimer compositions of these fractions being accurately determined. To our knowledge, this study represents the first successful MALDI application of solvent-free quantitative analysis to insoluble, polydisperse materials. PMID:22223328

  15. A quantitative analysis of 3-D coronary modeling from two or more projection images.

    PubMed

    Movassaghi, B; Rasche, V; Grass, M; Viergever, M A; Niessen, W J

    2004-12-01

    A method is introduced to examine the geometrical accuracy of the three-dimensional (3-D) representation of coronary arteries from multiple (two or more) calibrated two-dimensional (2-D) angiographic projections. When more than two projections are involved (multiprojection modeling), a novel procedure is presented that consists of fully automated centerline and width determination in all available projections, based on the information provided by the semi-automated centerline detection in two initial calibrated projections. The accuracy of the 3-D coronary modeling approach is determined by a quantitative examination of the 3-D centerline point position and the 3-D cross-sectional area of the reconstructed objects. The measurements are based on the analysis of calibrated phantom and calibrated coronary 2-D projection data. From this analysis, a confidence region (alpha approximately equal to [35 degrees - 145 degrees]) for the angular distance of the two initial projection images is determined for which the modeling procedure is sufficiently accurate for the applied system. Within this angular range the centerline position error is less than 0.8 mm, in terms of the Euclidean distance to a predefined ground truth. When more projections are involved using our new procedure, experiments show that when the initial pair of projection images has an angular distance in the range alpha approximately equal to [35 degrees - 145 degrees], the centerlines in all other projections (gamma = 0 degrees - 180 degrees) were indicated very precisely without any additional centering procedure. Involving additional projection images in the modeling procedure can provide a more realistic shape of the structure. In the case of a concave segment, however, the involvement of multiple projections does not necessarily provide a more realistic shape of the reconstructed structure. PMID:15575409

  16. Quantitative characterization of the regressive ecological succession by fractal analysis of plant spatial patterns

    USGS Publications Warehouse

    Alados, C.L.; Pueyo, Y.; Giner, M.L.; Navarro, T.; Escos, J.; Barroso, F.; Cabezudo, B.; Emlen, J.M.

    2003-01-01

    We studied the effect of grazing on the degree of regression of successional vegetation dynamics in a semi-arid Mediterranean matorral. We quantified the spatial distribution patterns of the vegetation by fractal analysis, using the fractal information dimension and spatial autocorrelation measured by detrended fluctuation analysis (DFA). This is the first time that fractal analysis of plant spatial patterns has been used to characterize regressive ecological succession. Plant spatial patterns were compared over a long-term grazing gradient (low, medium and heavy grazing pressure) and on ungrazed sites for two different plant communities: a middle dense matorral of Chamaerops and Periploca at Sabinar-Romeral and a middle dense matorral of Chamaerops, Rhamnus and Ulex at Requena-Montano. The two communities also differed in microclimatic characteristics (sea-oriented at the Sabinar-Romeral site and inland-oriented at the Requena-Montano site). The information fractal dimension increased as we moved from a middle dense matorral to discontinuous and scattered matorral and, finally, to the late regressive-succession Stipa steppe stage. At this stage a drastic change in the fractal dimension revealed a change in the vegetation structure, accurately indicating end-successional vegetation stages. Long-term correlation analysis (DFA) revealed that an increase in grazing pressure leads to unpredictability (randomness) in species distributions, a reduction in diversity, and an increase in cover of the regressive successional species, e.g. Stipa tenacissima L. These comparisons provide a quantitative characterization of the successional dynamics of plant spatial patterns in response to a grazing perturbation gradient. © 2002 Elsevier Science B.V. All rights reserved.
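
    The DFA procedure mentioned above (integrate the series, detrend within windows of increasing size, and read the scaling exponent off a log-log fit) can be sketched as follows. The window sizes and test signal are illustrative assumptions, not the study's data; white noise should yield an exponent near 0.5:

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: return the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        f2 = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))         # RMS fluctuation at scale n
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(0)
alpha_white = dfa_alpha(rng.standard_normal(4000))  # expect ~0.5 (uncorrelated)
```

Long-range-correlated series give alpha above 0.5, which is how DFA distinguishes structured from random spatial patterns.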

  17. Quantitative analysis of deuterium using the isotopic effect on quaternary (13)C NMR chemical shifts.

    PubMed

    Darwish, Tamim A; Yepuri, Nageshwar Rao; Holden, Peter J; James, Michael

    2016-07-13

    Quantitative analysis of specifically deuterated compounds can be achieved by a number of conventional methods, such as mass spectrometry, or by quantifying the residual (1)H NMR signals against signals from internal standards. However, site-specific quantification with these methods becomes challenging when dealing with non-specifically or randomly deuterated compounds, such as those produced by metal-catalyzed hydrothermal reactions in D2O, one of the most convenient deuteration methods. In this study, deuterium-induced NMR isotope shifts of quaternary (13)C resonances neighboring deuterated sites were utilized to quantify the degree of isotope labeling of molecular sites in non-specifically deuterated molecules. By probing (13)C NMR signals while decoupling both proton and deuterium nuclei, it is possible to resolve the (13)C resonances of the different isotopologues based on the isotopic shifts and the degree of deuteration of the carbon atoms. We demonstrate that across isotopologues, the same quaternary carbon neighboring partially deuterated carbon atoms is affected to an equal extent by relaxation. Decoupling both nuclei ((1)H, (2)H) resolves closely separated quaternary (13)C signals of the different isotopologues, and allows their accurate integration and quantification under short relaxation delays (D1 = 1 s) and hence fast accumulative spectral acquisition. We applied a number of approaches to quantify the deuterium content at different specific sites, demonstrating a convenient and generic analysis method for randomly deuterated molecules, or for specifically deuterated molecules where back-exchange processes may take place during work-up. PMID:27237841

  18. Quantitative analysis of numerical solvers for oscillatory biomolecular system models

    PubMed Central

    Quo, Chang F; Wang, May D

    2008-01-01

    Background This article provides guidelines for selecting optimal numerical solvers for biomolecular system models. Because various parameters of the same system can have drastically different ranges, from 10^-15 to 10^10, the ODEs can be stiff and ill-conditioned, resulting in non-unique, non-existing, or non-reproducible modeling solutions. Previous studies have not examined in depth how to best select numerical solvers for biomolecular system models, which makes it difficult to experimentally validate the modeling results. To address this problem, we have chosen one of the well-known stiff initial value problems with limit cycle behavior as a test-bed system model. Solving this model, we have illustrated that different answers may result from different numerical solvers. We use MATLAB numerical solvers because they are optimized and widely used by the modeling community. We have also conducted a systematic study of numerical solver performance by using qualitative and quantitative measures such as convergence, accuracy, and computational cost (i.e. in terms of function evaluations, partial derivatives, LU decompositions, and "take-off" points). The results show that the modeling solutions can be drastically different using different numerical solvers. Thus, it is important to select numerical solvers intelligently when solving biomolecular system models. Results The classic Belousov-Zhabotinskii (BZ) reaction is described by the Oregonator model and is used as a case study. We report two guidelines for selecting optimal numerical solver(s) for stiff, complex oscillatory systems: (i) for problems with unknown parameters, ode45 is the optimal choice regardless of the relative error tolerance; (ii) for known stiff problems, both ode113 and ode15s are good choices under strict relative tolerance conditions.
Conclusions For any given biomolecular model, by building a library of numerical solvers with a quantitative performance-assessment metric, we show that it is possible
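
    The abstract's core point, that solver choice can change the answer drastically on stiff problems, can be reproduced on a toy model. This is a hedged sketch using a simple stiff linear ODE (not the Oregonator) and hand-rolled Euler steps rather than the MATLAB solvers discussed above:

```python
import math

# Stiff test ODE: y' = -1000*(y - cos(t)), y(0) = 0.
# The true solution relaxes almost instantly onto y(t) ~ cos(t).
LAM, H, T_END = 1000.0, 0.01, 1.0

def explicit_euler():
    t, y = 0.0, 0.0
    while t < T_END - 1e-12:
        y += H * (-LAM * (y - math.cos(t)))   # stable only if H < 2/LAM
        t += H
    return y

def implicit_euler():
    t, y = 0.0, 0.0
    while t < T_END - 1e-12:
        t += H
        y = (y + H * LAM * math.cos(t)) / (1.0 + H * LAM)  # linear implicit step
    return y

y_exp, y_imp = explicit_euler(), implicit_euler()
# Explicit Euler violates its stability bound here and blows up;
# implicit (backward) Euler stays near cos(1) ~ 0.5403.
```

The same mechanism is why MATLAB's stiff solvers (ode15s, ode23s) succeed on the Oregonator at step sizes where non-stiff solvers either fail or crawl.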

  19. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analysis and integration of quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. PMID:21705250
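
    The batch workflow described, integrating the same named regions across many spectra and exporting a CSV table for downstream analysis, might look like the following sketch. The spectra, region names, and ppm bounds are hypothetical, and a real tool would use proper numerical quadrature rather than a plain sum:

```python
import csv
import io

def integrate_region(spectrum, lo, hi):
    """Sum intensities of (ppm, intensity) points falling inside [lo, hi]."""
    return sum(y for ppm, y in spectrum if lo <= ppm <= hi)

def batch_to_csv(spectra, regions):
    """One CSV row per spectrum, one column per named integration region."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["spectrum"] + [name for name, _, _ in regions])
    for label, spec in spectra.items():
        w.writerow([label] + [integrate_region(spec, lo, hi) for _, lo, hi in regions])
    return buf.getvalue()

# Hypothetical toy spectra and integration regions
spectra = {
    "sample_01": [(1.0, 2.0), (1.1, 3.0), (3.5, 5.0)],
    "sample_02": [(1.0, 1.0), (1.1, 1.0), (3.5, 8.0)],
}
regions = [("CH3", 0.9, 1.2), ("CH", 3.4, 3.6)]
table = batch_to_csv(spectra, regions)
```

The resulting CSV loads directly into a spreadsheet or Matlab, which is the hand-off the abstract describes.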

  20. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analysis and integration of quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  1. Noise filtering of scanning-electron-microscope images for accurate analysis of line-edge and line-width roughness

    NASA Astrophysics Data System (ADS)

    Hiraiwa, Atsushi; Nishida, Akio

    2012-10-01

    The control of line-edge or line-width roughness (LER/LWR) is a challenge, especially for future devices that are fabricated using extreme-ultraviolet (EUV) lithography. Accurate analysis of LER/LWR plays an essential role in this challenge and requires the noise involved in scanning-electron-microscope (SEM) images to be reduced by appropriate noise filtering prior to analysis. To achieve this, we simulated SEM images using a Monte Carlo method, and detected line edges in both experimental and theoretical images after noise filtering using new image-analysis software. The validity of this software and these simulations was confirmed by good agreement between the experimental and theoretical results. When image pixels aligned perpendicular (crosswise) to the line edges were averaged, the variance var(φ) additionally induced by the image noise decreased with the number N of averaged pixels, except when N was relatively large, whereupon the variance increased. The optimal N to minimize var(φ) was formulated based on a statistical mechanism of this change. LER/LWR statistics estimated using the crosswise filtering remained unaffected when N was smaller than the aforementioned optimal value, but changed monotonically when N was larger, contrary to expectations. This change was possibly caused by an asymmetric scan-signal profile at edges. On the other hand, averaging image pixels aligned parallel (longitudinally) to the line edges not only reduced var(φ) but also smoothed real LER/LWR. As a result, the nominal variance of real LWR, obtained using simple arithmetic, monotonically decreased with the number N of averaged pixels. Artifactual oscillations were additionally observed in the power spectral densities. Var(φ) in this case decreased in inverse proportion to the square root of N, according to the statistical mechanism clarified here. In this way, noise filtering has a marked effect on LER/LWR analysis and needs to be appropriately
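
    The basic statistical mechanism behind crosswise averaging, noise variance falling roughly as 1/N when N independent pixels are averaged, can be checked numerically. This toy model of i.i.d. Gaussian pixel noise is an assumption for illustration and ignores the large-N edge-profile effects the paper reports:

```python
import numpy as np

rng = np.random.default_rng(3)
TRUE_EDGE = 100.0     # hypothetical noise-free edge signal level
NOISE_SIGMA = 5.0     # hypothetical per-pixel noise standard deviation

def averaged_noise_var(n_pixels, n_trials=20000):
    """Variance of the mean of n_pixels i.i.d. noisy pixel values."""
    samples = rng.normal(TRUE_EDGE, NOISE_SIGMA, size=(n_trials, n_pixels))
    return samples.mean(axis=1).var()

v1 = averaged_noise_var(1)   # ~ sigma^2 = 25
v4 = averaged_noise_var(4)   # ~ sigma^2 / 4 = 6.25
```

In real SEM images the pixels are not perfectly independent and the edge profile varies, which is why an optimal finite N exists rather than "the larger the better".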

  2. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. These techniques are often used in conjunction with cinematography and videography for analysis of planar motion and, to a lesser degree, three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  3. Space-to-Ground Communication for Columbus: A Quantitative Analysis

    PubMed Central

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers of efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  4. Quantitative analysis on electric dipole energy in Rashba band splitting.

    PubMed

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba-split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution with the external electric field, and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit, starting from the tight-binding calculation, also supports the conclusion that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493

  5. Quantitative analysis on electric dipole energy in Rashba band splitting

    PubMed Central

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-01-01

    We report a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba-split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution with the external electric field, and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit, starting from the tight-binding calculation, also supports the conclusion that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime. PMID:26323493

  6. Quantitative analysis of electroluminescence images from polymer solar cells

    NASA Astrophysics Data System (ADS)

    Seeland, Marco; Rösch, Roland; Hoppe, Harald

    2012-01-01

    We introduce the micro-diode model (MDM), based on a discrete network of interconnected diodes, which allows a quantitative description of lateral electroluminescence emission images obtained from organic bulk-heterojunction solar cells. Besides the distributed solar cell description, the equivalent-circuit network model considers interface and bulk resistances as well as the sheet resistance of the semitransparent electrode. Applying this model allows direct calculation of the lateral current and voltage distribution within the solar cell and thus accounts well for effects known as current crowding. In addition, network parameters such as the internal resistances and the sheet resistance of the more resistive electrode can be determined. Furthermore, upon introduction of current sources, the micro-diode model is also able to describe and predict current-voltage characteristics for solar cell devices under illumination. The local nature of this description yields important conclusions concerning the geometry-dependent performance and the validity of classical models and equivalent circuits describing thin-film solar cells.

  7. Quantitative analysis of ultrasound images for computer-aided diagnosis.

    PubMed

    Wu, Jie Ying; Tuomi, Adam; Beland, Michael D; Konrad, Joseph; Glidden, David; Grand, David; Merck, Derek

    2016-01-01

    We propose an adaptable framework for analyzing ultrasound (US) images quantitatively to provide computer-aided diagnosis using machine learning. Our preliminary clinical targets are hepatic steatosis, adenomyosis, and craniosynostosis. For steatosis and adenomyosis, we collected US studies from 288 and 88 patients, respectively, as well as their biopsy- or magnetic resonance-confirmed diagnoses. Radiologists identified a region of interest (ROI) on each image. We filtered the US images for various texture responses and used the pixel intensity distribution within each ROI as feature parameterizations. Our craniosynostosis dataset consisted of 22 CT-confirmed cases and 22 age-matched controls. One physician manually measured the vectors from the center of the skull to the outer cortex at every 10 deg for each image, and we used the principal directions as shape features for parameterization. These parameters and the known diagnoses were used to train classifiers. Testing with cross-validation, we obtained 72.74% accuracy and 0.71 area under the receiver operating characteristic curve for steatosis ([Formula: see text]), 77.27% and 0.77 for adenomyosis ([Formula: see text]), and 88.63% and 0.89 for craniosynostosis ([Formula: see text]). Our framework is able to detect a variety of diseases with high accuracy. We hope to include it as a routinely available support system in the clinic. PMID:26835502
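
    The pipeline described (parameterize each ROI by its pixel-intensity distribution, then train a classifier on labeled cases) can be sketched with a deliberately simple stand-in classifier. The nearest-centroid rule, the feature set, and the synthetic bright-vs-dark ROIs below are illustrative assumptions, not the study's actual methods:

```python
import numpy as np

def roi_features(roi):
    """Intensity-distribution features of a region of interest: mean, std, skew."""
    x = np.asarray(roi, dtype=float).ravel()
    m, s = x.mean(), x.std()
    skew = np.mean(((x - m) / (s + 1e-12)) ** 3)
    return np.array([m, s, skew])

def nearest_centroid_predict(train_X, train_y, test_x):
    """Classify by Euclidean distance to per-class feature centroids."""
    labels = sorted(set(train_y))
    cents = {c: np.mean([f for f, y in zip(train_X, train_y) if y == c], axis=0)
             for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(test_x - cents[c]))

rng = np.random.default_rng(1)
# Hypothetical training ROIs: "disease" ROIs brighter on average than "normal"
normal = [rng.normal(80, 10, (16, 16)) for _ in range(10)]
disease = [rng.normal(120, 10, (16, 16)) for _ in range(10)]
X = [roi_features(r) for r in normal + disease]
y = ["normal"] * 10 + ["disease"] * 10
pred = nearest_centroid_predict(X, y, roi_features(rng.normal(118, 10, (16, 16))))
```

A production system would swap in richer texture filters and a stronger classifier, evaluated by cross-validation as in the abstract.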

  8. Quantitative proteomic analysis of the Salmonella-lettuce interaction

    PubMed Central

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-01-01

    Human pathogens can internalize food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacteria and plant protein expression after Salmonella internalizes lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally responds to plant pathogens. PMID:24512637

  9. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis-vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked-equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255–65, 2008) are discussed. PMID:23121132
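
    The linked-equilibria arithmetic in the abstract reduces to multiplying two constants. A sketch with made-up numbers (the paper reports only the final ~100 nM value), where the transfer constant is assumed to be defined for the direction protein·ligand + CD ⇌ protein + CD·ligand:

```python
def protein_ligand_kd(k_transfer, kd_cyclodextrin_nM):
    # Kd(P·L) = K_tr * Kd(CD·L):
    #   P·L <-> P + L        Kd_P  = [P][L]/[P·L]
    #   CD·L <-> CD + L      Kd_CD = [CD][L]/[CD·L]
    #   P·L + CD <-> P + CD·L   K_tr = Kd_P / Kd_CD
    return k_transfer * kd_cyclodextrin_nM

# Hypothetical values chosen to land on the reported ~100 nM
kd_nM = protein_ligand_kd(k_transfer=0.2, kd_cyclodextrin_nM=500.0)
```

The point of the construction is that neither equilibrium needs the free ligand to be directly soluble at measurable concentrations; the cyclodextrin complex carries it.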

  10. Quantitative Dopant/Impurity Analysis for ICF Targets

    NASA Astrophysics Data System (ADS)

    Huang, Haibo; Nikroo, Abbas; Stephens, Richard; Eddinger, Samual; Xu, Hongwei; Chen, K. C.; Moreno, Kari

    2008-11-01

    We developed a number of new or improved metrology techniques to measure the spatial distributions of multiple elements in ICF ablator capsules to tight NIF specifications (0.5±0.1 at% Cu, 0.25±0.10 at% Ar, 0.4±0.4 at% O). The elements are either the graded dopants for shock timing, such as Cu in Be, or process-induced impurities, such as Ar and O. Their low concentration, high spatial variation and simultaneous presence make the measurement very difficult. We solved this metrology challenge by combining several techniques: Cu and Ar profiles can be nondestructively measured by operating Contact Radiography (CR) in a differential mode. The result, as well as the O profile, can be checked destructively by a quantitative Energy Dispersive Spectroscopy (EDS) method. Non-spatially resolved methods, such as absorption edge spectroscopy (and to a lesser accuracy, x-ray fluorescence) can calibrate the Ar and Cu measurement in EDS and CR. In addition, oxygen pick-up during mandrel removal can be validated by before-and-after CR and by density change. Use of all these methods gives multiple checks on the reported profiles.

  11. Analysis of alpha-synuclein-associated proteins by quantitative proteomics.

    PubMed

    Zhou, Yong; Gu, Guangyu; Goodlett, David R; Zhang, Terry; Pan, Catherine; Montine, Thomas J; Montine, Kathleen S; Aebersold, Ruedi H; Zhang, Jing

    2004-09-10

    To identify the proteins associated with soluble alpha-synuclein (AS) that might promote AS aggregation, a key event leading to neurodegeneration, we quantitatively compared protein profiles of AS-associated protein complexes in MES cells exposed to rotenone, a pesticide that produces parkinsonism in animals and induces Lewy body (LB)-like inclusions in the remaining dopaminergic neurons, and to vehicle. We identified more than 250 proteins associated with Nonidet P-40 soluble AS, and demonstrated that at least 51 of these proteins displayed significant differences in their relative abundance in AS complexes under conditions where rotenone was cytotoxic and induced formation of cytoplasmic inclusions immunoreactive to anti-AS. Overexpressing one of these proteins, heat shock protein (hsp) 70, not only protected cells from rotenone-mediated cytotoxicity but also decreased soluble AS aggregation. Furthermore, the protection afforded by hsp70 transfection appeared to be related to suppression of rotenone-induced oxidative stress as well as mitochondrial and proteasomal dysfunction. PMID:15234983

  12. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

    Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining theoretical and numerical approaches, we study here the trafficking of single and multiple viruses and DNA particles through the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations imposed on the delivery of plasmid or viral particles to a nuclear pore by various types of direct and indirect hydrolysis activity inside the cytoplasm. By replacing the switching dynamics with a single steady-state stochastic description, we obtain estimates for the probability and the mean time for the first of many particles to travel from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and to estimate quantitatively the success of nuclear delivery.
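
    The switching dynamics described, epochs of diffusion alternating with active transport plus a degradation rate, can be caricatured in a one-dimensional Monte Carlo simulation. All parameters and the geometry below are invented for illustration and are not taken from the paper:

```python
import random

def simulate(n=1000, L=10.0, dt=0.01, step=0.3, v=5.0, p_active=0.3, k=0.05,
             t_max=200.0, seed=2):
    """Particles start at x=0 (membrane) and must reach x>=L (nuclear pore).
    Each step: possible degradation, then either a microtubule ride (+v*dt)
    or a diffusive jump (+/- step). Returns (arrival prob., mean arrival time)."""
    rng = random.Random(seed)
    arrivals = []
    for _ in range(n):
        x, t = 0.0, 0.0
        while t < t_max:
            if rng.random() < k * dt:        # hydrolysis / degradation event
                break
            if rng.random() < p_active:      # epoch of active transport
                x += v * dt
            else:                            # epoch of free diffusion
                x += rng.choice((-step, step))
            t += dt
            if x >= L:
                arrivals.append(t)
                break
    p_arrive = len(arrivals) / n
    mean_t = sum(arrivals) / len(arrivals) if arrivals else float("inf")
    return p_arrive, mean_t

p_arrive, mean_t = simulate()
```

Even this caricature reproduces the qualitative trade-off in the paper: raising the degradation rate k cuts the arrival probability, while raising the active-transport fraction shortens the mean arrival time.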

  13. Quantitative proteomic analysis of the Salmonella-lettuce interaction.

    PubMed

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-11-01

    Human pathogens can internalize food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacteria and plant protein expression after Salmonella internalizes lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally responds to plant pathogens. PMID:24512637

  14. Quantitative Analysis of CME Deflections in the Corona

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Shen, Chenglong; Wang, Yuming; Ye, Pinzhong; Liu, Jiajia; Wang, Shui; Zhao, Xuepu

    2011-07-01

    In this paper, ten CME events viewed by the STEREO twin spacecraft are analyzed to study the deflections of CMEs during their propagation in the corona. Based on the three-dimensional information of the CMEs derived by the graduated cylindrical shell (GCS) model (Thernisien, Howard, and Vourlidas in Astrophys. J. 652, 1305, 2006), it is found that the propagation directions of eight CMEs had changed. By applying the theoretical method proposed by Shen et al. ( Solar Phys. 269, 389, 2011) to all the CMEs, we found that the deflections are consistent, in strength and direction, with the gradient of the magnetic energy density. There is a positive correlation between the deflection rate and the strength of the magnetic energy density gradient and a weak anti-correlation between the deflection rate and the CME speed. Our results suggest that the deflections of CMEs are mainly controlled by the background magnetic field and can be quantitatively described by the magnetic energy density gradient (MEDG) model.

  15. Analysis of quantitative trait loci for behavioral laterality in mice.

    PubMed Central

    Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Améziane; Migliore-Samour, Danièle

    2003-01-01

    Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of the left or right paw) and degree of laterality (absolute difference between the use of the left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and a second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4, and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540
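    The two phenotypes defined above, and the LOD statistic used to score linkage, are simple enough to state in code. This is an illustrative sketch under the abstract's definitions, not the authors' mapping software; the paw-use counts and likelihoods are hypothetical.

```python
import math

def laterality(left_uses, right_uses):
    """Direction (signed side preference) and degree (absolute difference)
    of paw-use laterality, per the abstract's definitions."""
    direction = right_uses - left_uses    # > 0 means right-preferent
    degree = abs(right_uses - left_uses)  # magnitude, ignoring side
    return direction, degree

def lod_score(likelihood_linked, likelihood_unlinked):
    """LOD = log10 of the likelihood ratio for linkage at a locus."""
    return math.log10(likelihood_linked / likelihood_unlinked)

d, g = laterality(left_uses=8, right_uses=30)  # one hypothetical F2 mouse
```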

  16. Quantitative Proteome Analysis of Leishmania donovani under Spermidine Starvation

    PubMed Central

    Singh, Shalini; Dubey, Vikash Kumar

    2016-01-01

    We have earlier reported antileishmanial activity of hypericin by spermidine starvation. In the current report, we have used a label-free proteome quantitation approach to identify differentially modulated proteins after hypericin treatment. A total of 141 proteins were found to be differentially regulated, with ANOVA P values less than 0.05, in hypericin-treated Leishmania promastigotes. The differentially modulated proteins have been broadly classified under nine major categories. An increase in ribosomal protein S7 suggests repression of translation. Inhibition of proteins related to the ubiquitin-proteasome system, an RNA binding protein, and a translation initiation factor also suggests altered translation. We have also observed increased expression of Hsp 90, Hsp 83-1 and stress inducible protein 1, while a significantly decreased level of cyclophilin was observed. These stress-related proteins could be the cellular response of the parasite to hypericin-induced cellular stress. Altered expression of proteins involved in metabolism, biosynthesis and replication of nucleic acids, flagellar movement, and signalling also indicates defects in these pathways. The data were analyzed rigorously to gain further insight into hypericin-induced parasitic death. PMID:27123864

  17. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centers. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  18. Temporal kinetics and quantitative analysis of Cryptococcus neoformans nonlytic exocytosis.

    PubMed

    Stukes, Sabriya A; Cohen, Hillel W; Casadevall, Arturo

    2014-05-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  19. Quantitative analysis of synaptic release at the photoreceptor synapse.

    PubMed

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B

    2010-05-19

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca(2+) and exhibits an unusually shallow dependence on presynaptic Ca(2+). To provide a quantitative description of the photoreceptor Ca(2+) sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca(2+)-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca(2+): exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca(2+) binding sites on the rod Ca(2+) sensor rather than the typical four or five. For most models, the on-rates for Ca(2+) binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca(2+) unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca(2+) sensor, slow Ca(2+) unbinding may support the fusion of vesicles located at a distance from Ca(2+) channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
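    The occupancy question tested above can be illustrated with the simplest conventional (non-allosteric) scheme, in which fusion requires all n independent Ca2+ binding sites to be occupied at equilibrium. This is an illustrative sketch, not one of the paper's fitted models, and all rate constants are hypothetical.

```python
def fusion_rate(ca, k_on, k_off, k_max, n_sites=2):
    """Steady-state fusion rate for a sensor with n independent Ca2+ sites:
    each site is occupied with probability ca/(ca + Kd), Kd = k_off/k_on,
    and fusion (rate k_max) requires all sites bound."""
    kd = k_off / k_on
    occupancy = ca / (ca + kd)
    return k_max * occupancy**n_sites

# Hypothetical parameters: slow unbinding gives a sub-micromolar Kd
r2 = fusion_rate(ca=0.5e-6, k_on=1e8, k_off=50.0, k_max=3000.0, n_sites=2)
r4 = fusion_rate(ca=0.5e-6, k_on=1e8, k_off=50.0, k_max=3000.0, n_sites=4)
```

    At submicromolar Ca2+ the two-site sensor (r2) fuses far faster than a four-site one (r4), which is the sense in which fewer binding sites produce a shallower Ca2+ dependence.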

  20. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317

  1. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence on the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
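    Subject-dependent task-pair selection as described above amounts to picking, per subject, the pair with the best cross-validated accuracy. A minimal sketch (not the authors' pipeline; subject IDs, task-pair labels, and accuracies are all hypothetical):

```python
def best_task_pair(accuracies):
    """For each subject, pick the imagery task-pair with the highest
    cross-validated binary classification accuracy.
    accuracies: {subject: {task_pair: accuracy}}"""
    return {s: max(pairs, key=pairs.get) for s, pairs in accuracies.items()}

# Hypothetical first-day accuracies for three task-pairs and two subjects
acc = {
    "s01": {("hands", "feet"): 0.72, ("left", "right"): 0.65, ("feet", "tongue"): 0.80},
    "s02": {("hands", "feet"): 0.90, ("left", "right"): 0.62, ("feet", "tongue"): 0.70},
}
chosen = best_task_pair(acc)
```

    The paper's cross-day transfer question then asks whether `chosen` computed on day one remains a good choice on later days.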

  2. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  3. Quantitative analysis of iontophoretic drug delivery from micropipettes.

    PubMed

    Kirkpatrick, D C; Walton, L R; Edwards, M A; Wightman, R M

    2016-03-21

    Microiontophoresis is a drug delivery method in which an electric current is used to eject molecular species from a micropipette. It has been used primarily for neurochemical investigations but is limited by the difficulty of controlling and determining the ejected quantity. Consequently, the concentration of an ejected species and the extent of the affected region must be estimated by various methods of approximation. To address this, we investigated the principles underlying ejection rates and examined the concentration distribution in microiontophoresis using a combination of electrochemical, chromatographic, and fluorescence-based approaches, with a principal focus on how the iontophoretic barrel solution affects ejection characteristics. The ion ejection rate displayed a direct correspondence to the ionic mole fraction, regardless of the ejection current polarity. In contrast, neutral molecules are ejected by electroosmotic flow (EOF) at a rate proportional to the barrel solution concentration. Furthermore, the presence of EOF was observed from barrels containing high ionic strength solutions. In practice, use of a retaining current draws extracellular ions into the barrel and will alter the barrel solution composition. Even in the absence of a retaining current, diffusional exchange at the barrel tip will occur; thus, the behavior of successive ejections may differ slightly. To account for this, electrochemical or fluorescence markers can be incorporated into the barrel solution to compare ejection quantities. These markers may also be used to estimate the ejected amount and distribution, provided calibration procedures are applied accurately. PMID:26890395
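    The observed proportionality between ion ejection rate and ionic mole fraction has a simple Faraday's-law reading: the fraction of the ejection current carried by the analyte ion scales with its mole fraction. The sketch below encodes that approximation; it is an illustration of the relationship, not the paper's calibration, and the current and composition values are hypothetical.

```python
FARADAY = 96485.0  # C/mol

def ejection_rate(current_A, mole_fraction, z=1):
    """Approximate iontophoretic ejection rate (mol/s): the analyte's
    transference is taken proportional to its ionic mole fraction, so
    rate ~ I * x_ion / (z * F) for an ion of charge z."""
    return current_A * mole_fraction / (z * FARADAY)

# 100 nA ejection current, analyte at half the total ionic content
r = ejection_rate(current_A=100e-9, mole_fraction=0.5)
```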

  4. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    PubMed

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

    Given the characteristics of spontaneous combustion gases, namely the variety of gases involved, the low limits of detection required, and the critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. The analysis method is first introduced by combining the characteristics of the analytes' absorption spectra with the analysis requirements. Parameter setting, sample preparation, feature variable extraction, and analysis model building are taken into consideration, and the methods of sample preparation, feature extraction, and model building are described in detail. Eleven gases were then tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm and the spectral resolution was set to 1 cm(-1). The test results show that the detection limit for all analytes is less than 2 x 10(-6), which meets the measurement requirements for spontaneous combustion gases. FTIR may therefore be an ideal instrument, and the analysis method used in this paper is suitable for on-line measurement of spontaneous combustion gases. PMID:22097853
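    Single-component FTIR quantitation of this kind ultimately rests on the Beer-Lambert law, A = epsilon * c * l. A minimal sketch (not the paper's multi-gas model; the absorptivity and absorbance values are hypothetical, while the 10 cm default matches the cell path length quoted above):

```python
def concentration_from_absorbance(absorbance, absorptivity, path_cm=10.0):
    """Beer-Lambert inversion: c = A / (epsilon * l).
    `absorptivity` (per ppm per cm) would come from calibration with
    standard gases; `path_cm` is the gas cell path length."""
    return absorbance / (absorptivity * path_cm)

# Hypothetical CO band: absorptivity 0.02 (ppm*cm)^-1, measured A = 0.4
c_ppm = concentration_from_absorbance(0.4, absorptivity=0.02)
```

    Real spontaneous-combustion mixtures need a multivariate model over selected bands, since the hydrocarbon absorption features overlap.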

  5. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli

    PubMed Central

    Hur, Kwang-Ho; Mueller, Joachim D.

    2015-01-01

    The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell. PMID:26099032
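    The "mean Q-value of segmented photon count data" idea above can be illustrated with Mandel's Q, which is zero for a Poisson (single-fluorophore shot-noise) trace and grows with molecular brightness. This is an illustrative sketch only; the published MSQ estimator differs in detail, and the count trace is hypothetical.

```python
def mean_segmented_q(counts, segment_len):
    """Split a photon-count trace into non-overlapping segments, compute
    Mandel's Q = variance/mean - 1 for each, and average. Segmenting
    reduces bias from slow drifts such as photobleaching."""
    qs = []
    for i in range(0, len(counts) - segment_len + 1, segment_len):
        seg = counts[i:i + segment_len]
        m = sum(seg) / len(seg)
        var = sum((k - m) ** 2 for k in seg) / len(seg)
        qs.append(var / m - 1.0)
    return sum(qs) / len(qs)

q = mean_segmented_q([4, 6, 5, 5, 3, 7, 6, 4], segment_len=4)
```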

  6. [Quantitative Analysis of Immuno-fluorescence of Nuclear Factor-κB Activation].

    PubMed

    Xiu, Min; He, Feng; Lou, Yuanlei; Xu, Lu; Xiong, Jieqi; Wang, Ping; Liu, Sisun; Guo, Fei

    2015-06-01

    The immuno-fluorescence technique can qualitatively detect nuclear translocation; translocation of NF-κB/p65 indicates activation of NF-κB signaling pathways. Immuno-fluorescence analysis software with independent property rights can quantitatively analyze the dynamic localization of NF-κB/p65 by computing relative fluorescence units in the nucleus and cytoplasm. We verified the quantitative analysis by Western blot. When we applied the software to the analysis of nuclear translocation in lipopolysaccharide (LPS)-induced (0.5 h, 1 h, 2 h, 4 h) primary human umbilical vein endothelial cells (HUVECs), we found that nuclear translocation peaked at 2 h, consistent with the Western blot verification results, indicating that this immuno-fluorescence analysis software can be applied to the quantitative analysis of immuno-fluorescence. PMID:26485997
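    The quantitation described above reduces to a nuclear-to-cytoplasmic fluorescence ratio computed from masked pixel intensities. A minimal sketch (not the described software; the per-pixel intensities are hypothetical):

```python
def nuclear_translocation_ratio(nuc_intensities, cyt_intensities):
    """Mean NF-κB/p65 signal inside the nuclear mask divided by the mean
    inside the cytoplasmic mask; a rising ratio over the LPS time course
    indicates nuclear translocation."""
    nuc = sum(nuc_intensities) / len(nuc_intensities)
    cyt = sum(cyt_intensities) / len(cyt_intensities)
    return nuc / cyt

# Hypothetical per-pixel intensities before and 2 h after LPS stimulation
before = nuclear_translocation_ratio([10, 12, 11], [30, 28, 32])
after = nuclear_translocation_ratio([40, 42, 41], [15, 14, 16])
```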

  7. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.
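    A basic computation behind video motion analysis of the kind described above is recovering a joint angle from digitized marker positions. The sketch below is generic, not APAS-specific; the marker coordinates are hypothetical.

```python
import math

def joint_angle(proximal, joint, distal):
    """Planar joint angle (degrees) at `joint` from three digitized 2-D
    marker positions, via the dot product of the two limb-segment vectors."""
    ax, ay = proximal[0] - joint[0], proximal[1] - joint[1]
    bx, by = distal[0] - joint[0], distal[1] - joint[1]
    cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Hypothetical shoulder-elbow-wrist markers: a right angle at the elbow
angle = joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))
```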

  8. Application of terahertz time-domain spectroscopy combined with chemometrics to quantitative analysis of imidacloprid in rice samples

    NASA Astrophysics Data System (ADS)

    Chen, Zewei; Zhang, Zhuoyong; Zhu, Ruohua; Xiang, Yuhong; Yang, Yuping; Harrington, Peter B.

    2015-12-01

    Terahertz time-domain spectroscopy (THz-TDS) has been utilized as an effective tool for quantitative analysis of imidacloprid in rice powder samples. Unlike previous studies, our sample preparation method mixed imidacloprid with rice powder instead of polyethylene. Terahertz time-domain transmission spectra of these mixed samples were then measured, and the absorption coefficient spectra of the samples over the frequency range 0.3 to 1.7 THz were obtained. The asymmetric least squares (AsLS) method was utilized to correct the sloped baselines present in the THz absorption coefficient spectra and to improve their signal-to-noise ratio. Chemometric methods, including partial least squares (PLS), support vector regression (SVR), interval partial least squares (iPLS), and backward interval partial least squares (biPLS), were used for quantitative model building and prediction. To achieve a reliable and unbiased estimation, bootstrapped Latin partitions were chosen as the approach for statistical cross-validation. Results showed that the mean root mean square error of prediction (RMSEP) for PLS (0.5%) was smaller than for SVR (0.7%), with both methods based on the whole absorption coefficient spectra. In addition, PLS performed better, with a lower RMSEP (0.3%), on the THz absorption coefficient spectra after AsLS baseline correction. Alternatively, the two variable selection methods, iPLS and biPLS, yielded models with improved predictions: compared with conventional PLS and SVR, the mean RMSEP values were 0.4% (iPLS) and 0.3% (biPLS) when the informative frequency ranges were selected. The results demonstrated that an accurate quantitative analysis of imidacloprid in rice powder samples could be achieved by terahertz time-domain transmission spectroscopy combined with chemometrics. Furthermore, these results demonstrate that THz time-domain spectroscopy can be used for quantitative determinations of other
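    The AsLS baseline correction used above (after Eilers and Boelens) finds a smooth baseline z minimizing a weighted least-squares term plus a second-difference roughness penalty, with asymmetric weights p for points above the baseline (peaks) and 1-p below. The sketch below uses dense linear algebra for clarity; real spectra would use sparse solvers, and the synthetic spectrum and parameter values are hypothetical.

```python
import numpy as np

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: iteratively solve
    (W + lam * D'D) z = W y, then reweight with p where y > z (peaks)
    and 1-p where y <= z, so the baseline hugs the lower envelope."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)  # second-difference operator
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + lam * D.T @ D, w * y)
        w = np.where(y > z, p, 1 - p)
    return z

# Hypothetical spectrum: sloped baseline plus one absorption peak
x = np.arange(100, dtype=float)
peak = 5.0 * np.exp(-0.5 * ((x - 50) / 3) ** 2)
y = 0.02 * x + peak
corrected = y - asls_baseline(y)
```

    After correction the sloped background is removed while the peak survives, which is why AsLS improved the PLS prediction error in the abstract.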

  9. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

    Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  10. Quantitative analysis of wrist electrodermal activity during sleep.

    PubMed

    Sano, Akane; Picard, Rosalind W; Stickgold, Robert

    2014-12-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to the prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high frequency patterns of EDA called "storms" were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of the EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely to occur in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These EDA high frequency peaks and high amplitude were sometimes associated with higher skin temperature, but more work is needed looking at neurological and other EDA elicitors in order to elucidate their complete behavior. PMID:25286449
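    The peak-threshold and storm criteria compared above can be illustrated with a toy detector: flag samples where skin conductance rises faster than a threshold, then call a run of peaks a "storm" if enough of them fall within a time window. This is a schematic sketch, not the paper's algorithm; the threshold, window, and conductance values are hypothetical.

```python
def detect_peaks(eda, threshold=0.05):
    """Indices where skin conductance (µS) rises by more than `threshold`
    from one sample to the next -- a simple amplitude-increase criterion."""
    return [i for i in range(1, len(eda)) if eda[i] - eda[i - 1] > threshold]

def is_storm(peak_times_s, min_peaks=5, window_s=60.0):
    """Storm criterion sketch: at least `min_peaks` peaks occur within
    some window of `window_s` seconds."""
    for i in range(len(peak_times_s) - min_peaks + 1):
        if peak_times_s[i + min_peaks - 1] - peak_times_s[i] <= window_s:
            return True
    return False

eda = [0.30, 0.32, 0.40, 0.41, 0.50, 0.50, 0.58]  # hypothetical trace, µS
peaks = detect_peaks(eda)
```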

  11. Accurate Analysis of the Change in Volume, Location, and Shape of Metastatic Cervical Lymph Nodes During Radiotherapy

    SciTech Connect

    Takao, Seishin; Tadano, Shigeru; Taguchi, Hiroshi; Yasuda, Koichi; Onimaru, Rikiya; Ishikawa, Masayori; Bengua, Gerard; Suzuki, Ryusuke; Shirato, Hiroki

    2011-11-01

    Purpose: To establish a method for the accurate acquisition and analysis of the variations in tumor volume, location, and three-dimensional (3D) shape of tumors during radiotherapy in the era of image-guided radiotherapy. Methods and Materials: Finite element models of lymph nodes were developed based on computed tomography (CT) images taken before the start of treatment and every week during the treatment period.