Science.gov

Sample records for additional quantitative analysis

  1. Quantitative Analysis of Polymer Additives with MALDI-TOF MS Using an Internal Standard Approach

    NASA Astrophysics Data System (ADS)

    Schwarzinger, Clemens; Gabriel, Stefan; Beißmann, Susanne; Buchberger, Wolfgang

    2012-06-01

    MALDI-TOF MS is used for the qualitative analysis of seven different polymer additives directly from the polymer without tedious sample pretreatment. Additionally, by using a solid sample preparation technique, which avoids the concentration gradient problems known to occur with dried droplets, and by adding tetraphenylporphyrin as an internal standard to the matrix, it is possible to perform quantitative analysis of additives directly from the polymer sample. Calibration curves for Tinuvin 770, Tinuvin 622, Irganox 1024, Irganox 1010, Irgafos 168, and Chimassorb 944 are presented, showing coefficients of determination between 0.911 and 0.990.
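
    The internal-standard approach reduces shot-to-shot signal variation by ratioing the analyte intensity to the internal-standard intensity before fitting the calibration line. A minimal sketch of that calculation (the concentrations and intensities below are hypothetical illustrations, not the paper's data):

```python
# Least-squares calibration of analyte/internal-standard intensity ratios.
# All numeric data here are hypothetical illustrations.

def fit_line(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def r_squared(x, y, slope, intercept):
    """Coefficient of determination of the fitted line."""
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Additive concentration (wt%) and measured intensities (analyte, internal standard)
conc = [0.05, 0.10, 0.20, 0.40]
analyte = [120.0, 230.0, 455.0, 910.0]
internal_std = [1000.0, 1020.0, 990.0, 1010.0]

# Ratioing cancels much of the shot-to-shot intensity fluctuation
ratios = [a / s for a, s in zip(analyte, internal_std)]
slope, intercept = fit_line(conc, ratios)
r2 = r_squared(conc, ratios, slope, intercept)
```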

  3. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400
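
    One classic invariant relation of the kind described above is Herrnstein's hyperbola for response rate under a single schedule, B = k·r/(r + re), whose parameters k and re can serve as higher-order dependent variables. A minimal sketch (synthetic data, and the standard reciprocal linearization rather than any specific method of Nevin's) recovers the parameters by linear regression:

```python
# Fit Herrnstein's hyperbola B = k*r/(r + re) via the reciprocal
# linearization 1/B = (re/k)*(1/r) + 1/k.
# Data are synthetic, generated from k = 100, re = 20.

reinforcement = [10.0, 20.0, 40.0, 80.0]                      # reinforcers/hour
responses = [100.0 * r / (r + 20.0) for r in reinforcement]   # responses/min

x = [1.0 / r for r in reinforcement]
y = [1.0 / b for b in responses]

n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # estimates re/k
intercept = (sy - slope * sx) / n                   # estimates 1/k

k = 1.0 / intercept      # asymptotic response rate
re = slope / intercept   # reinforcement attributable to background activities
```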

  4. The quantitative surface analysis of an antioxidant additive in a lubricant oil matrix by desorption electrospray ionization mass spectrometry

    PubMed Central

    Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S

    2013-01-01

    RATIONALE: Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS: The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS: An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm² additive on spot, with relative standard deviations in the range 3–14%. CONCLUSIONS: The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
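
    A limit of detection like the one reported above is commonly derived from the calibration slope and the variability of blank measurements via the 3.3·σ/slope convention; the paper does not state which convention it used, and the numbers below are hypothetical:

```python
# Limit of detection from blank variability and a calibration slope,
# using the common 3.3*sigma/slope convention.  All numbers are
# hypothetical illustrations, not the study's data.

def limit_of_detection(blank_replicates, slope):
    n = len(blank_replicates)
    mean = sum(blank_replicates) / n
    # Sample standard deviation of the blank responses
    sd = (sum((b - mean) ** 2 for b in blank_replicates) / (n - 1)) ** 0.5
    return 3.3 * sd / slope

# Hypothetical blank responses (area ratio) and calibration slope
blanks = [0.010, 0.013, 0.008, 0.011, 0.009]
slope = 0.15                                # response per (ng/mm^2)
lod = limit_of_detection(blanks, slope)     # in ng/mm^2
```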

  5. [Rapid Quantitative Analysis of Content of the Additive in Gasoline for Motor Vehicles by Near-Infrared Spectroscopy].

    PubMed

    Rong, Hai-teng; Song, Chun-feng; Yuan, Hong-fu; Li, Xiao-yu; Hu, Ai-qin; Xie, Jin-chun; Yan, De-lin

    2015-10-01

    This paper presents a new rapid quantitative method, based on near-infrared spectroscopy combined with oblique projection, for determining oxygenates and compounds not covered by the national standard in gasoline. Four types of gasoline were studied: blended gasoline, FCC refined gasoline, reformed gasoline, and desulfurized gasoline. Series of gasoline samples containing different concentrations and types of compounds were prepared, and their transmission spectra were measured with an FTIR spectrometer. The oblique projection method separates the analyte's spectral signal from the mixed spectrum signal; the content of the measured component is then obtained by projection calculations on the separated signal. The deviation between this method and the true content is low: the absolute error is less than 0.8 and the relative error less than 8%. For actual gasoline samples, comparison of this method with gas chromatography gives absolute errors below 0.85 and relative errors below 6.85%. The method overcomes a limitation of general multivariate calibration methods and is significant for the development of rapid, on-site NIR detection technology and for improving the quality of gasoline.
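
    The projection step isolates the analyte's contribution to a mixed spectrum along the subspace spanned by the interfering components; for one analyte and one interferent this is equivalent to taking the analyte coefficient of a joint least-squares fit. A toy sketch with made-up four-channel "spectra" (not real gasoline data):

```python
# Separate an analyte's contribution from a two-component mixed spectrum.
# t = analyte reference spectrum, b = interferent spectrum (both invented).
# Solving the normal equations of m ~ a*t + c*b recovers the analyte
# abundance a, the quantity an oblique projection onto t along b extracts.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

t = [1.0, 0.5, 0.1, 0.0]   # analyte reference spectrum
b = [0.2, 0.6, 1.0, 0.4]   # interferent spectrum
m = [2.0 * ti + 3.0 * bi for ti, bi in zip(t, b)]  # mixture: a=2, c=3

# Normal equations: [[t.t, t.b], [b.t, b.b]] @ [a, c] = [t.m, b.m]
tt, tb, bb = dot(t, t), dot(t, b), dot(b, b)
tm, bm = dot(t, m), dot(b, m)
det = tt * bb - tb * tb
a = (tm * bb - tb * bm) / det   # recovered analyte abundance
c = (tt * bm - tb * tm) / det   # recovered interferent abundance
```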

  6. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analyses. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life, and in addition incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  7. Effect of preservative addition on sensory and dynamic profile of Lucanian dry-sausages as assessed by quantitative descriptive analysis and temporal dominance of sensations.

    PubMed

    Braghieri, Ada; Piazzolla, Nicoletta; Galgano, Fernanda; Condelli, Nicola; De Rosa, Giuseppe; Napolitano, Fabio

    2016-12-01

    Quantitative descriptive analysis (QDA) was combined with temporal dominance of sensations (TDS) to assess the sensory properties of Lucanian dry-sausages manufactured either with added nitrate, nitrite, and L-ascorbic acid (NS) or without (NNS). Both QDA and TDS differentiated the two groups of sausages. NNS products were perceived with higher intensity of hardness (P<0.05) and tended to be perceived with higher intensities of flavor (P<0.10), pepper (P<0.20), and oiliness (P<0.20), while scoring lower in chewiness (P<0.20). TDS showed that in all the sausages hardness was the first dominant attribute; then, in NNS products flavor remained dominant until the end of tasting, whereas in NS products oiliness prevailed. In conclusion, TDS showed that the perception of some textural parameters, such as oiliness, during mastication was more dominant in NS products, whereas with conventional QDA this attribute appeared higher in sausages manufactured without preservatives. Therefore, TDS provided additional information for the description and differentiation of Lucanian sausages. PMID:27486959

  8. Detection of multivessel disease in patients with sustained myocardial infarction by thallium 201 myocardial scintigraphy: No additional value of quantitative analysis

    SciTech Connect

    Niemeyer, M.G.; Pauwels, E.K.; van der Wall, E.E.; Cramer, M.J.; Verzijlbergen, J.F.; Zwinderman, A.H.; Ascoop, C.A. )

    1989-01-01

    This study was performed to determine the value of visual and quantitative thallium 201 scintigraphy for the detection of multivessel disease in 67 patients with a sustained transmural myocardial infarction. The viability of the myocardial regions corresponding to pathologic Q-waves was also evaluated. Of the 67 patients, 51 had multivessel coronary artery disease (76%). The sensitivity of the exercise test was 53%, that of thallium scintigraphy 69% when interpreted visually and 67% when analysed quantitatively. The specificity of these methods was 69%, 56%, and 50%, respectively. Sixty-two infarct-related flow regions were detected by visual analysis of the thallium scans; total redistribution was observed in 11/62 regions (18%), partial redistribution in 26/62 (42%), and no redistribution in 25/62 (40%). The infarct-related areas with total redistribution on the thallium scintigrams were more likely to be associated with normal or hypokinetic wall motion (7/11: 64%) than the areas with a persistent defect (7/25: 28%) (P = 0.05), which were more often associated with akinetic or dyskinetic wall motion. Based on our results, it is concluded that (1) both visual and quantitative analysis of thallium exercise scintigraphy have limited value for predicting the presence or absence of multivessel coronary artery disease in patients with sustained myocardial infarction, and (2) exercise-induced thallium redistribution may occur within the infarct zone, suggesting the presence of viable but jeopardized myocardium in presumed fibrotic myocardial areas.
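
    Sensitivity and specificity figures like those above follow directly from a 2×2 contingency table of test outcome versus disease status. As a reminder of the arithmetic (the counts below are hypothetical, not the study's raw data):

```python
# Sensitivity and specificity from a 2x2 table of test result vs. disease.
# Counts are hypothetical, chosen only to illustrate the calculation.

tp, fn = 35, 16   # multivessel disease present: detected / missed
tn, fp = 9, 7     # multivessel disease absent: correctly negative / false alarm

sensitivity = tp / (tp + fn)   # fraction of diseased patients detected
specificity = tn / (tn + fp)   # fraction of non-diseased correctly negative
```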

  9. Quantitative image analysis of synovial tissue.

    PubMed

    van der Hall, Pascal O; Kraan, Maarten C; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Its hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the acquisition, storage, and evaluation of images with dedicated hardware and software. Major advantages of quantitative image analysis over traditional techniques include sophisticated calibration systems, interactivity, speed, and control of inter- and intraobserver variation. This results in a well controlled environment, which is essential for quality control and reproducibility, and helps to optimize sensitivity and specificity. To achieve this, an optimal quantitative image analysis system combines solid software engineering with easy interactivity for the operator. The system also needs to be as transparent as possible in generating the data, because a "black box design" will deliver uncontrollable results. Beyond these general aspects, for the analysis of synovial tissue specifically, the necessity of interactivity is highlighted by the added value of identifying and quantifying information present in areas such as the intimal lining layer, blood vessels, and lymphocyte aggregates. Speed is another important aspect of digital cytometry. Rapidly increasing numbers of samples, together with the accumulation of a variety of markers and detection techniques, have made traditional approaches such as manual quantification and semi-quantitative analysis impractical. It can be anticipated that the development of even more powerful computer systems with sophisticated software will further facilitate reliable analysis at high speed.

  10. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in evaluating the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques for this purpose. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. The first technique produces scaffolds with random, non-regular, rounded pore geometry, while the AM technique produces scaffolds with square-shaped interconnected pores of regular dimension. The final morphology of the AM scaffolds can therefore be predicted, and the resulting model can be used to validate the applied imaging and image analysis protocols. Here, an SR μ-CT image analysis approach is reported that effectively and accurately reveals the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.
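
    Extracting a pore-size distribution from a reconstructed μ-CT volume begins with segmenting the pore phase and labeling its connected components. A minimal 2-D sketch using breadth-first flood fill (the actual analysis is 3-D and considerably more sophisticated):

```python
# Label connected pore pixels (value 1) in a binary 2-D slice and collect
# the size of each pore.  A toy stand-in for 3-D connected-component analysis.
from collections import deque

def pore_sizes(grid):
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                # Breadth-first flood fill over 4-connected neighbors
                size, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    i, j = queue.popleft()
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and grid[ni][nj] == 1 and not seen[ni][nj]):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                sizes.append(size)
    return sorted(sizes)

# Toy binary slice: 1 = pore phase, 0 = scaffold material
slice_ = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 0, 0],
]
sizes = pore_sizes(slice_)
```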

  11. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725
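
    Downstream of such a SILAC experiment, each protein's heavy/light intensity ratio is typically log2-transformed and the distribution is median-centered before interpretation. A minimal sketch of that step (toy intensities and an arbitrary selection of nucleolar proteins, not the output of the MaxQuant workflow described here):

```python
# Median-normalized log2 SILAC ratios for a handful of toy proteins.
import math
from statistics import median

# Hypothetical summed peptide intensities per protein (heavy vs. light label)
heavy = {"NPM1": 8000.0, "FBL": 4100.0, "NCL": 2100.0, "GAPDH": 1000.0}
light = {"NPM1": 2000.0, "FBL": 2050.0, "NCL": 2100.0, "GAPDH": 1000.0}

log2_ratios = {p: math.log2(heavy[p] / light[p]) for p in heavy}

# Center the distribution so unchanged proteins sit near 0
offset = median(log2_ratios.values())
normalized = {p: r - offset for p, r in log2_ratios.items()}
```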

  12. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the opacities are classified by size and shape from measurements of the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired structures, such as the images of blood vessels and ribs, from the chest X-ray. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
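
    An unsharp mask subtracts a smoothed copy of the signal from the original, so large-scale shading (ribs, vessels) goes into the smooth part while small opacity-like bumps survive in the detail part. A 1-D sketch with a uniform (moving-average) impulse response, loosely following the filter described above (the bi-level thresholding detail is omitted):

```python
# 1-D unsharp masking: detail = signal - moving average.

def moving_average(signal, window):
    """Uniform smoothing; the window is clipped at the signal edges."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_detail(signal, window=3):
    smooth = moving_average(signal, window)
    return [s - m for s, m in zip(signal, smooth)]

flat = [5.0] * 7                                   # featureless background
bumpy = [5.0, 5.0, 5.0, 8.0, 5.0, 5.0, 5.0]       # one opacity-like bump
```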

  13. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized organelles of plant cells that differentiate into various forms, including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining overall cell metabolism and in sensing environmental factors such as sunlight. It is therefore important to understand the mechanisms of plastid differentiation and functional change in order to deepen our understanding of plant biology. In this chapter, a method for the extraction of intact plastids that makes analysis possible while maintaining plastid functions is detailed; in addition, a quantitative shotgun method for analyzing the composition of plastid proteins, and changes in their content in response to environmental impacts, is described. PMID:24136541

  15. Integrated microfluidic device for serum biomarker quantitation using either standard addition or a calibration curve.

    PubMed

    Yang, Weichun; Sun, Xiuhua; Wang, Hsiang-Yu; Woolley, Adam T

    2009-10-01

    Detection and accurate quantitation of biomarkers such as alpha-fetoprotein (AFP) can be a key aspect of early stage cancer diagnosis. Microfluidic devices provide attractive analysis capabilities, including low sample and reagent consumption, as well as short assay times. However, to date microfluidic analyzers have relied almost exclusively on calibration curves for sample quantitation, which can be problematic for complex mixtures such as human serum. We have fabricated integrated polymer microfluidic systems that can quantitatively determine fluorescently labeled AFP in human serum using either the method of standard addition or a calibration curve. Our microdevices couple an immunoaffinity purification step with rapid microchip electrophoresis separation in a laser-induced fluorescence detection system, all under automated voltage control in a miniaturized polymer microchip. In conjunction with laser-induced fluorescence detection, these systems can quantify AFP at approximately 1 ng/mL levels in approximately 10 microL of human serum in a few tens of minutes. Our polymer microdevices have been applied in determining AFP in spiked serum samples. These integrated microsystems offer excellent potential for rapid, simple, and accurate biomarker quantitation in a point-of-care setting.
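
    In the method of standard addition, known increments of analyte are spiked into the sample itself, and the unknown concentration is read off as the magnitude of the x-intercept of the signal-versus-added-amount line. A sketch of that arithmetic with synthetic numbers (the device operates near 1 ng/mL AFP, but these values are purely illustrative):

```python
# Method of standard addition: fit signal vs. added concentration and
# recover the unknown as intercept/slope (the magnitude of the x-intercept).
# Data are synthetic, generated from a true concentration of 2.0 ng/mL.

added = [0.0, 1.0, 2.0, 3.0]                   # spiked analyte (ng/mL)
signal = [0.4 * (2.0 + a) for a in added]      # linear detector response

n = len(added)
sx, sy = sum(added), sum(signal)
sxx = sum(a * a for a in added)
sxy = sum(a * s for a, s in zip(added, signal))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

unknown = intercept / slope    # estimated sample concentration (ng/mL)
```

    Because the calibration happens inside the sample matrix itself, standard addition is less sensitive to matrix effects than an external calibration curve, which is why it suits complex mixtures such as serum.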

  16. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses the quantitative analysis of digital microscope images and presents several exercises as worked examples. The basic concepts of quantitative imaging rest on a well-established foundation of signal theory and quantitative data analysis. The chapter treats the imaging process as a transformation from sample to image and examines the limits and considerations of quantitative analysis. It introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some frequently encountered issues in quantization. Image processing is a form of data processing, of which there are many other examples, such as fitting data to a theoretical curve. In all these cases, it is critical that care be taken during all steps of transformation, processing, and quantization. PMID:23931513
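
    One common digital correction of the kind alluded to above is flat-field (shading) correction, which removes uneven illumination using a dark frame and a blank "flat" image. A per-pixel sketch (a standard technique, not a procedure specific to this chapter; tiny 1-D lists stand in for 2-D frames):

```python
# Flat-field correction: corrected = (raw - dark) / (flat - dark) * gain,
# applied per pixel, where gain is the mean flat level.

def flat_field_correct(raw, dark, flat):
    gain = sum(f - d for f, d in zip(flat, dark)) / len(flat)
    return [(r - d) / (f - d) * gain
            for r, d, f in zip(raw, dark, flat)]

dark = [10.0, 10.0, 12.0, 11.0]      # camera dark frame
flat = [110.0, 90.0, 112.0, 101.0]   # uneven illumination of a blank field
raw = [60.0, 50.0, 62.0, 56.0]       # uniform sample seen through that shading

corrected = flat_field_correct(raw, dark, flat)
```

    A truly uniform sample should come out uniform after correction, which is exactly what happens here.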

  18. Cancer detection by quantitative fluorescence image analysis.

    PubMed

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods currently used by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease.
Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  19. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.

    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary; executable file together with necessary files for the LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
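
    The brightness statistics the program reports (mean, variance, skewness, kurtosis) are all simple moments of the intensity distribution. A compact sketch over a flat pixel list, independent of the LabVIEW implementation:

```python
# Moment-based brightness statistics of a greyscale image given as a
# flat list of pixel intensities.

def brightness_stats(pixels):
    n = len(pixels)
    mean = sum(pixels) / n
    m2 = sum((p - mean) ** 2 for p in pixels) / n   # population variance
    m3 = sum((p - mean) ** 3 for p in pixels) / n
    m4 = sum((p - mean) ** 4 for p in pixels) / n
    sd = m2 ** 0.5
    return {
        "mean": mean,
        "variance": m2,
        "std": sd,
        "skewness": m3 / sd ** 3,   # 0 for a symmetric histogram
        "kurtosis": m4 / m2 ** 2,   # 3 for a Gaussian
    }

stats = brightness_stats([1.0, 2.0, 3.0, 4.0, 5.0])
```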

  20. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar . E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength-dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single-crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
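
    ZAF correction converts a measured k-ratio (specimen intensity over standard intensity) into a concentration by iterating C = k · ZAF(C), because the atomic-number (Z), absorption (A), and fluorescence (F) factors themselves depend on the composition being solved for. A schematic fixed-point sketch; the ZAF factor below is a purely invented linear toy, not a physical model:

```python
# Schematic ZAF iteration: the concentration C satisfies C = k * ZAF(C).
# zaf_factor is an invented placeholder, NOT a physical ZAF model.

def zaf_factor(c):
    return 1.1 - 0.1 * c     # toy composition dependence

def iterate_concentration(k_ratio, tol=1e-10, max_iter=100):
    c = k_ratio              # first guess: the uncorrected k-ratio
    for _ in range(max_iter):
        c_new = k_ratio * zaf_factor(c)
        if abs(c_new - c) < tol:
            return c_new
        c = c_new
    return c

c = iterate_concentration(0.5)   # fixed point of c = 0.5 * (1.1 - 0.1*c)
```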

  1. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  2. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.
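
    Once each anion elutes and is sensed by the conductivity cell, quantitation reduces to integrating the detector peak above baseline and comparing against a calibration. A sketch of trapezoidal peak integration (the detector trace is synthetic; the patent excerpt does not detail the integration method):

```python
# Trapezoidal integration of a chromatographic peak above a baseline.

def peak_area(times, response, baseline=0.0):
    area = 0.0
    for i in range(len(times) - 1):
        dt = times[i + 1] - times[i]
        area += dt * ((response[i] - baseline)
                      + (response[i + 1] - baseline)) / 2.0
    return area

times = [0.0, 1.0, 2.0, 3.0, 4.0]    # seconds
trace = [0.0, 2.0, 6.0, 2.0, 0.0]    # conductivity above baseline (uS)
area = peak_area(times, trace)
```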

  3. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

    Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in the post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation, and its limitations.

  4. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their role in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis, using stable isotope labeling by amino acids in cell culture (SILAC), of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  5. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate the effects new therapies might have. The combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia, and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features, including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database, where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable, and accurate, while allowing the pathologist to control the measurement process.
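
    Metrics such as lumen area and perimeter can be computed directly from an extracted boundary contour; a sketch using the shoelace formula on a toy polygon (real contours are pixel chains, but the arithmetic is the same):

```python
# Area (shoelace formula) and perimeter of a closed contour given as
# ordered (x, y) vertices.  A square stands in for a vessel boundary.
import math

def contour_metrics(points):
    n = len(points)
    area2, perim = 0.0, 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]   # wrap around to close the contour
        area2 += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area2) / 2.0, perim

square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
area, perimeter = contour_metrics(square)
```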

  6. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
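    The paper derives its four metrics from a Privacy Petri Net; the exact combination rule for the overall leak degree is the paper's own. As an illustration only, assume each metric has been scored on [0, 1] and combine them with a weighted average as a stand-in:

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine four privacy-leak metrics (each scored on [0, 1]) into a
    single comparable score. The weighted average here is an assumption,
    not the paper's actual formula."""
    scores = (possibility, severity, crypticity, manipulability)
    if not all(0.0 <= s <= 1.0 for s in scores):
        raise ValueError("metric scores must lie in [0, 1]")
    return sum(w * s for w, s in zip(weights, scores))
```

    A single scalar like this is what allows privacy leak behavior to be ranked across different software applications.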

  8. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while revealing the limitations of each imaging modality. This semi-automatic methodology registered the modalities using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated with 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants.

  9. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images that are acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  11. Quantitative analysis of saccadic search strategy

    NASA Astrophysics Data System (ADS)

    Over, E. A. B.

    2007-06-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye movement parameters. Chapter 2 provides a method to quantify a general property of fixation locations. We proposed a quantitative measure based on Voronoi diagrams for the characterization of the uniformity of fixation density. This measure may be thought of as indicating the clustering of fixations. We showed that during a visual search task, a structured (natural) background leads to higher clustering of fixations compared to a homogeneous background. In addition, in natural stimuli, a search task leads to higher clustering of fixations than the instruction to freely view the stimuli. Chapter 3 provides a method to identify the overall field of saccade directions in the viewing area. We extended the Voronoi method of chapter 2 so that it became possible to create vector maps. These maps indicate the preferred saccade direction for each position in the viewing area. Several measures of these vector maps were used to quantify the influence of observer-dependent and stimulus-dependent factors on saccade direction in a search task with natural scenes. The results showed that the influence of stimulus-dependent factors appeared to be larger than the influence of observer-dependent factors. In chapter 4 we showed that the border of the search area played a role in the search strategy. In a search experiment with differently shaped search areas, we measured that search performance was poorer near the luminance edges of the search area. Fixation density, however, was higher in the edge region, and saccade direction was mainly along the edges of the search areas. In a target visibility experiment we established that the visibility of targets near a luminance edge is less than the visibility of
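    The thesis measures fixation clustering via the variability of Voronoi cell areas. A simpler proxy with the same intent, sketched below under stated assumptions (this is a nearest-neighbour-distance measure, not the thesis's Voronoi measure), is the coefficient of variation of nearest-neighbour distances: near zero for a regular grid of fixations, larger when fixations cluster.

```python
import numpy as np

def nn_clustering_index(points):
    """Coefficient of variation of nearest-neighbour distances between
    fixation locations: ~0 for a perfectly regular pattern, larger when
    fixations are clustered."""
    pts = np.asarray(points, float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)   # ignore self-distances
    nn = d.min(axis=1)
    return float(nn.std() / nn.mean())
```

    A full Voronoi treatment (e.g. via `scipy.spatial.Voronoi`) would instead compare the areas of the cells around each fixation.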

  12. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo; a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods showed high concordance, found at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (e.g. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions in which to further pinpoint DNA markers or causative genes.
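    The core of a single-locus regression method like LDRM is regressing the phenotype on allele dosage at each SNP, one locus at a time. A minimal sketch of the per-SNP effect estimate (hypothetical helper, ordinary least squares only; the actual LDRM also handles family structure and multiple-testing thresholds):

```python
import numpy as np

def snp_effect(genotypes, phenotypes):
    """Single-locus regression: phenotype ~ mean + beta * allele dosage.
    genotypes: 0/1/2 minor-allele counts per animal.
    Returns the additive-effect estimate beta."""
    g = np.asarray(genotypes, float)
    y = np.asarray(phenotypes, float)
    gc = g - g.mean()
    beta = (gc @ (y - y.mean())) / (gc @ gc)
    return float(beta)
```

    In a real GWAS, this regression would be run across all ~50K SNPs and the resulting test statistics filtered at the chromosome-wide FDR threshold described above.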

  14. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
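    The prioritization step can be sketched as a simple ranking. The 1-to-5 ordinal scales and the tie-breaking rule below are assumptions for illustration; the paper's actual categories and combination may differ:

```python
def prioritize(scenarios):
    """scenarios: list of (name, severity, likelihood, difficulty) tuples,
    each factor on an assumed 1 (low) .. 5 (high) ordinal scale.
    Rank by risk (severity * likelihood), breaking ties toward scenarios
    that are easier to model quantitatively (lower difficulty)."""
    return sorted(scenarios, key=lambda s: (-(s[1] * s[2]), s[3]))
```

    Scenarios at the top of the list are the ones where a quantitative safety analysis is both most needed and most feasible.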

  15. Quantitative mass spectrometry methods for pharmaceutical analysis.

    PubMed

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally aimed for in automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field usage. This article is part of the themed issue 'Quantitative mass spectrometry'.

  16. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  17. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652
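    A concrete example of quantitative bias analysis is correcting an observed exposure count for nondifferential misclassification given an assumed sensitivity and specificity. The helper below is a generic textbook-style sketch, not a tool from the FDA project described above:

```python
def corrected_exposed(observed_exposed, total, sensitivity, specificity):
    """Back-correct an observed exposed count for exposure misclassification.
    The measurement model is
        observed = Se * true + (1 - Sp) * (total - true),
    which is solved for the true exposed count."""
    fp_rate = 1.0 - specificity
    return (observed_exposed - fp_rate * total) / (sensitivity - fp_rate)
```

    Repeating such corrections over plausible ranges of sensitivity and specificity yields the direction, magnitude, and uncertainty of the systematic error, which is the essence of quantitative bias analysis.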

  18. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and rhodamine-6G (R-6G) with methanol, which were analyzed directly in a glass vessel. The Raman intensities and a simple mathematical model were used for the analysis of the liquid samples. Starting from the experimental spectra, a general expression written as the sum of the particular expressions for each pure compound allows us to derive an expression for the mixture that can be used to determine concentrations from its Raman spectrum.
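    Modeling the mixture spectrum as a sum of pure-component spectra is a linear problem, so the concentrations (coefficients) fall out of a least-squares fit. A minimal sketch of that idea (hypothetical function; real spectra would also need baseline correction and normalization):

```python
import numpy as np

def mixture_concentrations(pure_spectra, mixture):
    """Model the mixture Raman spectrum as a linear combination of the
    pure-component spectra and solve for the coefficients by least squares."""
    A = np.column_stack(pure_spectra)          # one column per pure compound
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(mixture, float), rcond=None)
    return coeffs
```

    With calibrated pure-component intensities, the fitted coefficients map directly to concentrations.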

  19. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  20. Empirical Bayes Analysis of Quantitative Proteomics Experiments

    PubMed Central

    Margolin, Adam A.; Ong, Shao-En; Schenone, Monica; Gould, Robert; Schreiber, Stuart L.; Carr, Steven A.; Golub, Todd R.

    2009-01-01

    Background Advances in mass spectrometry-based proteomics have enabled the incorporation of proteomic data into systems approaches to biology. However, development of analytical methods has lagged behind. Here we describe an empirical Bayes framework for quantitative proteomics data analysis. The method provides a statistical description of each experiment, including the number of proteins that differ in abundance between 2 samples, the experiment's statistical power to detect them, and the false-positive probability of each protein. Methodology/Principal Findings We analyzed 2 types of mass spectrometric experiments. First, we showed that the method identified the protein targets of small molecules in affinity purification experiments with high precision. Second, we re-analyzed a mass spectrometric data set designed to identify proteins regulated by microRNAs. Our results were supported by sequence analysis of the 3′ UTR regions of predicted target genes, and we found that the previously reported conclusion that a large fraction of the proteome is regulated by microRNAs was not supported by our statistical analysis of the data. Conclusions/Significance Our results highlight the importance of rigorous statistical analysis of proteomic data, and the method described here provides a statistical framework to robustly and reliably interpret such data. PMID:19829701

  1. Additional EIPC Study Analysis. Final Report

    SciTech Connect

    Hadley, Stanton W; Gotham, Douglas J.; Luciani, Ralph L.

    2014-12-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports of the first 13 topics plus one additional topic into a single final report.

  2. [Quantitative determination of morphine in opium powder by addition and correlation method using capillary electrophoresis].

    PubMed

    Sun, Guo-xiang; Miao, Ju-ru; Wang, Yu; Sun, Yu-qing

    2002-01-01

    Morphine in opium powder was quantitatively determined by the addition and correlation method (ACM) using capillary zone electrophoresis; the average recovery was 100.6%. The relative standard deviation (RSD) of the migration time was not more than 2.4%, the RSD of the relative migration time was not more than 1.1%, and the RSD of the relative area was not more than 0.51%. For comparison, a contrast test was performed using the calibration curve method with an internal standard. The morphine content of opium powder determined by ACM was the same as that obtained by the internal-standard calibration curve method. The study shows that ACM is simple, quick, and accurate.
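    The classic standard-addition calculation underlying methods of this family can be sketched in a few lines. Note this is the textbook standard-addition extrapolation, not the paper's specific ACM variant:

```python
import numpy as np

def standard_addition(added, signals):
    """Classic standard-addition quantitation: fit signal vs. added analyte
    and extrapolate the line to zero signal; the unknown concentration is
    the negative x-intercept (in the same units as `added`)."""
    slope, intercept = np.polyfit(np.asarray(added, float),
                                  np.asarray(signals, float), 1)
    return intercept / slope
```

    Because the calibration is built inside the sample itself, matrix effects on the detector response largely cancel out.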

  3. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
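    After switching plants to labeled medium, the natural-abundance (unlabeled) fraction of a protein decays roughly as f(t) = exp(-k_d t), so the degradation rate can be estimated from a log-linear fit. A minimal sketch under that first-order assumption (hypothetical helper; real analyses must also account for growth dilution and label recycling):

```python
import math

def degradation_rate(times, unlabeled_fraction):
    """Estimate the first-order degradation rate k_d from the decay of the
    unlabeled protein fraction, f(t) = exp(-k_d * t), using a least-squares
    log-linear fit through the origin. Returns (k_d, half-life)."""
    num = sum(t * (-math.log(f)) for t, f in zip(times, unlabeled_fraction))
    den = sum(t * t for t in times)
    kd = num / den
    return kd, math.log(2) / kd
```

    The same fit applied to the accumulating labeled fraction gives the synthesis side of turnover.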

  4. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  6. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  7. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  8. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
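    When several processing steps each contribute measurement error, Monte Carlo simulation is a straightforward way to propagate them to the final concentration. The two-step protocol and uncertainty values below are a toy assumption, not the Corynebacterium glutamicum protocol of the paper:

```python
import random

def propagate_mc(value_sds, n=20000, seed=1):
    """Monte Carlo error propagation through a toy quantification protocol:
        conc = peak_area * dilution_factor / sample_volume.
    value_sds: dict name -> (mean, sd) for each step's quantity.
    Returns (mean, sd) of the resulting concentration."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        area = rng.gauss(*value_sds["peak_area"])
        dil = rng.gauss(*value_sds["dilution"])
        vol = rng.gauss(*value_sds["volume"])
        samples.append(area * dil / vol)
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    return mean, sd
```

    Inspecting how the output sd changes when each input sd is zeroed out identifies the most critical step, mirroring the sensitivity analysis described in the paper.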

  9. Quantitative gold nanoparticle analysis methods: A review.

    PubMed

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many remain? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
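    The link between the two quantities discussed above (total gold concentration vs. particle population) is simple geometry, assuming monodisperse solid gold spheres. A sketch of the conversion (hypothetical helper; real dispersions are polydisperse, so this is only a first-order estimate):

```python
import math

GOLD_DENSITY_G_PER_CM3 = 19.3  # bulk gold density

def aunp_number_concentration(gold_g_per_l, diameter_nm):
    """Convert a total-gold mass concentration (g/L) into an AuNP
    particle-number concentration (particles/L), assuming monodisperse
    solid spheres of the given diameter."""
    r_cm = diameter_nm * 1e-7 / 2.0                      # nm -> cm
    particle_mass_g = GOLD_DENSITY_G_PER_CM3 * (4.0 / 3.0) * math.pi * r_cm ** 3
    return gold_g_per_l / particle_mass_g
```

    This is why elemental-gold assays (e.g. ICP-MS) and particle-counting methods can cross-validate each other when the size distribution is known.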

  10. An empirical approach to the bond additivity model in quantitative interpretation of sum frequency generation vibrational spectra

    NASA Astrophysics Data System (ADS)

    Wu, Hui; Zhang, Wen-kai; Gan, Wei; Cui, Zhi-feng; Wang, Hong-fei

    2006-10-01

    Knowledge of the ratios between different polarizability βi'j'k' tensor elements of a chemical group in a molecule is crucial for quantitative interpretation and polarization analysis of its sum frequency generation vibrational spectroscopy (SFG-VS) spectrum at an interface. The bond additivity model (BAM), or the hyperpolarizability derivative model, along with experimentally obtained Raman depolarization ratios, has been widely used to obtain such tensor ratios for the CH3, CH2, and CH groups. Such treatment can successfully reproduce the polarization dependence of the intensities in SFG-VS spectra for the symmetric (SS) and asymmetric (AS) stretching modes of CH3 and CH2 groups, respectively. However, the relative intensities between the SS and AS modes usually do not agree with each other within this model, even for some of the simplest molecular systems, such as the air/methanol interface. This fact certainly has cast uncertainties on the effectiveness of, and conclusions based on, the BAM. One such example is that the AS mode of the CH3 group has never been observed in SFG-VS spectra from the air/methanol interface, while this AS mode is usually very strong in SFG-VS spectra from the air/ethanol interface, other short-chain alcohols, as well as long-chain surfactants. In order to answer these questions, an empirical approach from known Raman and IR spectra is used to make corrections to the BAM. With the corrected ratios between the βi'j'k' tensor elements of the SS and AS modes, all features in the SFG-VS spectra of the air/methanol and air/ethanol interfaces can be quantitatively interpreted. This empirical approach not only provides new understandings of the effectiveness and limitations of the bond additivity model but also provides a practical way for its application in SFG-VS studies of molecular interfaces.

  11. Acid Rain Analysis by Standard Addition Titration.

    ERIC Educational Resources Information Center

    Ophardt, Charles E.

    1985-01-01

    The standard addition titration is a precise and rapid method for the determination of the acidity in rain or snow samples. The method requires use of a standard buret, a pH meter, and Gran's plot to determine the equivalence point. Experimental procedures used and typical results obtained are presented. (JN)
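
    Gran's plot linearizes the pre-equivalence portion of a strong-acid titration: the function G = (V0 + V)·10^(-pH) falls on a straight line whose x-intercept is the equivalence volume. A minimal numerical sketch of that calculation (synthetic titration data; the volumes and concentrations are hypothetical, not values from the experiment):

```python
import numpy as np

def gran_equivalence_volume(v_titrant, pH, v0):
    """Estimate the equivalence volume from a Gran plot.

    For a strong-acid sample titrated with strong base,
    G = (v0 + V) * 10**(-pH) is linear in the titrant volume V
    before the equivalence point and extrapolates to zero at it.
    """
    g = (v0 + v_titrant) * 10.0 ** (-pH)
    slope, intercept = np.polyfit(v_titrant, g, 1)
    return -intercept / slope  # x-intercept of the Gran function

# Synthetic rain-sample titration: 50 mL sample, 0.01 M NaOH,
# true equivalence point at 5.0 mL (acidity 1e-3 mol/L).
v0, c_base, v_eq_true = 50.0, 0.01, 5.0
v = np.linspace(0.5, 4.0, 8)                  # pre-equivalence points
h = c_base * (v_eq_true - v) / (v0 + v)       # residual [H+], mol/L
pH = -np.log10(h)

v_eq = gran_equivalence_volume(v, pH, v0)     # ~5.0 mL
acidity = v_eq * c_base / v0                  # ~1e-3 mol/L
```

    With real pH-meter readings the Gran points scatter, so the fit is normally restricted to the clearly linear region just before the equivalence point.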

  12. Electrophoretic analysis of Allium alien addition lines.

    PubMed

    Peffley, E B; Corgan, J N; Horak, K E; Tanksley, S D

    1985-12-01

    Meiotic pairing in an interspecific triploid of Allium cepa and A. fistulosum, 'Delta Giant', exhibits preferential pairing between the two A. cepa genomes, leaving the A. fistulosum genome as univalents. Multivalent pairing involving A. fistulosum chromosomes occurs at a low level, allowing for recombination between the genomes. Ten trisomies were recovered from the backcross of 'Delta Giant' x A. cepa cv., 'Temprana', representing a minimum of four of the eight possible alien addition lines. The alien addition lines possessed different A. fistulosum enzyme markers. Those markers, Adh-1, Idh-1 and Pgm-1 reside on different A. fistulosum chromosomes, whereas Pgi-1 and Idh-1 may be linked. Diploid, trisomic and hyperploid progeny were recovered that exhibited putative pink root resistance. The use of interspecific plants as a means to introgress A. fistulosum genes into A. cepa appears to be successful at both the trisomic and the diploid levels. If introgression can be accomplished using an interspecific triploid such as 'Delta Giant' to generate fertile alien addition lines and subsequent fertile diploids, or if introgression can be accomplished directly at the diploid level, this will have accomplished gene flow that has not been possible at the interspecific diploid level.

  13. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  14. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) helps determine the type and attributes of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Materials Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is introduced. The method allows forensic investigators to acquire detailed information about material evidence, which can point the direction for case detection and court proceedings.
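
    The reason Rietveld QPA needs no internal standard is that phase weight fractions follow directly from the refined scale factors via the Hill-Howard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where Z is the number of formula units per cell, M the formula mass, and V the unit-cell volume. A minimal sketch of that relation with hypothetical refined values (not the paper's numbers):

```python
def rietveld_weight_fractions(phases):
    """Hill-Howard relation: W_i = S_i*(Z*M*V)_i / sum_j S_j*(Z*M*V)_j.

    phases maps a phase name to (scale factor S, formula units Z,
    formula mass M, unit-cell volume V).
    """
    szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(szmv.values())
    return {name: val / total for name, val in szmv.items()}

# Hypothetical refined values for a KNO3/sulfur mixture.
phases = {
    "KNO3":   (1.2e-4, 4, 101.10, 741.0),
    "sulfur": (3.5e-5, 128, 32.06, 3299.0),
}
w = rietveld_weight_fractions(phases)   # fractions summing to 1
```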

  15. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the density of flying insects is low enough that they can be resolved as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are presented.

  16. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  17. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
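
    For the macroscopic ("checkerboard") case, the observed spectrum is a linear combination of the endmember spectra, so abundances can be recovered by least squares with a sum-to-one constraint; intimate mixing requires a nonlinear conversion of reflectance first. A minimal sketch of the linear case (synthetic spectra, not the lunar data):

```python
import numpy as np

def unmix(spectrum, endmembers):
    """Least-squares abundances for a linear (macroscopic) mixture.

    endmembers: (n_bands, n_endmembers) matrix. A sum-to-one
    constraint is imposed by appending a row of ones to the system.
    """
    n = endmembers.shape[1]
    a = np.vstack([endmembers, np.ones((1, n))])
    b = np.append(spectrum, 1.0)
    abundances, *_ = np.linalg.lstsq(a, b, rcond=None)
    return abundances

# Synthetic two-endmember example (hypothetical reflectances).
rng = np.random.default_rng(0)
e = rng.uniform(0.05, 0.6, size=(12, 2))   # 12 bands, 2 endmembers
true = np.array([0.3, 0.7])
mixed = e @ true                           # noise-free mixture
est = unmix(mixed, e)                      # recovers ~[0.3, 0.7]
```

    With noisy data the recovered abundances are only approximate, and a nonnegativity constraint is usually added as well.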

  18. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Astrophysics Data System (ADS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-02-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.

  19. Using quantitative acid-base analysis in the ICU.

    PubMed

    Lloyd, P; Freebairn, R

    2006-03-01

    The quantitative acid-base 'Strong Ion' calculator is a practical application of quantitative acid-base chemistry, as developed by Peter Stewart and Peter Constable. It quantifies the three independent factors that control acidity, calculates the concentration and charge of unmeasured ions, produces a report based on these calculations and displays a Gamblegram depicting measured ionic species. Used together with the medical history, quantitative acid-base analysis has advantages over traditional approaches.
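
    The core of such a calculator is comparing the apparent strong ion difference (from measured strong ions) with the effective strong ion difference (from bicarbonate plus the pH-dependent weak-acid charges); a persistent gap between the two implies unmeasured ions. A simplified sketch using the published Figge/Constable charge approximations and a hypothetical chemistry panel (this is not the actual calculator described in the paper):

```python
def apparent_sid(na, k, cl, lactate):
    """Apparent strong ion difference (mEq/L), simplified to the
    major measured strong ions."""
    return na + k - cl - lactate

def effective_sid(hco3, albumin_g_l, phosphate_mmol_l, ph):
    """Effective SID: bicarbonate plus the pH-dependent charges of
    albumin and phosphate (Figge/Constable approximations)."""
    alb_charge = albumin_g_l * (0.123 * ph - 0.631)
    phos_charge = phosphate_mmol_l * (0.309 * ph - 0.469)
    return hco3 + alb_charge + phos_charge

# Hypothetical ICU chemistry panel (all values illustrative).
sida = apparent_sid(na=140, k=4.0, cl=102, lactate=1.0)          # 41 mEq/L
side = effective_sid(hco3=24.0, albumin_g_l=40.0,
                     phosphate_mmol_l=1.0, ph=7.40)
sig = sida - side   # strong ion gap: unmeasured ions if far from zero
```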

  20. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs the corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanisms underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behavior is essential. In this work, we demonstrated a microfluidic approach for investigating the chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels that generated eight flow streams containing stepwise chemical concentrations with no difference in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of chemotaxis behavior. Using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were quantified within several minutes. The results demonstrated wild type-like repellent responses but severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, chemotaxis analysis of third-stage larvae revealed that their gustatory response differs from that of adults. Thus, our microfluidic method provides a useful platform for studying the chemosensory behaviors of C. elegans and for screening chemosensation-related chemical drugs.
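
    A common way to summarize such assays (not necessarily the exact statistic used in this paper) is a chemotaxis index computed from counts of worms moving toward versus away from the cue:

```python
def chemotaxis_index(n_toward, n_away):
    """Classic chemotaxis index: +1 = perfect attraction,
    -1 = perfect repulsion, 0 = indifference."""
    total = n_toward + n_away
    if total == 0:
        raise ValueError("no worms counted")
    return (n_toward - n_away) / total

# Hypothetical counts of worms entering channels on the NaCl side
# versus the buffer side of the gradient.
ci_attract = chemotaxis_index(n_toward=42, n_away=8)   # 0.68
ci_repel = chemotaxis_index(n_toward=5, n_away=45)     # -0.8
```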

  2. Quantitative signal analysis in pulsed resonant photoacoustics

    NASA Astrophysics Data System (ADS)

    Schäfer, Stefan; Miklós, András; Hess, Peter

    1997-05-01

    The pulsed excitation of acoustic resonances was studied by means of a high-Q photoacoustic resonator with different types of microphone. The signal strength of the first radial mode was calculated from the basic theory as well as with a modeling program that takes into account the acoustic impedances of the resonator, the acoustic filter system, and the influence of the microphone coupling on the photoacoustic cavity. When the calculated signal strength is used, the high-Q system can be calibrated for trace-gas analysis without a certified gas mixture. The theoretical results were compared with measurements and show good agreement for different microphone configurations. From the measured pressure signal (in pascals per joule), the absorption coefficient of ethylene was calculated; it agreed with literature values to within 10%. In addition, a Helmholtz configuration with a highly sensitive 1-in. (2.54-cm) microphone was realized. Although the Q factor was reduced, the sensitivity could be increased by the Helmholtz resonator in the case of pulsed experiments. A maximum sensitivity of the coupled system of 341 mV/Pa was achieved.

  3. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  4. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  5. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... performed particle counts on samples collected during the Study. Table 1 provides the exercise and sampling... revised PortaCount quantitative fit-testing protocols are not sufficiently accurate or reliable to include...) to Appendix A of ] its Respiratory Protection Standard (see 69 FR 46986). OSHA also published...

  6. Validation and Estimation of Additive Genetic Variation Associated with DNA Tests for Quantitative Beef Cattle Traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The U.S. National Beef Cattle Evaluation Consortium (NBCEC) has been involved in the validation of commercial DNA tests for quantitative beef quality traits since their first appearance on the U.S. market in the early 2000s. The NBCEC Advisory Council initially requested that the NBCEC set up a syst...

  7. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several past studies addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method, combined with a simple sample work-up, was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. Alkaloid content and distribution varied strongly with plant organ, plant origin, and season, ranging from 88 to 597 mg/kg dry weight; however, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation, and distribution analysis of Equisetum alkaloids was achieved.

  8. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization properties of radar remote sensing have been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarized data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather-radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The ESAR raw data were processed with a SAR simulator developed in MATLAB using the range-Doppler algorithm.
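
    The two weather-radar parameters mentioned are simple ratios of polarimetric backscatter powers: Z_DR = 10·log10(Z_HH/Z_VV) and LDR = 10·log10(Z_HV/Z_HH), both in dB. A minimal sketch with hypothetical powers (linear units):

```python
import math

def differential_reflectivity(z_hh, z_vv):
    """Z_DR in dB from the two co-polar backscatter powers."""
    return 10.0 * math.log10(z_hh / z_vv)

def linear_depolarization_ratio(z_hv, z_hh):
    """LDR in dB from the cross-polar / co-polar power ratio."""
    return 10.0 * math.log10(z_hv / z_hh)

# Hypothetical backscatter powers from one resolution cell.
zdr = differential_reflectivity(z_hh=2.0, z_vv=1.0)     # ~3.01 dB
ldr = linear_depolarization_ratio(z_hv=0.01, z_hh=2.0)  # ~-23.01 dB
```

    Positive Z_DR indicates horizontally elongated scatterers, while a high LDR indicates depolarizing (irregular or canted) targets, which is what makes these ratios informative about surface type.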

  9. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  10. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  11. Fractal Spectrum Technique for Quantitative Analysis of Volcanic Particle Shapes

    NASA Astrophysics Data System (ADS)

    Maria, A. H.; Carey, S. N.

    2001-12-01

    The shapes of volcanic particles reflect numerous eruptive parameters (e.g. magma viscosity, volatile content, degree of interaction with water) and are useful for understanding fragmentation and transport processes associated with volcanic eruptions. However, quantitative analysis of volcanic particle shapes has proven difficult due to their morphological complexity and variability. Shape analysis based on fractal geometry has been successfully applied to a wide variety of particles and appears to be well suited for describing complex features. The technique developed and applied to volcanic particles in this study uses fractal data produced by dilation of the 2-D particle boundary to produce a full spectrum of fractal dimensions over a range of scales for each particle. Multiple fractal dimensions, which can be described as a fractal spectrum curve, are calculated by taking the first derivative of data points on a standard Richardson plot. Quantitative comparisons are carried out using multivariate statistical techniques such as cluster and principal components analysis. Compared with previous fractal methods that express shape in terms of only one or two fractal dimensions, use of multiple fractal dimensions results in more effective discrimination between samples. In addition, the technique eliminates the subjectivity associated with selecting linear segments on Richardson plots for fractal dimension calculation, and allows direct comparison of particles as long as instantaneous dimensions used as input to multivariate analyses are selected at the same scales for each particle. Applications to samples from well-documented eruptions (e.g. Mt. St. Helens, Tambora, Surtsey) indicate that the fractal spectrum technique provides a useful means of characterizing volcanic particles and can be helpful for identifying the products of specific fragmentation processes (volatile exsolution, phreatomagmatic, quench granulation) and modes of volcanic deposition (tephra fall
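
    The "spectrum of dimensions" described above is the local slope of a Richardson plot: since the measured perimeter scales as P(s) ∝ s^(1-D), the dimension at each scale is D = 1 - d(log P)/d(log s). A minimal sketch of that idea with synthetic boundary data (not the authors' dilation code):

```python
import numpy as np

def fractal_spectrum(step_sizes, perimeters):
    """Local fractal dimensions from Richardson-plot data.

    On a Richardson plot (log perimeter vs. log step size) the
    local slope m gives D = 1 - m; differencing successive points
    yields a spectrum of dimensions across scales instead of a
    single global fit.
    """
    logs = np.log(step_sizes)
    logp = np.log(perimeters)
    slopes = np.diff(logp) / np.diff(logs)
    return 1.0 - slopes

# Synthetic boundary with constant dimension D = 1.25:
# P(s) = C * s**(1 - D), so log P is linear in log s.
s = np.geomspace(1.0, 64.0, 7)
p = 100.0 * s ** (1.0 - 1.25)
spectrum = fractal_spectrum(s, p)   # every local dimension ~1.25
```

    A real particle boundary yields a scale-dependent spectrum rather than a flat line, and it is that curve which feeds the cluster and principal components analyses.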

  12. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.

  13. Application of Synchrotron-XRF to Quantitative Elemental Aerosol Analysis

    NASA Astrophysics Data System (ADS)

    Cliff, S. S.; Perry, K. D.; Jimenez-Cruz, M. P.; Cahill, T. A.

    2001-12-01

    Recent advances in synchrotron X-ray fluorescence (s-XRF) analysis of atmospheric particulate matter have improved elemental sensitivity, quantification, and time resolution. Analysis of both filter- and impactor-based aerosol samples has yielded quantitative data for the elements Na-U, where present in ambient aerosols. The increased sensitivity allows higher time resolution, through either smaller spatial analysis of time-resolved impactor samples or shorter sample time integration with filter-based samplers. Of particular interest is the application of s-XRF to aerodynamically sized rotating-substrate impactor samples. These samplers, 8- and 3-stage DRUMs, aerodynamically size-classify particles into 8 or 3 categories, respectively. In addition, the rotating substrate allows time-resolved analysis of samples with little or no loss in elemental sensitivity. The s-XRF analyses are performed on Beamline 10.3.1 at the Advanced Light Source, Lawrence Berkeley National Laboratory (ALS-LBL). Beamline 10.3.1, originally designed for materials analysis, has been supplemented with aerosol analysis capability for several substrate options; typical analyses involve Teflon filters or Mylar impaction substrates. The newly formed Participating Research Team (PRT) for Beamline 10.3.1 encompasses both global climate and materials science research. The s-XRF capabilities of Beamline 10.3.1 are now available to PRT researchers and independent investigators through a proposal process at the ALS. The technology, its application to aerosol research and monitoring, and the availability of the facility to the aerosol research community will be presented.

  14. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF{sub 6}. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith used complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm{sup -1} as a function of pressure for 100% HF; (2) absorbance at 3877 cm{sup -1} as a function of increasing partial pressure of HF, with the total pressure maintained at 300 mm HgA with nitrogen; and (3) absorbance at 3877 cm{sup -1} at constant partial pressure of HF, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm{sup -1} can therefore be quantitatively analyzed by infrared methods.
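
    Experiment (2) amounts to checking that absorbance grows linearly with HF partial pressure, as the ideal gas law (combined with Beer's law) predicts. A sketch of that linearity check with hypothetical calibration points (the pressures and absorbances below are illustrative, not the plant's data):

```python
import numpy as np

def linearity_check(pressures, absorbances):
    """Fit A = k*p + b and report the slope and R^2; near-unity R^2
    at 3877 cm^-1 is what indicates ideal-gas (monomeric) HF."""
    slope, intercept = np.polyfit(pressures, absorbances, 1)
    pred = slope * pressures + intercept
    ss_res = np.sum((absorbances - pred) ** 2)
    ss_tot = np.sum((absorbances - absorbances.mean()) ** 2)
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical calibration points (partial pressure, mm HgA vs. absorbance).
p = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
a = np.array([0.052, 0.101, 0.149, 0.203, 0.251, 0.304, 0.349])
k, r2 = linearity_check(p, a)   # k ~ 0.01 per mm HgA, r2 ~ 0.9997
```

    Departure of R^2 from unity, or curvature in the residuals, at higher partial pressures would signal the onset of HF oligomerization and non-ideal behavior.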

  15. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to assess the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, in which complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task that involves several diverse fields of research, including signal and image processing, statistics, and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of eddy current, GMR, and thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor in detection specificity while retaining the same level of sensitivity.

  16. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.
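
    The shortest expected fire-spread time between two points in the equivalent network is a standard shortest-path problem over expected barrier-failure times. A minimal sketch using Dijkstra's algorithm on a hypothetical floor plan (room names and times are illustrative only, not from the paper's example):

```python
import heapq

def shortest_expected_time(graph, start, goal):
    """Dijkstra's algorithm over a fire-spread network whose edge
    weights are expected barrier-failure times (minutes)."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == goal:
            return t
        if t > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in graph.get(node, {}).items():
            nt = t + w
            if nt < dist.get(nbr, float("inf")):
                dist[nbr] = nt
                heapq.heappush(heap, (nt, nbr))
    return float("inf")

# Hypothetical floor plan: rooms as nodes, expected fire-resistance
# times of the separating barriers (minutes) as edge weights.
graph = {
    "origin":   {"corridor": 20.0, "room_b": 45.0},
    "corridor": {"room_b": 10.0, "exit_zone": 30.0},
    "room_b":   {"exit_zone": 15.0},
}
t_min = shortest_expected_time(graph, "origin", "exit_zone")  # 45.0
```

    In the paper's framework the edge weights and probabilities come from standard fire-resistance test results rather than assumed values.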

  17. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for the prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validation of the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed, either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are among the most rapidly growing classes of biomarkers being examined to expedite effective and rational drug development. Clinical imaging often involves complex multi-modality data sets that require rapid and objective analysis, independent of the reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods; these include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is given. In conclusion, this chapter highlights key challenges and future directions in this area.

  18. Mass spectrometry-based quantitative analysis and biomarker discovery.

    PubMed

    Suzuki, Naoto

    2011-01-01

      Mass spectrometry-based quantitative analysis and biomarker discovery using a metabolomics approach represent one of the major platforms in clinical fields, including prognosis and diagnosis, assessment of severity and response to therapy in a number of clinical disease states, as well as therapeutic drug monitoring (TDM). This review first summarizes our mass spectrometry-based research strategy and some results on the relationship between cysteinyl leukotriene (cysLT), thromboxane (TX), 12-hydroxyeicosatetraenoic acid (12-HETE) and other metabolites of arachidonic acid and diseases such as atopic dermatitis, rheumatoid arthritis and diabetes mellitus. For the purpose of evaluating the role of these metabolites of arachidonic acid in disease status, we have developed sensitive determination methods with simple solid-phase extraction and applied them in clinical settings. In addition to these endogenous compounds, we have used mass spectrometry to develop practically applicable quantitative methods for TDM. A representative example is a TDM method for sirolimus, an immunosuppressant agent for organ-transplant recipients that requires rigorous monitoring of blood levels. As we recognized the great potential of mass spectrometry during these studies, we became interested in metabolomics as the non-targeted analysis of metabolites. Our established strategy for metabolomics investigation is now applied to samples from cells, animals and humans to separate groups based on altered patterns of metabolites in biological fluids and to identify metabolites as potential biomarkers discriminating between groups. We would be honored if our research using mass spectrometry contributed useful information to the field of medical pharmacy. PMID:21881303

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity offered by whole slide scanners for automated histological analysis implies an ever increasing importance of digital pathology. To move beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321
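
    The two-wavelength classification idea can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the channel arrays are synthetic and the thresholds are invented, but it shows how per-pixel intensities at two wavelengths yield area fractions for each tissue class.

```python
import numpy as np

# Hypothetical sketch: classify pixels of an unstained section by their
# autofluorescence intensity at two wavelengths (channels a and b).
# Threshold values are illustrative, not those used in the study.

def quantitate_components(chan_a, chan_b, thr_a=0.5, thr_b=0.5):
    """Return the area fraction of each (hypothetical) tissue class."""
    classes = {
        "myocyte":       (chan_a >= thr_a) & (chan_b <  thr_b),
        "fibrous":       (chan_a >= thr_a) & (chan_b >= thr_b),
        "lipofuscin":    (chan_a <  thr_a) & (chan_b >= thr_b),
        "extracellular": (chan_a <  thr_a) & (chan_b <  thr_b),
    }
    n = chan_a.size
    return {name: mask.sum() / n for name, mask in classes.items()}

rng = np.random.default_rng(0)
a = rng.random((64, 64))          # synthetic wavelength-1 intensities
b = rng.random((64, 64))          # synthetic wavelength-2 intensities
fractions = quantitate_components(a, b)
assert abs(sum(fractions.values()) - 1.0) < 1e-9  # classes partition the image
```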

  20. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. Actually, the author attacks historic and current HRA as having failed in informing policy makers who make decisions based on risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given subjective and empirical probabilities.

  1. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  2. Quantitative analysis of cascade impactor samples - revisited

    NASA Astrophysics Data System (ADS)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were collected continuously with a fine aerosol sampler (PM2.5) and occasionally with a single-orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm), the concentrations of two well-known biomass burning products, K and S, were generally increased by a factor of 2-3 compared with non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of nuclear microscopy, in particular the Scanning Transmission Ion Microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique, and concentrations were corrected for absorption and proton energy loss. After correcting the results for the actual sample thickness, the concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. A broad-beam PIXE analysis approach is certainly not adequate in these cases.

  3. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end, we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, unlike standard principal component analysis (PCA), generates components with sparse loadings; it is used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
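
    The Hotelling T2 step can be illustrated in a few lines. This is a generic sketch with synthetic metric vectors, not the authors' MATLAB toolkit: a scan whose image-quality metrics deviate jointly from a reference population gets a large T2 value and can be flagged as a fault.

```python
import numpy as np

# Hotelling's T^2: multivariate distance of one metric vector from the mean
# of a reference sample, scaled by the sample covariance. Metric values here
# are synthetic (200 reference scans x 5 image-quality metrics).

def hotelling_t2(x, reference):
    mu = reference.mean(axis=0)
    cov = np.cov(reference, rowvar=False)
    diff = x - mu
    return float(diff @ np.linalg.inv(cov) @ diff)

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, size=(200, 5))
nominal = rng.normal(0.0, 1.0, size=5)   # in-family system
faulty = nominal + 6.0                   # large shift in every metric
assert hotelling_t2(faulty, reference) > hotelling_t2(nominal, reference)
```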

  4. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  5. Additive interaction in survival analysis: use of the additive hazards model.

    PubMed

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise; Marott, Jacob Louis; Diderichsen, Finn

    2012-09-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures of additive interaction derived from multiplicative models, an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly estimates the absolute size of the deviation from additivity and provides confidence intervals. In addition, the model can accommodate both continuous and categorical exposures, and it models both exposures and potential confounders on the same underlying scale. To illustrate the approach, we present an empirical example of interaction between education and smoking on risk of lung cancer. We argue that deviations from additivity of effects are important for public health interventions and clinical decision-making, and such estimations should be encouraged in prospective studies on health. A detailed implementation guide of the additive hazards model is provided in the appendix.
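
    The quantity of interest can be shown with simple arithmetic. All four hazard rates below are invented for illustration (not the paper's lung-cancer estimates): deviation from additivity is the joint-exposure rate minus what exact additivity of absolute effects would predict.

```python
# Made-up hazard rates (events per 1000 person-years) for two exposures,
# e.g. low education and smoking:
h00 = 0.5   # neither exposure
h10 = 1.5   # low education only
h01 = 4.0   # smoking only
h11 = 8.0   # both exposures

# Under exact additivity of absolute effects, the joint rate would be
# h10 + h01 - h00; any excess is the additive interaction.
expected_additive = h10 + h01 - h00   # 5.0
interaction = h11 - expected_additive # 3.0 excess events per 1000 person-years
```

    A positive value means the two exposures together cause more cases than the sum of their separate absolute effects, which is exactly the subgroup-targeting signal the abstract describes.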

  6. Quantitative analysis of airway abnormalities in CT

    NASA Astrophysics Data System (ADS)

    Petersen, Jens; Lo, Pechin; Nielsen, Mads; Edula, Goutham; Ashraf, Haseem; Dirksen, Asger; de Bruijne, Marleen

    2010-03-01

    A coupled surface graph cut algorithm for airway wall segmentation from Computed Tomography (CT) images is presented. Using cost functions that highlight both inner and outer wall borders, the method combines the search for both borders into one graph cut. The proposed method is evaluated on 173 manually segmented images extracted from 15 different subjects and shown to give accurate results, with 37% fewer errors than the Full Width at Half Maximum (FWHM) algorithm and 62% fewer than a similar graph cut method without coupled surfaces. Common measures of airway wall thickness, such as the Interior Area (IA) and Wall Area percentage (WA%), were measured by the proposed method on a total of 723 CT scans from a lung cancer screening study. These measures were significantly different for participants with Chronic Obstructive Pulmonary Disease (COPD) compared with asymptomatic participants. Furthermore, reproducibility was good, as confirmed by repeat scans, and the measures correlated well with the outcomes of pulmonary function tests, demonstrating the use of the algorithm as a COPD diagnostic tool. Additionally, a new measure of airway wall thickness is proposed, the Normalized Wall Intensity Sum (NWIS). NWIS is shown to correlate better with lung function test values and to be more reproducible than the previous measures IA, WA% and airway wall thickness at a lumen perimeter of 10 mm (Pi10).
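
    The standard measures named above follow directly from the two segmented borders. A minimal sketch assuming idealized circular cross-sections (the radii are invented; real scans use the segmented areas directly):

```python
import math

# IA and WA% from inner (lumen) and outer wall radii, assuming roughly
# circular airway cross-sections. Radii in mm are illustrative.

def airway_measures(inner_radius_mm, outer_radius_mm):
    ia = math.pi * inner_radius_mm ** 2      # Interior Area (lumen)
    total = math.pi * outer_radius_mm ** 2   # area inside the outer border
    wa_pct = 100.0 * (total - ia) / total    # Wall Area percentage
    return ia, wa_pct

ia, wa_pct = airway_measures(2.0, 3.0)
# the wall occupies 1 - (2/3)^2 = 5/9 of the total cross-section
assert abs(wa_pct - 100 * 5 / 9) < 1e-9
```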

  7. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In the modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The individual risk for sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools such as Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, some complexity measures that are based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with parameters in the frequency domain seems to be a promising way to arrive at a more precise definition of the individual risk. These findings have to be validated on a representative number of patients.
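
    One symbolic-dynamics complexity measure of the kind mentioned can be sketched as follows. The encoding rule, tolerance band, and word length are illustrative choices, not the authors' exact definitions:

```python
import math
from collections import Counter

# Encode successive RR-interval changes as symbols (+ / - / 0 within a
# tolerance band), then compute the Shannon entropy of short symbol "words".

def symbol(delta_ms, eps=10.0):
    if delta_ms > eps:
        return "+"
    if delta_ms < -eps:
        return "-"
    return "0"

def word_entropy(rr_ms, word_len=3):
    symbols = [symbol(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    words = ["".join(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

steady = [800.0] * 50                       # constant rhythm: a single word
varied = [800.0, 900.0, 700.0, 850.0] * 13  # alternating rhythm: more words
assert word_entropy(steady) == 0.0
assert word_entropy(varied) > word_entropy(steady)
```

    Low word entropy indicates a rigid, low-complexity rhythm; such measures are what flagged abnormalities that linear HRV analysis missed.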

  8. Quantitative 3D analysis of huge nanoparticle assemblies

    PubMed Central

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A.; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M.; van Blaaderen, Alfons; Joost Batenburg, K.; Van Tendeloo, Gustaaf

    2016-01-01

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed. PMID:26607629

  9. Quantitative analysis of the economically recoverable resource

    SciTech Connect

    Pulle, C.V.; Seskus, A.P.

    1981-05-01

    The objective of this study is to obtain estimates of the economically recoverable gas in the Appalachian Basin. The estimates were obtained in terms of a probability distribution, which quantifies the inherent uncertainty associated with estimates where geologic and production uncertainties prevail. It is established that well productivity on a county and regional basis is lognormally distributed, and the total recoverable gas is Normally distributed. The expected (mean) total economically recoverable gas is 20.2 trillion cubic feet (TCF) with a standard deviation of 1.6 TCF, conditional on the use of shooting technology on 160-acre well spacing. From properties of the Normal distribution, it is seen that a 95 percent probability exists for the total recoverable gas to lie between 17.06 and 23.34 TCF. The estimates are sensitive to well spacings and the technology applied to a particular geologic environment. It is observed that with smaller well spacings (for example, at 80 acres) the estimate is substantially increased, and that advanced technology, such as foam fracturing, has the potential of significantly increasing gas recovery. However, the threshold and optimum conditions governing advanced exploitation technology, based on well spacing and other parameters, were not analyzed in this study. Their technological impact on gas recovery is mentioned in the text where relevant; on the basis of a rough projection, an additional 10 TCF could be expected with the use of foam fracturing on wells with initial open flows lower than 300 MCFD. From the exploration point of view, the lognormal distribution of well productivity suggests that even in smaller areas, such as a county, intense exploration might be appropriate. This is evident from the small tail probabilities of the lognormal distribution, which represent the small number of wells with relatively very high productivity.
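
    The quoted interval is the standard Normal central 95% interval, which can be checked directly from the mean and standard deviation given in the abstract:

```python
# With total recoverable gas ~ Normal(mean 20.2 TCF, sd 1.6 TCF), the central
# 95% interval is mean +/- 1.96 * sd.
mean_tcf, sd_tcf = 20.2, 1.6
z95 = 1.96

low = mean_tcf - z95 * sd_tcf
high = mean_tcf + z95 * sd_tcf
print(round(low, 2), round(high, 2))   # 17.06 23.34, matching the abstract
```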

  10. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, the photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals. The convoluted nature of the signal distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for such samples, a deconvolution program is described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethylsiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are obtained using this program. The effects of the polymer molecular structure and of processing variation on the surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface differs from that in the bulk, because this aspect of surface structure may lead to properties different from those predicted simply by knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (ToF-SIMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of the PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregated at the surface. As potential minimal-fouling coatings or biocompatible cardiovascular materials, PU-DMS copolymers eventually come into contact with water once in use. Could such an environmental change (from air to aqueous) induce any undesirable

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. PMID:26456251
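
    The combination step is conventionally done in quadrature. A minimal sketch with invented component values (the abstract does not give the individual figures, only the 35% ceiling):

```python
import math

# Individual relative uncertainty components, expressed as fractions.
# Values are illustrative, not the study's estimates.
components = {
    "microorganism_type": 0.20,
    "product_matrix":     0.15,
    "reading_error":      0.18,
}

# Combined relative uncertainty: root sum of squares of the components.
combined = math.sqrt(sum(u ** 2 for u in components.values()))
assert combined < 0.35   # within the 35% limit quoted in the abstract
```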

  13. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S phase of the cell cycle. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity is discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this older method, and situations in which it is still useful, are also described. PMID:18428635

  15. Differentiation between Glioblastoma Multiforme and Primary Cerebral Lymphoma: Additional Benefits of Quantitative Diffusion-Weighted MR Imaging.

    PubMed

    Ko, Ching Chung; Tai, Ming Hong; Li, Chien Feng; Chen, Tai Yuan; Chen, Jeon Hor; Shu, Ginger; Kuo, Yu Ting; Lee, Yu Chang

    2016-01-01

    The differentiation between glioblastoma multiforme (GBM) and primary cerebral lymphoma (PCL) is important because the treatments are substantially different. The purpose of this article is to describe the MR imaging characteristics of GBM and PCL, with emphasis on quantitative ADC analysis in the tumor necrosis, the most strongly enhanced tumor area, and the peritumoral edema. This retrospective cohort study collected 104 GBM (WHO grade IV) patients and 22 immune-competent PCL (diffuse large B cell lymphoma) patients. All of these patients had pretreatment brain MR DWI and ADC imaging. Analysis of conventional MR imaging and quantitative ADC measurements, including the tumor necrosis (ADCn), the most strongly enhanced tumor area (ADCt), and the peritumoral edema (ADCe), was done. ROC analysis with optimal cut-off values and the area under the ROC curve (AUC) was performed. For conventional MR imaging, there are statistical differences in tumor size, tumor location, tumor margin, and the presence of tumor necrosis between GBM and PCL. Quantitative ADC analysis shows that GBM tended to have significantly (P<0.05) higher ADC in the most strongly enhanced area (ADCt) and lower ADC in the peritumoral edema (ADCe) compared with PCL. An excellent AUC (0.94), with optimal sensitivity of 90% and specificity of 86% for differentiating between GBM and PCL, was obtained by combining the ADC in the tumor necrosis (ADCn), the most strongly enhanced tumor area (ADCt), and the peritumoral edema (ADCe). In addition, there are positive ADC gradients in the peritumoral edema in a subset of GBMs but not in the PCLs. Quantitative ADC analysis in these three areas can thus be implemented to improve diagnostic accuracy for these two brain tumor types. The histological correlation of the ADC difference deserves further investigation. PMID:27631626
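
    The AUC of a single quantitative feature such as ADCt can be computed from the rank-sum (Mann-Whitney) identity without fitting anything. The ADC values below are synthetic stand-ins with the group sizes and direction from the abstract, not the study's data:

```python
import numpy as np

# AUC = P(randomly chosen positive scores higher than randomly chosen
# negative), with ties counted as half.
def auc_rank(pos, neg):
    pos = np.asarray(pos, float)
    neg = np.asarray(neg, float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(2)
adc_gbm = rng.normal(1.2, 0.2, 104)  # higher ADCt in GBM, per the abstract
adc_pcl = rng.normal(0.8, 0.2, 22)
auc = auc_rank(adc_gbm, adc_pcl)
assert 0.5 < auc <= 1.0
```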

  16. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representation of 'absorbance-wavenumber-retention time' data is used to control the quality of the GC separation. The spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and the concentration of caffeine are discussed at two steps of the data treatment.
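
    The calibration underlying such a quantitation is a linear fit of integrated absorbance against concentration; the correlation coefficient the abstract discusses measures how well that line holds. A sketch with invented calibration data (concentrations, absorbances, and units are all hypothetical):

```python
import numpy as np

# Hypothetical caffeine calibration: integrated absorbance vs concentration.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])             # mg/mL, assumed
absorbance = np.array([0.11, 0.21, 0.40, 0.82, 1.59])  # integrated A, assumed

slope, intercept = np.polyfit(conc, absorbance, 1)     # linear calibration
r = np.corrcoef(conc, absorbance)[0, 1]                # correlation coefficient
assert r > 0.999                                       # near-linear response

# Invert the fit to quantitate an unknown from its integrated absorbance.
unknown_conc = (0.60 - intercept) / slope
assert 2.5 < unknown_conc < 3.5
```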

  17. Quantitative analysis of fault slip evolution in analogue transpression models

    NASA Astrophysics Data System (ADS)

    Leever, Karen; Gabrielsen, Roy H.; Schmid, Dani; Braathen, Alvar

    2010-05-01

    A quantitative analysis of fault slip evolution in crustal scale brittle and brittle-ductile analogue models of doubly vergent transpressional wedges was performed by means of Particle Image Velocimetry (PIV). The kinematic analyses allow detailed comparison between model results and field kinematic data. This novel approach leads to better understanding of the evolution of transpressional orogens such as the Tertiary West Spitsbergen fold and thrust belt in particular and will advance the understanding of transpressional wedge mechanics in general. We ran a series of basal-driven models with convergence angles of 4, 7.5, 15 and 30 degrees. In these crustal scale models, brittle rheology was represented by quartz sand; in one model a viscous PDMS layer was included at shallow depth. Total sand pack thickness was 6cm, its extent 120x60cm. The PIV method was used to calculate a vector field from pairs of images that were recorded from the top of the experiments at a 2mm displacement increment. The slip azimuth on discrete faults was calculated and visualized by means of a directional derivative of this vector field. From this data set, several stages in the evolution of the models could be identified. The stages were defined by changes in the degree of displacement partitioning, i.e. slip along-strike and orthogonal to the plate boundary. A first stage of distributed strain (with no visible faults at the model surface) was followed by a shear lens stage with oblique displacement on pro- and retro-shear. The oblique displacement became locally partitioned during progressive displacement. During the final stage, strain was more fully partitioned between a newly formed central strike slip zone and reverse faults at the sides. Strain partitioning was best developed in the 15 degrees model, which shows near-reverse faults along both sides of the wedge in addition to strike slip displacement in the center. In further analysis we extracted average slip vectors for

  18. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
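
    The coverage effect the abstract reports can be illustrated with the textbook duplex (one active unit plus one spare) model; the failure rate and mission time below are illustrative values, not the F18 FCS figures:

```python
import math

# Duplex with constant per-unit failure rate lam and fault coverage c: the
# first failure is successfully handled with probability c, while an
# uncovered failure fails the system immediately. Standard Markov result:
#   R(t) = exp(-2*lam*t) + 2*c*(exp(-lam*t) - exp(-2*lam*t))

def duplex_reliability(lam, c, t):
    return (math.exp(-2 * lam * t)
            + 2 * c * (math.exp(-lam * t) - math.exp(-2 * lam * t)))

lam, t = 1e-4, 10.0                         # per-hour rate, 10-hour mission
perfect = duplex_reliability(lam, 1.0, t)   # ideal coverage
partial = duplex_reliability(lam, 0.99, t)  # 1% of failures uncovered
none = duplex_reliability(lam, 0.0, t)      # redundancy gives no benefit
assert perfect > partial > none             # coverage drives reliability
```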

  19. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can still accommodate some diversity in morphology and molecular composition to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster is a good model for studying the genetic regulation of axoneme assembly and motility, due to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitatively analyzing proteins that associate with differentiating and mature fly sperm flagella. As an example, we use the quantification of tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it challenging to obtain accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro installed in the image processing software ImageJ. It allows fluorescence levels in closely associated sperm tails to be measured through an exact distinction between positive and background signals, and it provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396

  20. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

Quantitative estimation of zoosporic fungi in the environment has historically received little attention, primarily because of methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi have mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods remain fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as for hybridization probes. More recently, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. As resources and databases expand, further application of next-generation sequencing technologies will advance our quantitative understanding not only of zoosporic fungal ecology but also of their function, through analysis of their genomes and gene expression. Nevertheless, these molecular approaches still need to be complemented with cultivation-based methods to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi.

  1. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  2. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  3. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

We propose an inter-A-scan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
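A minimal sketch of inter-A-scan decorrelation (not the authors' processing chain): successive A-scans share a fraction of their speckle pattern, and faster transverse flow replaces more of it, lowering the correlation. The mixing model below is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
depth = 512
a0 = rng.normal(size=depth)  # reference A-scan speckle pattern

def decorrelation(mix):
    """1 - Pearson correlation between successive A-scans, where `mix` is
    the fraction of fresh speckle between scans (a proxy for transverse
    speed in this toy model)."""
    a1 = np.sqrt(1 - mix) * a0 + np.sqrt(mix) * rng.normal(size=depth)
    return 1 - np.corrcoef(a0, a1)[0, 1]

slow, fast = decorrelation(0.1), decorrelation(0.6)
print(slow, fast)  # faster flow -> larger decorrelation
```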

  4. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used to quantitate microglial morphology, not only to categorize gross differences but also to differentiate subtle ones (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
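Box counting, the core technique reviewed above, can be sketched as follows: the fractal dimension is the slope of log(box count) versus log(1/box size). A filled square gives a dimension near 2 and a straight line near 1, which makes a handy sanity check:

```python
import numpy as np

def box_count_dimension(img, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal (box-counting) dimension of a binary image."""
    counts = []
    n = img.shape[0]
    for s in sizes:
        c = 0
        for i in range(0, n, s):         # count boxes of side s that contain
            for j in range(0, n, s):     # at least one foreground pixel
                if img[i:i + s, j:j + s].any():
                    c += 1
        counts.append(c)
    # dimension = slope of log(count) versus log(1/box size)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

square = np.ones((64, 64), dtype=bool)       # filled region: dimension ~2
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True                           # straight line: dimension ~1
d_square, d_line = box_count_dimension(square), box_count_dimension(line)
print(d_square, d_line)
```

Microglial silhouettes fall between these two extremes, which is what makes the dimension a useful discriminator.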

  5. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  6. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, utilizing a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  7. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
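The underlying culturomics measurement is simply a word's relative frequency per year of publication; a toy sketch with an invented three-year "corpus":

```python
# toy corpus: year -> text (the real corpus spans ~4% of all books ever printed)
corpus = {
    1900: "the telegraph and the telephone",
    1950: "the television and the telephone and the radio",
    2000: "the internet and the telephone",
}

def rel_freq(word, text):
    """Relative frequency of a 1-gram within one year's text."""
    tokens = text.split()
    return tokens.count(word) / len(tokens)

trend = {year: rel_freq("telephone", text) for year, text in corpus.items()}
print(trend)
```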

  10. Particle concentration measurement of virus samples using electrospray differential mobility analysis and quantitative amino acid analysis.

    PubMed

    Cole, Kenneth D; Pease, Leonard F; Tsai, De-Hao; Singh, Tania; Lute, Scott; Brorson, Kurt A; Wang, Lili

    2009-07-24

    Virus reference materials are needed to develop and calibrate detection devices and instruments. We used electrospray differential mobility analysis (ES-DMA) and quantitative amino acid analysis (AAA) to determine the particle concentration of three small model viruses (bacteriophages MS2, PP7, and phiX174). The biological activity, purity, and aggregation of the virus samples were measured using plaque assays, denaturing gel electrophoresis, and size-exclusion chromatography. ES-DMA was developed to count the virus particles using gold nanoparticles as internal standards. ES-DMA additionally provides quantitative measurement of the size and extent of aggregation in the virus samples. Quantitative AAA was also used to determine the mass of the viral proteins in the pure virus samples. The samples were hydrolyzed and the masses of the well-recovered amino acids were used to calculate the equivalent concentration of viral particles in the samples. The concentration of the virus samples determined by ES-DMA was in good agreement with the concentration predicted by AAA for these purified samples. The advantages and limitations of ES-DMA and AAA to characterize virus reference materials are discussed.
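The AAA-based particle count follows from simple stoichiometry: the measured coat-protein mass divided by the protein mass per virion. The copy number and subunit mass below are approximate literature values for MS2 and should be treated as illustrative, as should the measured mass and volume:

```python
N_A = 6.022e23              # Avogadro's number, 1/mol
copies = 180                # coat-protein copies per MS2 capsid (approximate)
m_subunit = 13.7e3          # coat-protein molar mass, g/mol (approximate)

mass_per_virion = copies * m_subunit / N_A       # g of coat protein per particle
protein_mass = 1.0e-6                            # g measured by AAA (invented)
volume_ml = 1.0                                  # sample volume (invented)

particles_per_ml = protein_mass / mass_per_virion / volume_ml
print(f"{particles_per_ml:.2e} particles/mL")
```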

  11. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

In biofluids (e.g., blood plasma), nanoparticles are readily coated with layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study of the interactions between human plasma proteins and nanoparticles with controlled, systematic variation of properties, using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles of three different surface chemistries and two sizes, at four different exposure times (24 different samples in total), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotope-labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data produced a distinctive pattern that classifies the nanoparticles according to their surface properties and size. In addition, the temporal data indicated that the stable protein 'corona' isolated for the quantitative analysis forms in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications for predicting nanoparticle biocompatibility.
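The 18O 'universal reference' design means every sample carries the same heavy-labeled internal standard, so sample-to-sample fold changes are ratios of light/heavy ratios and the reference cancels. A sketch with invented intensities:

```python
# light/heavy intensities for one protein in two samples, each spiked with
# the same 18O-labelled universal reference (all values invented)
sample_a = {"light": 8.0e6, "heavy": 2.0e6}
sample_b = {"light": 3.0e6, "heavy": 1.5e6}

ratio_a = sample_a["light"] / sample_a["heavy"]  # 4.0: sample A vs reference
ratio_b = sample_b["light"] / sample_b["heavy"]  # 2.0: sample B vs reference
fold_change = ratio_a / ratio_b                  # A vs B; the reference cancels
print(fold_change)
```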

  12. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because aberrant glycosylation of a glycoprotein is widely recognized to be involved in the progression of certain diseases, efficient analysis tools for aberrant glycoproteins are very important for a deeper understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  14. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    PubMed

    Zhang, Ai-Ju

    2013-10-01

Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of model analysis, the author uses infrared spectroscopy to study a quantitative analysis algorithm for gases. Applying data mining techniques to a multi-component infrared quantitative analysis algorithm showed that a partial least squares algorithm combined with cluster analysis is clearly more accurate than partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising method was found to improve analysis accuracy.
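The cluster-then-calibrate idea can be sketched with a toy model: calibration samples from two instrument conditions are first clustered, then a separate linear calibration is fitted per cluster. This sketch uses plain least squares instead of the paper's PLS, and all data are synthetic:

```python
import numpy as np

def two_means(v, iters=10):
    """Minimal 1-D two-means clustering, initialised at the extremes."""
    c = np.array([v.min(), v.max()])
    for _ in range(iters):
        labels = (np.abs(v - c[0]) > np.abs(v - c[1])).astype(int)
        c = np.array([v[labels == 0].mean(), v[labels == 1].mean()])
    return labels

rng = np.random.default_rng(1)
conc = rng.uniform(0.1, 1.0, 40)             # methane concentrations (synthetic)
baseline = np.repeat([0.0, 0.5], 20)         # two instrument/ambient conditions
absorb = 2.0 * conc + baseline + rng.normal(0, 0.01, 40)   # analytical band
channel2 = baseline + rng.normal(0, 0.01, 40)              # baseline-sensitive band

def calib_error(mask):
    """Mean absolute error of a straight-line calibration fitted on `mask`."""
    A = np.column_stack([absorb[mask], np.ones(mask.sum())])
    coef, *_ = np.linalg.lstsq(A, conc[mask], rcond=None)
    return np.abs(A @ coef - conc[mask]).mean()

labels = two_means(channel2)                 # cluster samples by condition
global_err = calib_error(np.ones(40, dtype=bool))
cluster_err = max(calib_error(labels == j) for j in (0, 1))
print(global_err, cluster_err)               # per-cluster models fit far better
```

A single global calibration is biased by the baseline shift, while the per-cluster fits recover the clean concentration-absorbance relation, mirroring the accuracy gain reported above.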

  15. Origins of stereoselectivity in the Diels-Alder addition of chiral hydroxyalkyl vinyl ketones to cyclopentadiene: a quantitative computational study.

    PubMed

    Bakalova, Snezhana M; Kaneti, Jose

    2008-12-18

Modest basis set level MP2/6-31G(d,p) calculations on the Diels-Alder addition of S-1-alkyl-1-hydroxy-but-3-en-2-ones (1-hydroxy-1-alkyl methyl vinyl ketones) to cyclopentadiene correctly reproduce the trends in known experimental endo/exo and diastereoface selectivity. B3LYP theoretical results at the same or significantly higher basis set level, on the other hand, do not satisfactorily model observed endo/exo selectivities and are thus unsuitable for quantitative studies. The same holds for subtle effects originating from, for example, conformational distributions of reactants. These shortcomings are not alleviated by the fact that observed diastereoface selectivities are well reproduced by DFT calculations. Quantitative computational studies of large cycloaddition systems would require larger basis sets and a better account of electron correlation than MP2, such as CCSD. Presently, however, with 30 or more non-hydrogen atoms, these computations are hardly feasible. We present quantitatively correct stereochemical predictions using a hybrid layered ONIOM computational approach, placing the chiral carbon atom and the intramolecular hydrogen bond in a higher-level layer, MP2/6-311G(d,p) or CCSD/6-311G(d,p). Significant computational economy is achieved by treating the surrounding bulky (alkyl) residues at 6-31G(d) in a low-level HF layer. We conclude that theoretical calculations based on explicit correlated MO treatment of the reaction site are sufficiently reliable for the prediction of both endo/exo and diastereoface selectivity of Diels-Alder addition reactions. This is in line with the understanding that endo/exo selectivity originates from dynamic electron correlation effects of the interacting pi fragments, while diastereofacial selectivity originates from steric interactions of fragments outside the Diels-Alder reaction site. PMID:18637663

  16. A diagnostic programme for quantitative analysis of proteinuria.

    PubMed

    Hofmann, W; Guder, W G

    1989-09-01

A spectrum of quantitative methods was adapted to the Kone Specific Analyser for the purpose of recognizing, quantifying and differentiating various forms of proteinuria. Total protein, IgG, albumin and alpha 1-microglobulin (measured by turbidimetry), N-acetyl-beta-D-glucosaminidase activity and creatinine (measured photometrically), were measured in undiluted urine; in addition alpha 1-microglobulin was measured in serum. Within and between run precision, accuracy and linearity of the turbidimetric methods were in good agreement with nephelometric procedures. All turbidimetric methods exhibited a correlation coefficient r greater than 0.98 when compared with the radial immunodiffusion procedure as reference method. Total protein measured turbidimetrically with the Kone Specific Analyser was in good agreement with the manual biuret procedure. The low detection limits and linearities allowed quantification of urine analytes from the lower range of normals up to ten times the upper limit of normals. The measured analytes exhibited stability in urine at pH 4-8 over at least seven days at 4-6 degrees C and -20 degrees C. Only IgG showed a significant loss (up to 30 percent), when measured after storage at -20 degrees C. Quantities per mol creatinine showed significantly lower intra-individual and inter-individual variability than quantities per liter. In 31 normal persons, the intraindividual variation was lowest for N-acetyl-beta-D-glucosaminidase activity (13%) and highest for total protein (33%), when measured in the second morning urine on 5 consecutive days. When related to creatinine, results obtained in the second morning urine showed no significant differences from those in 24 h urine, except for alpha 1-microglobulin which gave lower values in 24 h urines. The upper normal limits, calculated as the 95% ranges, were determined from 154 urines of 31 individuals. Nearly all analytes showed an asymmetric distribution. Because of a wide tailing of the upper limit...

  17. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment; the main window of the program during dynamic analysis of the foot thermal image is shown. PMID:26556680

  20. The Quantitative Analysis of an Analgesic Tablet: An NMR Experiment for the Instrumental Analysis Course

    NASA Astrophysics Data System (ADS)

    Schmedake, Thomas A.; Welch, Lawrence E.

    1996-11-01

A quantitative analysis experiment using 13C NMR is outlined. Initial work uses a known compound (acenaphthene) to assess the type of NMR experiment necessary to achieve a proportional response from all of the carbons in the compound. Both gated decoupling and inverse gated decoupling routines with a variety of delay times are inspected, in addition to paramagnetic additives in conjunction with inverse gated decoupling. Once the experiments with the known compound have illuminated the merits of the differing strategies for obtaining a proportional carbon response, a quantitative assessment of an unknown analgesic tablet is undertaken. The amounts of the two major components of the tablet, acetaminophen and aspirin, are determined following addition of an internal standard to the mixture. The carbon resonances emanating from each compound can be identified using spectra of the pure analgesic components and the internal standard. Knowing the concentration of the internal standard and assuming a proportional response from all carbons in the sample allows calculation of the amount of both analytes in the analgesic tablets. Data from an initial laboratory trial are presented that illustrate the accuracy of the procedure.
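The internal-standard arithmetic is straightforward once the per-carbon response is proportional: moles of analyte follow from the ratio of per-carbon peak integrals. The integrals and amounts below are invented for illustration; only the aspirin molar mass is a real constant:

```python
# one resolved carbon resonance per compound (all values invented)
I_std, n_std = 100.0, 1      # internal standard: integral, carbons under peak
I_asa, n_asa = 75.0, 1       # aspirin: integral, carbons under peak
mol_std = 0.50e-3            # moles of internal standard added

# proportional per-carbon response: moles scale with integral per carbon
mol_asa = mol_std * (I_asa / n_asa) / (I_std / n_std)
mass_asa = mol_asa * 180.16  # aspirin molar mass, g/mol
print(f"{mass_asa * 1e3:.1f} mg aspirin in the sampled portion")
```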

  1. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

Quantitative risk analysis (QRA) of industrial facilities has to take into account the multiple hazards threatening critical equipment. Nevertheless, engineering procedures for evaluating the effect of seismic action quantitatively are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other natural hazards, either directly or through cascade ('domino') effects. This paper addresses the integration of structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) through a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with a QRA that considers only process-related top events is reported for reference. PMID:15908107
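The hazard-fragility combination at the core of such a QRA can be sketched as a discrete convolution: the annual failure probability is the hazard-weighted sum of the fragility over intensity bins. All parameters below are illustrative, not those of the paper's case study:

```python
import math

def lognormal_fragility(pga, median, beta):
    """P(component failure | PGA), lognormal fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

# annual probability that peak ground acceleration falls in each bin (invented)
hazard = [(0.1, 1e-2), (0.2, 3e-3), (0.4, 8e-4), (0.8, 1e-4)]  # (PGA in g, P)
median, beta = 0.6, 0.5   # illustrative tank fragility parameters

p_fail = sum(p * lognormal_fragility(pga, median, beta) for pga, p in hazard)
print(f"annual P(seismic-induced failure) = {p_fail:.2e}")
```

This failure probability would then feed the consequence analysis for the loss-of-containment scenarios it triggers.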

  2. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and their LC-MS/MS analysis was developed using a urea- and SDS-based buffer, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, together with the variability in quantitative analysis associated with each sampling strategy, and allow a proper number of replicates to be defined for future quantitative analyses. PMID:27358910

  3. Additivity in the Analysis and Design of HIV Protease Inhibitors

    PubMed Central

    Jorissen, Robert N.; Kiran Kumar Reddy, G. S.; Ali, Akbar; Altman, Michael D.; Chellappan, Sripriya; Anjum, Saima G.; Tidor, Bruce; Schiffer, Celia A.; Rana, Tariq M.; Gilson, Michael K.

    2009-01-01

    We explore the applicability of an additive treatment of substituent effects to the analysis and design of HIV protease inhibitors. Affinity data for a set of inhibitors with a common chemical framework were analyzed to provide estimates of the free energy contribution of each chemical substituent. These estimates were then used to design new inhibitors, whose high affinities were confirmed by synthesis and experimental testing. Derivations of additive models by least-squares and ridge-regression methods were found to yield statistically similar results. The additivity approach was also compared with standard molecular descriptor-based QSAR; the latter was not found to provide superior predictions. Crystallographic studies of HIV protease-inhibitor complexes help explain the perhaps surprisingly high degree of substituent additivity in this system, and allow some of the additivity coefficients to be rationalized on a structural basis. PMID:19193159
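The additive model described above is a linear least-squares fit of affinities on substituent indicator variables; the design matrix and free energies below are invented purely to show the mechanics:

```python
import numpy as np

# rows: inhibitors; columns: substituent indicators R1a, R1b, R2a (invented)
S = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
], dtype=float)
X = np.column_stack([np.ones(len(S)), S])        # constant = shared framework
dG = np.array([-10.0, -11.0, -12.5, -13.5])      # binding free energies, kcal/mol

coef, *_ = np.linalg.lstsq(X, dG, rcond=None)    # per-substituent contributions
pred = X @ coef
print(coef[3], np.abs(pred - dG).max())          # R2a contributes -2.5 kcal/mol
```

With perfectly additive data the fit is exact, and each coefficient is the free energy contribution attributed to that substituent (here the R2a contribution is identifiable even though the R1 indicators are collinear with the constant).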

  4. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
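Transfer Entropy itself is a plug-in estimate over joint symbol counts. A minimal history-length-1 sketch on synthetic binary series (not the authors' delay-differential glycolysis model) shows a strong Y→X information flow when X copies Y:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in Transfer Entropy TE(Y -> X) in bits, history length 1."""
    x1, x0, y0 = x[1:], x[:-1], y[:-1]
    n = len(x1)
    p_xxy = Counter(zip(x1, x0, y0))   # counts of (x_{t+1}, x_t, y_t)
    p_xy = Counter(zip(x0, y0))
    p_xx = Counter(zip(x1, x0))
    p_x = Counter(x0.tolist())
    te = 0.0
    for (a, b, c), k in p_xxy.items():
        te += (k / n) * np.log2((k / p_xy[(b, c)]) / (p_xx[(a, b)] / p_x[b]))
    return te

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1] ^ (rng.random(4999) < 0.05)   # x copies y with 5% bit flips
te_yx, te_xy = transfer_entropy(x, y), transfer_entropy(y, x)
print(te_yx, te_xy)   # strong Y->X flow, essentially no X->Y flow
```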

  5. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored. PMID:27444661

  7. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  8. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.

  9. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036

  10. Analysis of Artifacts Suggests DGGE Should Not Be Used For Quantitative Diversity Analysis

    PubMed Central

    Neilson, Julia W.; Jordan, Fiona L.; Maier, Raina M.

    2014-01-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification, and to highlight their impact on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated in pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of the preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but method-specific artifacts preclude its use for accurate comparative diversity analysis. PMID:23313091

  11. Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.

    ERIC Educational Resources Information Center

    Raymond, Margaret; And Others

    1983-01-01

    Describes an experiment on the simultaneous determination of chromium and magnesium by spectrophotometry, modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…

  12. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-01-01

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections. PMID:27548134

  13. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled-porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described. PMID:8905629

  14. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  16. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  17. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

    In electroencephalographic (EEG) practice, muscle activity during seizures is often considered an irritating artefact. This article discusses how surface electromyography (EMG) can turn it into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons; therefore, EMG signals provide direct information about the electric activity in the motor cortex. Qualitative analysis of EMG has traditionally been a part of long-term video-EEG recordings. Recent developments in quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that they differ from maximal voluntary contraction and from convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of generalised tonic-clonic seizures (GTCS) proved to have different quantitative features than tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of the GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of the seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiologic substrate for developing neurophysiologic biomarkers that accurately identify GTCS. This proved to be efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures.
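
    The kinds of quantitative EMG features such analyses build on can be sketched with a few generic signal measures (an illustrative sketch only, not the published biomarker algorithm; `emg_features` is a hypothetical helper):

```python
import numpy as np

def emg_features(signal, fs):
    """Generic quantitative surface-EMG features: RMS amplitude, zero-crossing
    count, and spectral median frequency.  Illustrative sketch, not the
    published seizure-detection algorithm."""
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                       # remove DC offset
    rms = float(np.sqrt(np.mean(sig ** 2)))      # amplitude measure
    signs = np.signbit(sig).astype(int)
    zc = int(np.sum(np.abs(np.diff(signs))))     # zero crossings
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(sig)) ** 2
    csum = np.cumsum(power)
    mf = float(freqs[np.searchsorted(csum, csum[-1] / 2.0)])  # median frequency
    return {"rms": rms, "zero_crossings": zc, "median_freq_hz": mf}
```

    On a pure 50 Hz sine sampled at 1 kHz, for instance, the RMS is close to 1/√2 and the median frequency sits at the 50 Hz bin.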

  18. A global analysis of soil acidification caused by nitrogen addition

    NASA Astrophysics Data System (ADS)

    Tian, Dashuan; Niu, Shuli

    2015-02-01

    Nitrogen (N) deposition-induced soil acidification has become a global problem. However, the response patterns of soil acidification to N addition and the underlying mechanisms remain far from clear. Here, we conducted a meta-analysis of 106 studies to reveal global patterns of soil acidification in response to N addition. We found that N addition significantly reduced soil pH by 0.26 on average globally. However, the responses of soil pH varied with ecosystem type, N addition rate, N fertilization form, and experimental duration. Soil pH decreased most in grassland, whereas no decrease was observed in boreal forest in response to N addition. Soil pH decreased linearly with N addition rate. Addition of urea and NH4NO3 contributed more to soil acidification than NH4-form fertilizer. When the experimental duration was longer than 20 years, the effect of N addition on soil acidification diminished. Environmental factors such as initial soil pH, soil carbon and nitrogen content, precipitation, and temperature all influenced the responses of soil pH. Base cations (Ca2+, Mg2+ and K+) were critically important in buffering against N-induced soil acidification at the early stage; however, N addition has shifted global soils into the Al3+ buffering phase. Overall, this study indicates that acidification in global soils is very sensitive to N deposition, which is greatly modified by biotic and abiotic factors. Global soils are now at a buffering transition from base cations (Ca2+, Mg2+ and K+) to non-base cations (Mn2+ and Al3+). This calls attention to the depletion of base cations and the toxic impact of non-base cations in terrestrial ecosystems under N deposition.
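
    The meta-analytic averaging behind a pooled effect such as the reported mean pH change can be sketched as an inverse-variance weighted mean (a fixed-effect sketch with made-up numbers; the cited study may well use a random-effects model):

```python
import numpy as np

def weighted_mean_effect(effects, variances):
    """Fixed-effect meta-analytic pooling: inverse-variance weighted average
    of per-study effect sizes (here, pH change under N addition) and its
    standard error.  Illustrative only."""
    w = 1.0 / np.asarray(variances, dtype=float)   # weight = 1 / variance
    e = np.asarray(effects, dtype=float)
    mean = float(np.sum(w * e) / np.sum(w))
    se = float(np.sqrt(1.0 / np.sum(w)))           # SE of the pooled effect
    return mean, se
```

    With three hypothetical studies reporting pH changes of -0.30, -0.20 and -0.28, each with variance 0.01, the pooled estimate is -0.26.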

  19. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  20. [Kinetic analysis of additive effect on desulfurization activity].

    PubMed

    Han, Kui-hua; Zhao, Jian-li; Lu, Chun-mei; Wang, Yong-zheng; Zhao, Gai-ju; Cheng, Shi-qing

    2006-02-01

    The effects of Al2O3, Fe2O3 and MnCO3 additives on CaO sulfation kinetics were investigated by thermogravimetric analysis and a modified grain model. The activation energy (Ea) and pre-exponential factor (k0) of the surface reaction, and the activation energy (Ep) and pre-exponential factor (D0) of product-layer diffusion, were calculated according to the model. Addition of MnCO3 can enhance the initial reaction rate, product-layer diffusion and the final CaO conversion of sorbents; its effect mechanism is similar to that of Fe2O3. The method based on the isokinetic temperature Ts and activation energy cannot estimate the contribution of an additive to sulfation reactivity, whereas the rate constant of the surface reaction (k) and the effective diffusivity of the reactant in the product layer (Ds) under given experimental conditions can reflect the effect of additives on the activity. Non-stoichiometric metal oxides may catalyze the surface reaction and promote the diffusivity of the reactant in the product layer through crystal defects and distinct diffusion of cations and anions. According to the mechanism and effect of additives on sulfation, the effective temperature and the stoichiometric relation of the reaction, it is possible to improve the utilization of the sorbent by compounding several additives into the calcium-based sorbent.
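
    The fitted kinetic parameters plug into the Arrhenius form, and the isokinetic temperature mentioned above is the temperature at which two parameter sets give equal rate constants; a small sketch with hypothetical values (not the paper's fitted parameters):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_constant(k0, ea, temp_k):
    """Arrhenius form k = k0 * exp(-Ea / (R*T)); grain-model fits supply
    (k0, Ea) for the surface reaction and (D0, Ep) for product-layer diffusion."""
    return k0 * math.exp(-ea / (R * temp_k))

def isokinetic_temperature(k0_a, ea_a, k0_b, ea_b):
    """Temperature at which two sorbents have equal rate constants:
    setting k_a(T) = k_b(T) gives Ts = (Ea_b - Ea_a) / (R * ln(k0_b / k0_a))."""
    return (ea_b - ea_a) / (R * math.log(k0_b / k0_a))

# Hypothetical parameter sets for two sorbents (not values from the paper).
ts = isokinetic_temperature(1e3, 5e4, 1e4, 7e4)
```

    At `ts` the two hypothetical rate constants coincide; away from it their ranking flips, which is why Ts alone cannot rank additive effectiveness at an arbitrary working temperature.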

  1. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.
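
    For orientation, measured deviation angles map to refractive-index and solute-concentration gradients through the standard small-angle schlieren relation; a sketch with hypothetical cell parameters (`n0`, the path length, and the `dn/dc` value are placeholders, not flight values):

```python
def refractive_index_gradient(deviation_rad, path_length_m, n0=1.33):
    """Small-angle schlieren relation for a cell of optical path length L:
    a transverse index gradient dn/dy deflects the beam by
    epsilon ~ (L / n0) * (dn/dy), so dn/dy = epsilon * n0 / L."""
    return deviation_rad * n0 / path_length_m

def concentration_gradient(dn_dy, dn_dc):
    """Convert an index gradient to a solute-concentration gradient assuming
    a locally linear n(c):  dc/dy = (dn/dy) / (dn/dc)."""
    return dn_dy / dn_dc
```

    For example, the 0.5 mrad resolution quoted above, across a hypothetical 1 cm cell of water-like index, corresponds to an index gradient of about 0.067 per metre.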

  2. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  3. Fluorescent foci quantitation for high-throughput analysis

    PubMed Central

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
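
    The core measurement in foci quantitation can be sketched generically as integrated intensity above a local background (a toy sketch, not the FociQuant implementation; the geometry parameters are arbitrary):

```python
import numpy as np

def focus_intensity(img, center, radius=3, bg_width=2):
    """Integrated fluorescence of one focus: sum the pixels within `radius`
    of the focus centre, then subtract the median of a surrounding annulus
    times the focus area (local background correction).  Generic sketch,
    not the FociQuant implementation."""
    yy, xx = np.ogrid[:img.shape[0], :img.shape[1]]
    r2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2
    core = r2 <= radius ** 2                                  # focus disc
    ring = (r2 > radius ** 2) & (r2 <= (radius + bg_width) ** 2)  # background annulus
    bg = np.median(img[ring])
    return float(img[core].sum() - bg * core.sum())
```

    The median (rather than mean) background makes the estimate robust to a few bright pixels spilling into the annulus.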

  4. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
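
    The counting step of such binary imaging analysis can be sketched as connected-component labeling of a thresholded image (a generic sketch, not the authors' semi-automated pipeline; `min_area` is an assumed debris filter):

```python
import numpy as np
from collections import deque

def count_axons(binary_img, min_area=5):
    """Count myelinated-axon candidates in a thresholded nerve cross-section:
    label 4-connected components with a BFS flood fill and keep those whose
    pixel area meets a minimum.  Generic binary-imaging sketch."""
    img = np.asarray(binary_img, dtype=bool)
    seen = np.zeros_like(img)
    count = 0
    for y, x in zip(*np.nonzero(img)):
        if seen[y, x]:
            continue
        area, q = 0, deque([(y, x)])
        seen[y, x] = True
        while q:                                   # flood-fill one component
            cy, cx = q.popleft()
            area += 1
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and img[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    q.append((ny, nx))
        if area >= min_area:                       # ignore debris-sized blobs
            count += 1
    return count
```

    Per-component areas and diameters, stratified by bitplane as in the abstract, would follow from the same labeling pass.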

  5. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of the ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, contrary to the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. PMID:27354014
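
    The final step, combining probability of occurrence with expected impact, can be sketched as a simple expected-impact score per scenario (probabilities and impact scores below are illustrative placeholders, not the study's elicited values):

```python
def expected_risk(register):
    """Expected impact of one implementation scenario: sum of
    probability x impact over its reduced risk-matrix entries; lower is
    better.  Entries are illustrative placeholders."""
    return sum(p * impact for p, impact in register)

# Hypothetical reduced risk registers for two implementation options.
public_operation = [(0.4, 5.0), (0.2, 3.0)]   # e.g. cost overrun, inefficiency
ppp_concession   = [(0.3, 8.0), (0.5, 4.0)]   # e.g. contract failure, disputes
```

    With these made-up numbers the public-operation scenario scores 2.6 against 4.4 for the concession, illustrating how a quantified matrix can overturn the default preference for a public-private partnership.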

  6. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
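
    The pseudo-predator construction described above can be sketched as bootstrap resampling of prey signature libraries mixed by known diet proportions (a minimal sketch; the paper's contribution, the objective choice of the bootstrap sample size `n_boot`, is not reproduced here):

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    """Build one pseudo-predator fatty acid signature: bootstrap `n_boot`
    signatures from each prey type's library (rows = sampled animals),
    average them, and mix the prey means by the known diet proportions.
    Minimal sketch of the simulation step described in the abstract."""
    n_fa = next(iter(prey_sigs.values())).shape[1]   # number of fatty acids
    mix = np.zeros(n_fa)
    for prey, proportion in diet.items():
        lib = prey_sigs[prey]
        idx = rng.integers(0, lib.shape[0], size=n_boot)   # bootstrap rows
        mix += proportion * lib[idx].mean(axis=0)
    return mix / mix.sum()                           # renormalize to proportions
```

    Feeding such pseudo-predators with known diets back through a QFASA estimator is what lets simulations quantify estimator bias and variance.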

  7. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
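
    The pseudometric idea can be illustrated with a toy distance between the density distributions of two maps: it is symmetric and obeys the triangle inequality, but two different maps with identical density distributions sit at distance zero, hence a pseudometric rather than a metric (an illustrative stand-in, not one of the paper's actual output functions):

```python
import numpy as np

def map_distance(map_a, map_b, bins=32):
    """Toy pseudometric on column-density maps: total-variation distance
    between the normalized histograms of pixel values (maps are flattened).
    Returns a value in [0, 1]."""
    a, b = np.ravel(map_a), np.ravel(map_b)
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    ha, _ = np.histogram(a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(b, bins=bins, range=(lo, hi), density=True)
    binwidth = (hi - lo) / bins
    return 0.5 * float(np.abs(ha - hb).sum()) * binwidth
```

    Because any rearrangement of the same pixel values has the same histogram, distinct maps can be at distance zero, which is exactly the pseudometric (rather than metric) behavior the formal system exploits.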

  8. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to clinical environment. These limitations enforce an urgent need for the creation of a new system to quantify facial movement and allow for an easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assess complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic) making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664

  9. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717

  10. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, efficient analytical tools for aberrant glycoproteins are essential for understanding the pathological functions of these glycoproteins and for developing new biomarkers. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Rapid Commun. Mass Spec Rev 34:148–165, 2015. PMID:24889823

  11. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments involved quantitative three-dimensional human movement analysis in microgravity. To this end, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian Space Station MIR and eight in-flight sessions were performed. The technology and method for the collection of kinematic data are described, evaluating the accuracy of three-dimensional marker localisation. Results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and raise a set of "lessons learned" leading to the improvement of motion-analyser performance together with swifter on-board operations. From the experimental program of T4, results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms end within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance within never-abandoned terrestrial strategies.

  12. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
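
The core numerical step, moving-average smoothing followed by least-squares fitting of overlapping Lorentzian components, can be sketched as follows. The synthetic spectrum and peak parameters are hypothetical, and `scipy.optimize.curve_fit` stands in for the program's own least-squares routine:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, a, x0, w):
    """Lorentzian line shape: amplitude a, centre x0, half-width w."""
    return a * w**2 / ((x - x0)**2 + w**2)

def two_lorentzians(x, a1, x01, w1, a2, x02, w2):
    """A major component overlapping with a trace component."""
    return lorentzian(x, a1, x01, w1) + lorentzian(x, a2, x02, w2)

x = np.linspace(0.0, 10.0, 500)
truth = (100.0, 4.0, 0.3, 1.0, 4.8, 0.3)        # trace peak at 1% intensity
rng = np.random.default_rng(1)
y = two_lorentzians(x, *truth) + rng.normal(0, 0.05, x.size)

# Smooth random errors with a moving average, as described in the abstract.
kernel = np.ones(5) / 5
y_smooth = np.convolve(y, kernel, mode='same')

# Least-squares determination of the Lorentzian coefficients.
popt, _ = curve_fit(two_lorentzians, x, y_smooth,
                    p0=(90, 3.9, 0.2, 2, 4.7, 0.2))
```

Even the 1%-intensity trace component's centre is recovered accurately, which is the sense in which such fitting "separates overlapping components."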

  13. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent to the echocardiographic instruments. Thus, the aim of this study is to quantify flow by combining a stable contrast agent and a modified echo equipment, able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus injected into an in vitro calf vein, at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein, the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, contrast washout slope and mean transit time (MTT) of curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing as to flow quantitation, conventional images were collected in the same flow model by two different scanners: a) the mechanical sector scanner used for RF analysis, and b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain setting controls of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too long to affect MTT significantly
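
The mean transit time used above is conventionally the intensity-weighted mean time of the indicator-dilution (time-intensity) curve in the absence of recirculation. A sketch with a synthetic gamma-variate curve (hypothetical values, not the study's data):

```python
import numpy as np

def mean_transit_time(t, intensity):
    """MTT estimate: intensity-weighted mean time of a time-intensity
    curve sampled on a uniform time grid."""
    intensity = np.asarray(intensity, dtype=float)
    return float((t * intensity).sum() / intensity.sum())

# Gamma-variate washin/washout model, a common fit for contrast curves.
t = np.linspace(0.0, 20.0, 2001)   # seconds
curve = t ** 2 * np.exp(-t)        # peaks at t = 2 s; analytic MTT = 3 s
mtt = mean_transit_time(t, curve)
```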

  14. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  15. [Quantitative analysis for mast cells in obstructive sialadenitis].

    PubMed

    Diao, G X

    1993-03-01

    Quantitative analysis of mast cells in 27 cases of obstructive sialadenitis, 12 cases of near-normal salivary gland tissue, and 5 cases of lymphoepithelial lesion of the salivary glands shows that the number of mast cells increases slightly with the severity grade of obstructive sialadenitis. This increase is closely related to salivary gland fibrosis and to the degree of inflammatory cell infiltration (dominated by lymphocytes), but not to patient age. In cases of benign lymphoepithelial lesion of the salivary glands with malignant transformation, whether to malignant lymphoma or to squamous cell carcinoma, the number of mast cells is markedly decreased.

  16. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is an ever increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, which assays both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture: lipids are universally distributed throughout cells, providing a means of assessment independent of culturability.

  17. ANALYSIS OF MPC ACCESS REQUIREMENTS FOR ADDITION OF FILLER MATERIALS

    SciTech Connect

    W. Wallin

    1996-09-03

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) in response to a request received via a QAP-3-12 Design Input Data Request (Ref. 5.1) from WAST Design (formerly MRSMPC Design). The request is to provide: Specific MPC access requirements for the addition of filler materials at the MGDS (i.e., location and size of access required). The objective of this analysis is to provide a response to the foregoing request. The purpose of this analysis is to provide a documented record of the basis for the response. The response is stated in Section 8 herein. The response is based upon requirements from an MGDS perspective.

  18. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, statistical methods for identifying epistasis in multiple phenotypes remain largely unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  20. Sensitive and cost-effective LC-MS/MS method for quantitation of CVT-6883 in human urine using sodium dodecylbenzenesulfonate additive to eliminate adsorptive losses.

    PubMed

    Chen, Chungwen; Bajpai, Lakshmikant; Mollova, Nevena; Leung, Kwan

    2009-04-01

    CVT-6883, a novel selective A(2B) adenosine receptor antagonist currently under clinical development, is highly lipophilic and exhibits high affinity for non-specific binding to container surfaces, resulting in very low recovery in urine assays. Our study showed the use of sodium dodecylbenzenesulfonate (SDBS), a low-cost additive, eliminated non-specific binding problems in the analysis of CVT-6883 in human urine without compromising sensitivity. A new sensitive and selective LC-MS/MS method for quantitation of CVT-6883 in the range of 0.200-80.0 ng/mL using the SDBS additive was therefore developed and validated for the analysis of human urine samples. The recoveries during sample collection, handling and extraction for the analyte and internal standard (d(5)-CVT-6883) were higher than 87%. CVT-6883 was found stable under the following conditions: in extract, at ambient temperature for 3 days and under refrigeration (5 degrees C) for 6 days; in human urine (containing 4 mM SDBS), after three freeze/thaw cycles, at ambient temperature for 26 h, under refrigeration (5 degrees C) for 94 h, and in a freezer set to -20 degrees C for at least 2 months. The results demonstrated that the validated method is sufficiently sensitive, specific, and cost-effective for the analysis of CVT-6883 in human urine and will provide a powerful tool to support the clinical programs for CVT-6883.
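
Quantitation against the deuterated internal standard reduces to a linear calibration of analyte/IS peak-area ratios over the validated range, then back-calculation of unknowns. The peak-area ratios below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical calibration standards spanning the validated 0.200-80.0 ng/mL
# range: concentration vs analyte/d5-IS peak-area ratio.
conc = np.array([0.2, 0.5, 2.0, 10.0, 40.0, 80.0])        # ng/mL
ratio = np.array([0.011, 0.026, 0.101, 0.502, 2.010, 4.015])

slope, intercept = np.polyfit(conc, ratio, 1)  # unweighted linear fit
r = float(np.corrcoef(conc, ratio)[0, 1])      # linearity check

def back_calculate(peak_ratio):
    """Concentration of an unknown from its observed area ratio."""
    return float((peak_ratio - intercept) / slope)

unknown = back_calculate(1.0)   # ng/mL for a sample with ratio 1.0
```

Validated bioanalytical methods often use a weighted (e.g. 1/x²) fit instead of the unweighted one shown, to keep accuracy at the low end of a 400-fold range.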

  1. Quantitative analysis of echogenicity for patients with thyroid nodules

    PubMed Central

    Wu, Ming-Hsun; Chen, Chiung-Nien; Chen, Kuen-Yuan; Ho, Ming-Chih; Tai, Hao-Chih; Wang, Yu-Hsin; Chen, Argon; Chang, King-Jen

    2016-01-01

    Hypoechogenicity has been described qualitatively and is potentially subject to intra- and inter-observer variability. The aim of this study was to clarify whether quantitative echoic indexes (EIs) are useful for the detection of malignant thyroid nodules. Overall, 333 participants with 411 nodules were included in the final analysis. Quantification of echogenicity was performed using commercial software (AmCAD-UT; AmCad BioMed, Taiwan). The coordinates of three defined regions, the nodule, thyroid parenchyma, and strap muscle regions, were recorded in the database separately for subsequent analysis. The results showed that hypoechogenicity, as assessed qualitatively by clinicians (ultrasound echogenicity, US-E), was an independent factor for malignancy. The EI, adjusted EI (EIN-T; EIN-M) and automatic EI(N-R)/R values between benign and malignant nodules were all significantly different, with lower values for malignant nodules. All of the EIs showed similar sensitivity and specificity and had better accuracy than US-E. In conclusion, the proposed quantitative EIs appear to be an important advance over the conventional qualitative US-E, allowing a more reliable distinction between benign and malignant thyroid nodules. PMID:27762299
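
The commercial indexes are proprietary, but the idea of quantifying nodule echogenicity relative to reference tissues can be sketched as follows. The EI definitions here are illustrative guesses (muscle as a dark reference, parenchyma as a bright one), not the AmCAD-UT formulas:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical gray-level samples from the three outlined regions.
nodule     = rng.normal(60, 5, 400)    # hypoechoic nodule
parenchyma = rng.normal(110, 5, 400)   # normal thyroid tissue
muscle     = rng.normal(40, 5, 400)    # strap muscle (dark reference)

def echoic_index(nod, par):
    """Illustrative EI: nodule brightness relative to parenchyma;
    values < 1 indicate a hypoechoic nodule."""
    return float(nod.mean() / par.mean())

def adjusted_ei(nod, par, mus):
    """EI rescaled between the muscle (dark) and parenchyma (bright)
    reference levels: 0 = as dark as muscle, 1 = isoechoic."""
    return float((nod.mean() - mus.mean()) / (par.mean() - mus.mean()))

ei = echoic_index(nodule, parenchyma)
ei_adj = adjusted_ei(nodule, parenchyma, muscle)
```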

  2. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these available recording techniques, EOG is a major one for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while five subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.
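
A standard baseline for separating saccades from smooth pursuit in an EOG trace, not the study's own classifier, is velocity-threshold event detection. The sketch below flags a saccade in a synthetic recording; the sampling rate, signal units, and threshold are assumed values:

```python
import numpy as np

fs = 250.0                       # sampling rate in Hz (assumed)
t = np.arange(0.0, 4.0, 1 / fs)
# Synthetic horizontal EOG in degrees: slow pursuit ramp (5 deg/s)
# plus one abrupt 40-degree saccade at t = 2 s.
eog = 5.0 * t + 40.0 * (t >= 2.0)

def detect_saccades(signal, fs, vel_thresh=100.0):
    """Velocity-threshold detector: return sample indices where the
    instantaneous eye velocity exceeds vel_thresh (deg/s)."""
    velocity = np.gradient(signal) * fs
    return np.where(np.abs(velocity) > vel_thresh)[0]

idx = detect_saccades(eog, fs)   # samples around the t = 2 s saccade
```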

  3. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  4. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
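
Principal component regression, one of the factor-analysis methods compared above, can be sketched in a few lines: project mean-centred spectra onto their leading principal components, then regress the property of interest on the scores. The two-band synthetic "spectra" stand in for real infrared data; PLS differs in that it also weights the factors by their covariance with the property:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression of property y on spectra X."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = vt[:n_components].T             # spectral loading vectors
    scores = Xc @ loadings                     # sample scores
    coef, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return x_mean, y_mean, loadings, coef

def pcr_predict(model, X):
    x_mean, y_mean, loadings, coef = model
    return (X - x_mean) @ loadings @ coef + y_mean

# Synthetic spectra: two overlapping Gaussian bands; the first band's
# weight is the analyte concentration, the second is an interferent.
rng = np.random.default_rng(3)
wav = np.linspace(0, 1, 50)
band1 = np.exp(-((wav - 0.3) / 0.05) ** 2)
band2 = np.exp(-((wav - 0.6) / 0.05) ** 2)
c = rng.uniform(0, 1, 30)                      # analyte concentrations
X = np.outer(c, band1) + np.outer(rng.uniform(0, 1, 30), band2)
X += rng.normal(0, 0.01, X.shape)              # measurement noise

model = pcr_fit(X, c, n_components=2)
pred = pcr_predict(model, X)
rmse = float(np.sqrt(np.mean((pred - c) ** 2)))
```

Two components suffice here because the spectra vary in a two-dimensional subspace; choosing the number of factors is the central model-selection problem in both PCR and PLS.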

  5. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  6. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  7. Quantitative analysis of intermolecular interactions in orthorhombic rubrene.

    PubMed

    Hathwar, Venkatesha R; Sist, Mattia; Jørgensen, Mads R V; Mamakhel, Aref H; Wang, Xiaoping; Hoffmann, Christina M; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-09-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H-H interactions. The electron density features of H-H bonding, and the interaction energy of molecular dimers connected by H-H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  8. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks.
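
The simplest pipeline of the kind these packages automate, global thresholding followed by connected-component labelling and per-object measurement, can be sketched with `scipy.ndimage` on a synthetic "nuclei" image (not a substitute for the interactive tools described):

```python
import numpy as np
from scipy import ndimage

# Synthetic image: three bright disk-shaped "nuclei" on a dark background.
img = np.zeros((64, 64))
yy, xx = np.ogrid[:64, :64]
for r, c in [(15, 15), (15, 45), (45, 30)]:
    img[(yy - r) ** 2 + (xx - c) ** 2 <= 25] = 1.0   # radius-5 disks
img += np.random.default_rng(4).normal(0, 0.05, img.shape)

# Segmentation: global threshold, then label connected components.
mask = img > 0.5
labels, n_objects = ndimage.label(mask)

# Measurement: pixel area of each segmented object.
sizes = ndimage.sum(mask, labels, range(1, n_objects + 1))
```

Real micrographs usually need preprocessing (background subtraction, smoothing) and watershed splitting of touching nuclei, which is where tools such as ImageJ, MIPAV, and VisSeg earn their keep.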

  9. Quantitatively understanding cellular uptake of gold nanoparticles via radioactivity analysis

    PubMed Central

    Shao, Xia; Schnau, Paul; Qian, Wei; Wang, Xueding

    2015-01-01

    The development of multifunctional gold nanoparticles (AuNPs) has undergone an explosion in the last two decades. However, many questions regarding the detailed surface chemistry of AuNPs and how it affects their behavior in vivo and in vitro still need to be addressed before AuNPs can be widely adapted into clinical settings. In this work, radioactivity analysis was employed for quantitative evaluation of I-125 radiolabeled AuNP uptake by cancer cells. Facilitated with this new method, we have conducted an initial bioevaluation of surfactant-free AuNPs produced by femtosecond laser ablation. Cellular uptake of AuNPs as a function of the RGD density on the AuNP surface, as well as a function of time, has been quantified. The radioactivity analysis may shed light on the dynamic interactions of AuNPs with cancer cells, and help achieve optimized designs of AuNPs for future clinical applications. PMID:26505012

  10. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    PubMed

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatography was carried out with a 5% SE-30 on Chromosorb AW DMCS glass column (2 m x 3 mm i.d.) at a column temperature of 210 degrees C and a detector temperature of 230 degrees C. The internal standard is di-n-butyl sebacate. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min, respectively. This method has a recovery of 98.62%-100.77%, and the coefficients of variation for butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57%, respectively. All coefficients of linear correlation were higher than 0.999.
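
Internal-standard quantitation of this kind reduces to computing a relative response factor from a standard mixture and applying it to the sample's peak areas. The peak areas and concentrations below are invented for illustration:

```python
def response_factor(area_analyte, conc_analyte, area_is, conc_is):
    """Relative response factor determined from a standard mixture of
    known analyte and internal-standard (IS) concentrations."""
    return (area_analyte / area_is) * (conc_is / conc_analyte)

def quantify(area_analyte, area_is, conc_is, rf):
    """Analyte concentration in a sample spiked with the same IS amount."""
    return (area_analyte / area_is) * conc_is / rf

# Hypothetical peak areas for butachlor vs the di-n-butyl sebacate IS.
rf = response_factor(area_analyte=1520.0, conc_analyte=10.0,
                     area_is=760.0, conc_is=5.0)
c_sample = quantify(area_analyte=1216.0, area_is=760.0, conc_is=5.0, rf=rf)
```

Because both analyte and IS areas shift together with injection volume and detector drift, the area ratio is far more reproducible than raw areas, which is what the sub-1% coefficients of variation reflect.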

  11. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  12. The effect of pedigree complexity on quantitative trait linkage analysis.

    PubMed

    Dyer, T D; Blangero, J; Williams, J T; Göring, H H; Mahaney, M C

    2001-01-01

    Due to the computational difficulties of performing linkage analysis on large complex pedigrees, most investigators resort to simplifying such pedigrees by some ad hoc strategy. In this paper, we suggest an analytical method to compare the power of various pedigree simplification schemes by using the asymptotic distribution of the likelihood-ratio statistic. We applied the method to the large Hutterite pedigree. Our results indicate that the breaking and reduction of inbreeding loops can greatly diminish the power to localize quantitative trait loci. We also present an efficient Monte Carlo method for estimating identity-by-descent allele sharing in large complex pedigrees. This method is used to facilitate a linkage analysis of serum IgE levels in the Hutterites without simplifying the pedigree.

  13. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  14. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean context, trying to contribute to the understanding of the western Mediterranean tectonic setting. With our results, we suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is its extended

  15. The Analysis of Quantitative Traits for Simple Genetic Models from Parental, F1 and Backcross Data

    PubMed Central

    Elston, R. C.; Stewart, John

    1973-01-01

    The following models are considered for the genetic determination of quantitative traits: segregation at one locus, at two linked loci, at any number of equal and additive unlinked loci, and at one major locus and an indefinite number of equal and additive loci. In each case an appropriate likelihood is given for data on parental, F1 and backcross individuals, assuming that the environmental variation is normally distributed. Methods of testing and comparing the various models are presented, and methods are suggested for the simultaneous analysis of two or more traits. PMID:4711900

  16. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactorily, and the odors were qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di- and trisulfide are important in sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, para-cresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short-chain fatty acids such as butyric acid (13 mg/g), explained the strong rancid, manure and farmyard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts, in addition to three single stools from India and a pit latrine model system. PMID:23829328

  17. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast) and has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. Most of the retained components depend mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
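    The co-occurrence (Haralick) features named in the abstract, contrast, correlation, energy and homogeneity, can be computed from a normalized grey-level co-occurrence matrix. A minimal NumPy sketch on a toy two-level image (not the clinical pipeline) follows:

```python
import numpy as np

def glcm(image, dx, dy, levels):
    """Grey-level co-occurrence matrix for one separation vector (dx, dy)."""
    P = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[image[y, x], image[y2, x2]] += 1
    return P / P.sum()  # normalize to a joint probability table

def haralick_features(P):
    """Contrast, correlation, energy and homogeneity from a normalized GLCM."""
    n = P.shape[0]
    i, j = np.indices((n, n))
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    return {
        "contrast": ((i - j) ** 2 * P).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j),
        "energy": (P ** 2).sum(),
        "homogeneity": (P / (1.0 + np.abs(i - j))).sum(),
    }

# 2-level checkerboard: horizontally adjacent pixels always differ,
# so contrast is maximal (1.0) and correlation is -1.0.
img = np.indices((8, 8)).sum(axis=0) % 2
feats = haralick_features(glcm(img, dx=1, dy=0, levels=2))
```

    In the study, such features are computed for 52 separation vectors per delineated lobe and the resulting feature set is reduced by principal component analysis.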

  18. Quantitative analysis of CT scans of ceramic candle filters

    SciTech Connect

    Ferer, M.V.; Smith, D.H.

    1996-12-31

    Candle filters are being developed to remove coal ash and other fine particles (<15 μm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.

  19. Epistasis analysis for quantitative traits by functional regression model.

    PubMed

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and limited power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions: we take a genomic region as the basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model that collectively tests interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genomic regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of quantitative traits have the correct type 1 error rates and a much higher power to detect interactions than current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10^-10) in the ESP, and 11 were replicated in the CHARGE-S study.

  20. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in closed form. The KWT proves that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralized facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
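    The Kruskal-Wallis test used in the study above compares groups on pooled ranks. A minimal pure-Python sketch of the H statistic with hypothetical data follows (tie correction omitted for brevity):

```python
from itertools import chain

def rank(values):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda k: values[k])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_wallis_H(*groups):
    """H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1), on pooled ranks."""
    data = list(chain.from_iterable(groups))
    N = len(data)
    ranks = rank(data)
    total, start = 0.0, 0
    for g in groups:
        n = len(g)
        R = sum(ranks[start:start + n])
        total += R * R / n
        start += n
    return 12.0 / (N * (N + 1)) * total - 3 * (N + 1)

# Hypothetical data for three location clusters (not values from the study)
H = kruskal_wallis_H([1, 2, 3], [4, 5, 6], [7, 8, 9])
print(round(H, 1))  # 7.2
```

    H is then compared against a chi-squared distribution with (number of groups - 1) degrees of freedom to decide significance.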

  1. Quantitative wake analysis of a freely swimming fish using 3D synthetic aperture PIV

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra H.

    2015-07-01

    Synthetic aperture PIV (SAPIV) is used to quantitatively analyze the wake behind a giant danio (Danio aequipinnatus) swimming freely in a seeded quiescent tank. The experiment is designed with minimal constraints on animal behavior to ensure that natural swimming occurs. The fish exhibits forward swimming and turning behaviors at speeds between 0.9 and 1.5 body lengths/second. Results show clearly isolated and linked vortex rings in the wake structure, as well as the thrust jet coming off a visual hull reconstruction of the fish body. As a benchmark for quantitative analysis of volumetric PIV data, the vortex circulation and impulse are computed using methods consistent with those applied to planar PIV data. Volumetric momentum analysis frameworks are discussed for linked and asymmetric vortex structures, laying a foundation for further volumetric studies of swimming hydrodynamics with SAPIV. Additionally, a novel weighted refocusing method is presented as an improvement to SAPIV reconstruction.
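    Vortex circulation, one of the quantities computed from the PIV data above, is the line integral of velocity around a closed contour, Gamma = ∮ u · dl. The sketch below evaluates it numerically for an analytic Lamb-Oseen vortex standing in for a measured velocity field (parameters are illustrative, not values from the study):

```python
import math

def circulation(velocity, cx, cy, radius, n=2000):
    """Numerical line integral of velocity around a circle of given radius."""
    gamma, dtheta = 0.0, 2 * math.pi / n
    for k in range(n):
        th = (k + 0.5) * dtheta
        x, y = cx + radius * math.cos(th), cy + radius * math.sin(th)
        u, v = velocity(x, y)
        # tangent direction is (-sin th, cos th); arc element is radius*dtheta
        gamma += (-u * math.sin(th) + v * math.cos(th)) * radius * dtheta
    return gamma

def lamb_oseen(gamma0=1.0, rc=0.05):
    """Velocity field of a Lamb-Oseen vortex centered at the origin."""
    def u(x, y):
        r2 = x * x + y * y
        if r2 == 0.0:
            return 0.0, 0.0
        r = math.sqrt(r2)
        utheta = gamma0 / (2 * math.pi * r) * (1 - math.exp(-r2 / rc ** 2))
        return -utheta * y / r, utheta * x / r
    return u

g = circulation(lamb_oseen(), 0.0, 0.0, radius=0.2)
# analytic circulation at this radius: gamma0 * (1 - exp(-(0.2/0.05)^2))
```

    On gridded PIV data the same integral is usually evaluated by summing vorticity over the enclosed area (Stokes' theorem), which is less sensitive to noise on the contour.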

  2. Qualitative and quantitative analysis of systems and synthetic biology constructs using P systems.

    PubMed

    Konur, Savas; Gheorghe, Marian; Dragomir, Ciprian; Mierla, Laurentiu; Ipate, Florentin; Krasnogor, Natalio

    2015-01-16

    Computational models are perceived as an attractive alternative to mathematical models (e.g., ordinary differential equations). These models incorporate a set of methods for specifying, modeling, testing, and simulating biological systems. In addition, they can be analyzed using algorithmic techniques (e.g., formal verification). This paper shows how formal verification is utilized in systems and synthetic biology through qualitative vs quantitative analysis. Here, we choose two well-known case studies: quorum sensing in P. aeruginosa and a pulse generator. The paper reports verification analysis of the two systems carried out using several model checking tools integrated into the Infobiotics Workbench platform, where system models are based on stochastic P systems.

  3. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  4. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We revealed that calculation of both DF and Δn in the region of interest (fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46 ± 0.21 vs 0.09 ± 0.04 for DF; (4.7 ± 1.0)×10^-4 vs (2.5 ± 0.7)×10^-4 for Δn; p<0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaques, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30 ± 0.10 vs 0.70 ± 0.08; p<0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.

  5. [Development of rapid methods for quantitative analysis of proteolytic reactions].

    PubMed

    Beloivan, O A; Tsvetkova, M N; Bubriak, O A

    2002-01-01

    Approaches for the development of rapid methods for quantitative control of proteolytic reactions are discussed. Recently, these reactions have taken on special significance for many important problems of theoretical and practical medicine and biology, as well as for technological, pharmacological and ecological monitoring. Traditional methods can be improved both by the use of immobilized enzymes and substrates and by combining various classical biochemical and immunological approaches. The synthesis of substrates with specified properties allows new methods to be realized for studying proteinase activity and the kinetic characteristics of the corresponding reactions both in vitro and in vivo. The application of biosensor technology is a promising trend, since it saves analysis time and cost, allows the direct interaction between enzymes and their inhibitors and activators to be studied in real time, and permits quantitative measurements both in liquids and in air. Besides, biosensor techniques are well suited to computer data processing. PMID:12924013

  6. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  7. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from the teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed a clear change even after 1 min of gel treatment. From the fits, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  8. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
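    The mean-square-error criterion described above can be demonstrated with a minimal numerical sketch. Here a linear interpolant (standing in for the splines and convolution kernels analyzed in the paper) reconstructs a sampled sine, and the MSE between the reconstructed and original functions falls as the sampling becomes denser:

```python
import numpy as np

def reconstruction_mse(f, dx, x_eval):
    """Mean square error of a linear-interpolant reconstruction of f
    from samples spaced dx apart, evaluated on the grid x_eval."""
    xs = np.arange(x_eval.min() - dx, x_eval.max() + 2 * dx, dx)  # sample grid
    recon = np.interp(x_eval, xs, f(xs))       # piecewise-linear reconstruction
    return float(np.mean((recon - f(x_eval)) ** 2))

x = np.linspace(0.0, 2 * np.pi, 5001)
coarse = reconstruction_mse(np.sin, dx=0.5, x_eval=x)
fine = reconstruction_mse(np.sin, dx=0.1, x_eval=x)
# denser sampling reduces the reconstruction error for the same interpolant
```

    Swapping in a different interpolation kernel while holding the samples fixed isolates the interpolant's own contribution to the error, which is the comparison the paper formalizes in the frequency domain.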

  9. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis.

    SciTech Connect

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Moore, Ronald J.; Ramirez Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-05-03

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied as well as the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contribution to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2 week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies are demonstrated.

  10. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic particles (rare-earth-enriched divorced intermetallics retained at grain boundaries and predominantly at triple points) were found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  11. Preparation of Buffers. An Experiment for Quantitative Analysis Laboratory

    NASA Astrophysics Data System (ADS)

    Buckley, P. T.

    2001-10-01

    In our experience, students who have a solid grounding in the theoretical aspects of buffers, buffer preparation, and buffering capacity are often at a loss when required to actually prepare a buffer in a research setting. However, there are very few published laboratory experiments pertaining to buffers. This laboratory experiment for the undergraduate quantitative analysis lab gives students hands-on experience in the preparation of buffers. By preparing a buffer to a randomly chosen pH value and comparing the theoretical pH to the actual pH, students apply their theoretical understanding of the Henderson-Hasselbalch equation, activity coefficients, and the effect of adding acid or base to a buffer. This experiment gives students experience in buffer preparation for research situations and helps them in advanced courses such as biochemistry where a fundamental knowledge of buffer systems is essential.
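    The Henderson-Hasselbalch arithmetic the students apply can be sketched directly (activity corrections neglected; the acetate pKa is an assumed illustrative value):

```python
import math

def buffer_ph(pka, conc_base, conc_acid):
    """Henderson-Hasselbalch: pH = pKa + log10([A-]/[HA]).
    Activity coefficients are neglected in this sketch."""
    return pka + math.log10(conc_base / conc_acid)

def ratio_for_ph(pka, target_ph):
    """[A-]/[HA] ratio needed to prepare a buffer at a target pH."""
    return 10 ** (target_ph - pka)

# Acetate buffer (pKa ~ 4.76): equal concentrations give pH = pKa
print(buffer_ph(4.76, 0.10, 0.10))        # 4.76
# Ratio of conjugate base to acid needed to reach pH 5.0
print(round(ratio_for_ph(4.76, 5.0), 3))  # 1.738
```

    Comparing this theoretical pH with the measured pH of the prepared buffer exposes exactly the activity-coefficient effects the experiment asks students to reason about.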

  12. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  13. Quantitative microstructure analysis of polymer-modified mortars.

    PubMed

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging related topics are discussed. As a form of case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages ranging from the early fresh mortar until the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  14. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten-thousand-hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shapes, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  15. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    PubMed

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  16. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly lower color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  17. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly lower color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877

  18. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  19. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma X-rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  20. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
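
    The core computation behind latent-semantic approaches such as the INN is a similarity score between the context vectors of terms. A minimal sketch with toy co-occurrence counts (the four context features and the counts are hypothetical; real INN vectors are derived from a large published-literature corpus):

    ```python
    import math

    def cosine(u, v):
        """Cosine similarity between two term-context vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    # Toy counts of each term over four hypothetical context features
    contexts = {
        "transition": [8, 6, 1, 0],
        "adaptation": [7, 5, 2, 0],
        "coping":     [1, 0, 9, 3],
    }
    sim_adapt = cosine(contexts["transition"], contexts["adaptation"])
    sim_cope = cosine(contexts["transition"], contexts["coping"])
    # "adaptation" appears in contexts much more similar to "transition"
    # than "coping" does, mirroring the study's synonym ranking
    assert sim_adapt > sim_cope
    ```

    The similarity matrix over all candidate terms is what lets the method rank potential synonyms quantitatively rather than by qualitative review alone.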

  1. Spectroscopic analysis and DFT calculations of a food additive Carmoisine

    NASA Astrophysics Data System (ADS)

    Snehalatha, M.; Ravikumar, C.; Hubert Joe, I.; Sekar, N.; Jayakumar, V. S.

    2009-04-01

    FT-IR and Raman techniques were employed for the vibrational characterization of the food additive Carmoisine (E122). The equilibrium geometry, various bonding features, and harmonic vibrational wavenumbers have been investigated with the help of density functional theory (DFT) calculations. A good correlation was found between the computed and experimental wavenumbers. Azo stretching wavenumbers have been lowered due to conjugation and π-electron delocalization. Electronic absorption spectra predicted by TD-DFT calculations have been analysed and compared with the UV-vis spectrum. The first hyperpolarizability of the molecule is calculated. Intramolecular charge transfer (ICT) responsible for the optical nonlinearity of the dye molecule has been discussed theoretically and experimentally. The stability of the molecule arising from hyperconjugative interactions, charge delocalization, and improper, blue-shifted C-H⋯O hydrogen bonds has been analysed using natural bond orbital (NBO) analysis.

  2. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant, but not for very weak interactions, which are typically in very fast exchange. This contrasts with the NMR methods, such as chemical-shift titration, that are used for such weak interactions.
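
    For the fast-exchange case, KD is obtained by fitting the titration curve to a 1:1 binding model in which the observed chemical-shift change is the maximal change scaled by the bound fraction of the protein. A minimal sketch with synthetic data (the concentrations, true KD, and brute-force grid fit are illustrative, not values or software from the paper; real analyses use nonlinear least-squares):

    ```python
    import math

    def delta_obs(P, L, Kd, dmax):
        """Observed chemical-shift change (ppm) for 1:1 binding in fast
        exchange: dmax scaled by the bound fraction of the protein,
        using the exact quadratic solution for the complex concentration."""
        b = P + L + Kd
        PL = (b - math.sqrt(b * b - 4.0 * P * L)) / 2.0
        return dmax * PL / P

    # Synthetic titration: 50 uM protein, true Kd = 20 uM, dmax = 0.30 ppm
    P = 50.0
    ligand = [10.0, 25.0, 50.0, 100.0, 200.0, 400.0]
    data = [delta_obs(P, L, 20.0, 0.30) for L in ligand]

    # Brute-force least-squares search over (Kd, dmax)
    best_kd, best_dmax, best_sse = None, None, float("inf")
    for k in range(10, 500):          # Kd grid: 1.0 .. 49.9 uM
        for d in range(200, 400):     # dmax grid: 0.200 .. 0.399 ppm
            Kd, dmax = k / 10.0, d / 1000.0
            sse = sum((delta_obs(P, L, Kd, dmax) - y) ** 2
                      for L, y in zip(ligand, data))
            if sse < best_sse:
                best_kd, best_dmax, best_sse = Kd, dmax, sse
    print(best_kd, best_dmax)
    ```

    The fit recovers the Kd and dmax used to generate the synthetic curve; with real titration data the residuals would reflect measurement noise.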

  3. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
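
    The link between the number of Ca2+ binding sites and the steepness of release can be illustrated with a simple equilibrium approximation (this is a didactic sketch, not the full kinetic/allosteric models fitted in the study): if release requires n independently occupied sites, it scales with the n-th power of single-site occupancy, so a two-site sensor responds far more shallowly to Ca2+ changes than a four-site one.

    ```python
    def fusion_rate(ca, n, kd=1.0, rmax=1.0):
        """Equilibrium approximation: release rate when n independent
        Ca2+ sites (each with dissociation constant kd) must be occupied."""
        p = ca / (ca + kd)  # single-site occupancy probability
        return rmax * p ** n

    # Well below kd, doubling [Ca2+] multiplies release by ~2**n,
    # so fewer sites means a shallower (more linear) dose dependence.
    low, high = 0.01, 0.02
    gain2 = fusion_rate(high, 2) / fusion_rate(low, 2)
    gain4 = fusion_rate(high, 4) / fusion_rate(low, 4)
    print(gain2, gain4)  # ~4-fold for two sites vs ~15-fold for four
    ```

    This shallow power-law dependence is one intuition for how a two-site sensor could help linearize the photoreceptor synapse.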

  4. [Analysis of constituents in urushi wax, a natural food additive].

    PubMed

    Jin, Zhe-Long; Tada, Atsuko; Sugimoto, Naoki; Sato, Kyoko; Masuda, Aino; Yamagata, Kazuo; Yamazaki, Takeshi; Tanamoto, Kenichi

    2006-08-01

    Urushi wax is a natural gum base used as a food additive. In order to evaluate the quality of urushi wax as a food additive and to obtain information useful for setting official standards, we investigated the constituents and their concentrations in urushi wax, using the same sample as scheduled for toxicity testing. After methanolysis of urushi wax, the composition of fatty acids was analyzed by GC/MS. The results indicated that the main fatty acids were palmitic acid, oleic acid and stearic acid. LC/MS analysis of urushi wax provided molecular-related ions of the main constituents. The main constituents were identified as triglycerides, namely glyceryl tripalmitate (30.7%), glyceryl dipalmitate monooleate (21.2%), glyceryl dioleate monopalmitate (2.1%), glyceryl monooleate monopalmitate monostearate (2.6%), glyceryl dipalmitate monostearate (5.6%), glyceryl distearate monopalmitate (1.4%). Glyceryl dipalmitate monooleate isomers differing in the binding sites of each constituent fatty acid could be separately determined by LC/MS/MS. PMID:16984037

  5. Decreasing Cloudiness Over China: An Updated Analysis Examining Additional Variables

    SciTech Connect

    Kaiser, D.P.

    2000-01-14

    As preparation of the IPCC's Third Assessment Report takes place, one of the many observed climate variables of key interest is cloud amount. For several nations of the world, there exist records of surface-observed cloud amount dating back to the middle of the 20th Century or earlier, offering valuable information on variations and trends. Studies using such databases include Sun and Groisman (1999) and Kaiser and Razuvaev (1995) for the former Soviet Union, Angell et al. (1984) for the United States, Henderson-Sellers (1986) for Europe, Jones and Henderson-Sellers (1992) for Australia, and Kaiser (1998) for China. The findings of Kaiser (1998) differ from the other studies in that much of China appears to have experienced decreased cloudiness over recent decades (1954-1994), whereas the other land regions for the most part show evidence of increasing cloud cover. This paper expands on Kaiser (1998) by analyzing trends in additional meteorological variables for China [station pressure (p), water vapor pressure (e), and relative humidity (rh)] and extending the total cloud amount (N) analysis an additional two years (through 1996).

  6. Quantitative proteome analysis in cardiovascular physiology and pathology. I. Data processing.

    PubMed

    Grussenmeyer, Thomas; Meili-Butz, Silvia; Dieterle, Thomas; Traunecker, Emmanuel; Carrel, Thierry P; Lefkovits, Ivan

    2008-12-01

    Methodological evaluation of the proteomic analysis of cardiovascular-tissue material has been performed with a special emphasis on establishing procedures that allow reliable quantitative analysis of silver-stained readouts. Reliability, reproducibility, robustness and linearity were addressed and clarified. In addition, several types of normalization procedures were evaluated and new approaches are proposed. It has been found that the silver-stained readout offers a convenient approach for quantitation if a linear range for gel loading is defined. In addition, a broad range of a 10-fold input (loading 20-200 microg per gel) fulfills the linearity criteria, although at the lowest input (20 microg) a portion of protein species will remain undetected. The method is reliable and reproducible within a range of 65-200 microg input. The normalization procedure using the sum of all spot intensities from a silver-stained 2D pattern has been shown to be less reliable than other approaches, namely normalization through the median or through the interquartile range. A special refinement of the normalization, which virtually segments the pattern and calculates a normalization factor for each stratum, provides highly satisfactory results. The presented results not only provide evidence for the usefulness of silver-stained gels for quantitative evaluation, but they are directly applicable to the research endeavor of monitoring alterations in cardiovascular pathophysiology.
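
    The median-based normalization the authors favor can be sketched as follows (the spot intensities are hypothetical; an interquartile-range variant would rescale by the IQR instead of dividing by the median):

    ```python
    import statistics

    def normalize_spots(gels):
        """Divide each gel's spot intensities by that gel's median
        intensity, so gels with different overall staining or loading
        become directly comparable."""
        out = []
        for spots in gels:
            med = statistics.median(spots)
            out.append([s / med for s in spots])
        return out

    # Two replicate gels; the second was loaded/stained twice as heavily,
    # so raw intensities differ by a constant factor.
    gel_a = [10.0, 20.0, 30.0, 40.0, 100.0]
    gel_b = [20.0, 40.0, 60.0, 80.0, 200.0]
    norm_a, norm_b = normalize_spots([gel_a, gel_b])
    assert norm_a == norm_b  # after normalization the replicates agree
    ```

    Unlike the sum of all spot intensities, the median is insensitive to a few saturated or very large spots, which is one reason it proved more reliable.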

  7. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
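
    The parameter-measurement step can be illustrated with a minimal sketch: given one myocardial position's time-intensity profile over the first pass of the contrast agent, compute peak enhancement, time to peak, and maximum upslope (the signal values are hypothetical, and this is a simplified stand-in for the paper's parameter set):

    ```python
    def perfusion_params(times, intensities):
        """Simple semi-quantitative perfusion parameters from one
        time-intensity profile: peak enhancement above baseline,
        time to peak, and maximum upslope. Baseline is taken at t=0."""
        baseline = intensities[0]
        enh = [i - baseline for i in intensities]
        peak = max(enh)
        t_peak = times[enh.index(peak)]
        upslope = max((enh[k + 1] - enh[k]) / (times[k + 1] - times[k])
                      for k in range(len(times) - 1))
        return peak, t_peak, upslope

    t = [0, 1, 2, 3, 4, 5, 6]            # seconds (hypothetical)
    signal = [100, 102, 120, 150, 165, 160, 150]
    peak, t_peak, upslope = perfusion_params(t, signal)
    print(peak, t_peak, upslope)
    ```

    Computed per pixel, such parameters are what the method visualizes as color overlays on the original images.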

  8. Multiple Trait Analysis of Genetic Mapping for Quantitative Trait Loci

    PubMed Central

    Jiang, C.; Zeng, Z. B.

    1995-01-01

    We present in this paper models and statistical methods for performing multiple trait analysis on mapping quantitative trait loci (QTL) based on the composite interval mapping method. By taking into account the correlated structure of multiple traits, this joint analysis has several advantages, compared with separate analyses, for mapping QTL, including the expected improvement on the statistical power of the test for QTL and on the precision of parameter estimation. Also this joint analysis provides formal procedures to test a number of biologically interesting hypotheses concerning the nature of genetic correlations between different traits. Among the testing procedures considered are those for joint mapping, pleiotropy, QTL by environment interaction, and pleiotropy vs. close linkage. The test of pleiotropy (one pleiotropic QTL at a genome position) vs. close linkage (multiple nearby nonpleiotropic QTL) can have important implications for our understanding of the nature of genetic correlations between different traits in certain regions of a genome and also for practical applications in animal and plant breeding because one of the major goals in breeding is to break unfavorable linkage. Results of extensive simulation studies are presented to illustrate various properties of the analyses. PMID:7672582

  9. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    PubMed

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the size of the AM models, i.e., the printed models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced by choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms.

  10. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
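
    The Methylation Index and binary cut-point are straightforward to compute. A sketch with hypothetical per-CpG percent-methylation values (the numbers below are illustrative, not data from the study):

    ```python
    def methylation_index(cpg_percents):
        """Mean percent methylation across the CpG sites assayed for one gene."""
        return sum(cpg_percents) / len(cpg_percents)

    def is_methylated(cpg_percents, cutpoint=15.0):
        """Binary call at the >15% methylation cut-point used in the study."""
        return methylation_index(cpg_percents) > cutpoint

    tumor = [42.0, 35.5, 50.1, 38.2]   # hypothetical tumor specimen, one gene
    normal = [3.1, 2.4, 5.0, 4.2]      # hypothetical normal cytology specimen
    assert is_methylated(tumor)
    assert not is_methylated(normal)
    ```

    Per-gene binary calls like these are the inputs to the sensitivity/specificity and ROC analyses reported above.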

  11. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  12. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc⁻³ scaling). These challenges were (1) limited knowledge of attenuation, which we proved was continuously evolving, (2
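
    The magnitude and stress-drop estimates quoted above rest on two standard source relations: the moment-magnitude formula and the Brune stress-drop model. A minimal sketch (the ~0.04 N·m moment and 2 mm source radius are illustrative values chosen only to be consistent with the reported Mw ≈ -7 and 0.1-10 MPa ranges, not numbers from the thesis):

    ```python
    import math

    def moment_magnitude(m0):
        """Moment magnitude Mw from seismic moment M0 in N*m
        (standard relation, Mw = (2/3)(log10 M0 - 9.1))."""
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    def brune_stress_drop(m0, radius):
        """Brune-model static stress drop (Pa) from seismic moment
        M0 (N*m) and circular source radius (m)."""
        return 7.0 * m0 / (16.0 * radius ** 3)

    m0 = 10 ** (-1.4)           # ~0.04 N*m, illustrative AE-scale moment
    mw = moment_magnitude(m0)   # gives Mw = -7.0
    dsigma_mpa = brune_stress_drop(m0, 2e-3) / 1e6  # 2 mm source radius
    print(mw, dsigma_mpa)
    ```

    The fc⁻³ dependence of stress drop on corner frequency follows from the same Brune model once the source radius is expressed in terms of fc and the shear-wave speed.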

  13. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  14. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides for such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.

  15. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  16. [Quantitative Analysis of Mn in Soil Samples Using LIBS].

    PubMed

    Zhang, Bao-hua; Jiang, Yong-cheng; Zhang, Xian-yan; Cui, Zhi-feng

    2015-06-01

    The trace element manganese (Mn) in agricultural farm soil (Anhui Huaiyuan Nongkang) was quantitatively analyzed by laser-induced breakdown spectroscopy (LIBS). The line at 403.1 nm was selected as the analysis line of Mn. The matrix element Fe in soil was chosen as the internal calibration element, with 407.2 nm as its analysis line. Ten soil samples were used to construct calibration curves with the traditional method and the internal standard method, and four soil samples were selected as test samples. The experimental results showed that the fitting correlation coefficient (r) is 0.954 when using the traditional method, the maximum relative error of the measured samples is 5.72%, and the detection limit of Mn in soil is 93 mg/kg. When the internal standard method is used to construct the calibration curve, the fitting correlation coefficient (r) is 0.983, the relative error of the measured samples is reduced to 4.1%, and the detection limit of Mn in soil is 71 mg/kg. The results indicate that the LIBS technique can be used to detect the trace element Mn in soil and that, to a certain extent, the internal standard method can improve the accuracy of measurement.
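
The internal-standard workflow described above can be sketched in a few lines. The intensities and concentrations below are hypothetical (the paper's spectra are not reproduced here); the point is the procedure: normalize the Mn line intensity by the Fe line, fit a linear calibration, and estimate a 3-sigma detection limit from the residual scatter.

```python
import numpy as np

# Hypothetical calibration data: Mn concentration (mg/kg) in ten soil
# standards, with background-corrected intensities of the Mn line at
# 403.1 nm and the Fe internal-standard line at 407.2 nm.
conc = np.array([100, 200, 400, 600, 800, 1000, 1200, 1400, 1600, 1800], float)
i_mn = np.array([52, 98, 205, 310, 395, 510, 600, 688, 812, 895], float)
i_fe = np.array([980, 1010, 995, 1005, 990, 1000, 1015, 985, 1002, 998], float)

ratio = i_mn / i_fe                       # internal-standard normalization
slope, intercept = np.polyfit(conc, ratio, 1)
fit = slope * conc + intercept
r = np.corrcoef(ratio, fit)[0, 1]         # fitting correlation coefficient

# 3-sigma detection limit estimated from the calibration residuals
sigma = np.std(ratio - fit, ddof=2)
lod = 3 * sigma / slope
```

With real spectra the same ratio-based fit is what yields the improved r = 0.983 and the 71 mg/kg detection limit reported above; the synthetic numbers here only illustrate the computation.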

  17. Quantitative analysis of polyethylene blends by Fourier transform infrared spectroscopy.

    PubMed

    Cran, Marlene J; Bigger, Stephen W

    2003-08-01

    The quantitative analysis of binary polyethylene (PE) blends by Fourier transform infrared (FT-IR) spectroscopy has been achieved based on the ratio of two absorbance peaks in an FT-IR spectrum. The frequencies for the absorbance ratio are selected based on structural entities of the PE components in the blend. A linear relationship between the absorbance ratio and the blend composition was found to exist if one of the absorbance peaks is distinct to one of the components and the other peak is common to both components. It was also found that any peak resulting from short-chain branching in copolymers (such as linear low-density polyethylene (LLDPE) or metallocene-catalyzed LLDPE (mLLDPE)) is suitable for use as the peak designated as distinct to that component. To optimize the linearity of the calibration, however, the selection of the second, common peak is most important and depends on the blend system studied. Indeed, under certain circumstances peaks that are not spectrally distinct can be used successfully to apply the method. The method exhibits potential for the routine analysis of PE blends that have been calibrated prior to its application.
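
The absorbance-ratio calibration can be illustrated with a short sketch. The absorbance values below are hypothetical, not taken from the paper; they only demonstrate the linear relationship between the peak ratio and blend composition, and how the fit is inverted to estimate an unknown blend.

```python
import numpy as np

# Hypothetical calibration blends: mass fraction of LLDPE in an
# LDPE/LLDPE blend, with absorbances of a branch-specific peak
# (distinct to the LLDPE component) and a peak common to both.
w_lldpe  = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
a_branch = np.array([0.02, 0.11, 0.21, 0.30, 0.41, 0.50])
a_common = np.array([0.98, 1.01, 0.99, 1.00, 1.02, 1.00])

ratio = a_branch / a_common
slope, intercept = np.polyfit(w_lldpe, ratio, 1)

def composition(a_distinct, a_com):
    """Invert the linear calibration to estimate blend composition."""
    return (a_distinct / a_com - intercept) / slope

w_est = composition(0.26, 1.00)   # an "unknown" blend near 50 % LLDPE
```

Because the common peak appears in both components, dividing by it cancels path-length and sample-thickness effects, which is what makes the ratio (rather than a single absorbance) linear in composition.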

  18. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for midpalatal suture maturation evaluation. Methods The study included 131 subjects aged over 18 years of age (range 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the resulting variable that splits maturation stages into ABC and D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
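
The two statistics used above, Spearman's correlation between fractal dimension and maturation stage and an ROC-derived cut-off, can be sketched with toy numbers. The data below are hypothetical (the study's 131 CBCT measurements are not reproduced); the dichotomy mimics the ABC-versus-D/E split, with lower fractal dimension (FD) indicating a more fused suture.

```python
import numpy as np

def spearman(x, y):
    # Spearman's rho as the Pearson correlation of the ranks
    # (ties are broken by position; a full implementation would use midranks)
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical data: 1 = later stages (D/E), 0 = earlier stages (A-C);
# more mature sutures tend to have lower fractal dimension.
fd    = np.array([1.08, 1.06, 1.05, 1.04, 1.03, 1.01, 1.00, 0.99, 0.98, 0.96])
fused = np.array([0,    0,    0,    0,    1,    0,    1,    1,    1,    1])

rho = spearman(fd, fused)          # expect a negative correlation

# Optimal cut-off by Youden's J (TPR - FPR) over candidate thresholds
best_j, cutoff = -1.0, None
for t in fd:
    pred = fd <= t                 # classify "fused" when FD is low
    tpr = np.mean(pred[fused == 1])
    fpr = np.mean(pred[fused == 0])
    if tpr - fpr > best_j:
        best_j, cutoff = tpr - fpr, t
```

The study's optimal cut-off of 1.0235 was obtained the same way in principle, by maximizing discrimination over the ROC curve of the real data.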

  20. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  1. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  2. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 as scenarios that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 as having the lowest priority for further analysis.

  3. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858
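
The standard-curve arithmetic behind this kind of quantification can be sketched briefly. The Ct values below are hypothetical, and for brevity a single calibration curve is applied to both targets, whereas a real assay (including this one) calibrates each target separately against the plasmid calibrant.

```python
import numpy as np

# Hypothetical plasmid-calibrant dilution series: copy numbers and the
# Ct values measured for the P35S target. Ct is linear in log10(copies);
# a slope near -3.32 corresponds to 100 % amplification efficiency.
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
ct     = np.array([33.1, 29.8, 26.4, 23.1, 19.8])

slope, intercept = np.polyfit(np.log10(copies), ct, 1)

def to_copies(ct_sample):
    """Convert a measured Ct back to copy number via the standard curve."""
    return 10 ** ((ct_sample - intercept) / slope)

# GM content expressed as GM-specific copies relative to the
# endogenous maize reference gene (simplified: same curve for both)
gm  = to_copies(27.5)      # P35S channel of an unknown sample
ref = to_copies(21.0)      # endogenous reference channel
gm_percent = 100 * gm / ref
```

In the duplex format both Ct values come from a single reaction, which is where the halving of cost and time cited above comes from; the quantification step itself is unchanged.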

  5. Quantitative analysis of protease recognition by inhibitors in plasma using microscale thermophoresis

    PubMed Central

    Dau, T.; Edeleva, E. V.; Seidel, S. A. I.; Stockley, R. A.; Braun, D.; Jenne, D. E.

    2016-01-01

    High-abundance proteins like the protease inhibitors of plasma display a multitude of interactions in natural environments. Quantitative analysis of such interactions in vivo is essential for studying disease but has not been forthcoming, as most methods cannot be directly applied in a complex biological environment. Here, we report a quantitative microscale thermophoresis assay capable of deciphering functional deviations from in vitro inhibition data by combining concentration and affinity measurements. We obtained stable measurement signals for the substrate-like interaction of the disease-relevant inhibitor α-1-antitrypsin (AAT) Z-variant with catalytically inactive elastase. The signal differentiates between healthy and sick AAT-deficient individuals, suggesting that the affinity between AAT and elastase is strongly modulated by so-far overlooked additional binding partners in the plasma. PMID:27739542

  6. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
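
The local velocity vectors described above can be estimated by matching small patches between consecutive frames. The sketch below is an illustrative block-matching implementation on a synthetic "rib" band, not the authors' algorithm; it shows how a per-patch displacement (one arrow of the vector map) is recovered.

```python
import numpy as np

def patch_velocity(prev, curr, y, x, size=16, search=5):
    """Estimate the local displacement (dy, dx) of a patch between two
    frames by exhaustive block matching (sum of squared differences)."""
    ref = prev[y:y + size, x:x + size]
    best, disp = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = curr[y + dy:y + dy + size, x + dx:x + dx + size]
            ssd = np.sum((cand - ref) ** 2)
            if ssd < best:
                best, disp = ssd, (dy, dx)
    return disp

# Synthetic test frames: a bright horizontal band (a stand-in for a rib
# on a bone-suppressed image) that moves down by 2 pixels per frame.
rng = np.random.default_rng(0)
frame0 = rng.normal(0, 0.01, (64, 64))
frame0[30:34, :] += 1.0
frame1 = np.roll(frame0, 2, axis=0)

dy, dx = patch_velocity(frame0, frame1, y=24, x=24)
```

Repeating this over a grid of patches yields the velocity map; on bone-only images the match is driven by rib edges rather than overlapping vessels, which is why the vectors quantify rib motion specifically.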

  7. Quantitative genetic analysis of salicylic acid perception in Arabidopsis.

    PubMed

    Dobón, Albor; Canet, Juan Vicente; Perales, Lorena; Tornero, Pablo

    2011-10-01

    Salicylic acid (SA) is a phytohormone required for a full resistance against some pathogens in Arabidopsis, and NPR1 (Non-Expressor of Pathogenesis Related Genes 1) is the only gene with a strong effect on resistance induced by SA which has been described. There can be additional components of SA perception that escape the traditional approach of mutagenesis. An alternative to that approach is searching in the natural variation of Arabidopsis. Different methods of analyzing the variation between ecotypes have been tried and it has been found that measuring the growth of a virulent isolate of Pseudomonas syringae after the exogenous application of SA is the most effective one. Two ecotypes, Edi-0 and Stw-0, have been crossed, and their F2 has been studied. There are two significant quantitative trait loci (QTLs) in this population, and there is one QTL in each one of the existing mapping populations Col-4 × Laer-0 and Laer-0 × No-0. They have different characteristics: while one QTL is only detectable at low concentrations of SA, the other acts after the point of crosstalk with methyl jasmonate signalling. Three of the QTLs have candidates described in SA perception as NPR1, its interactors, and a calmodulin binding protein.

  8. Nonparametric survival analysis using Bayesian Additive Regression Trees (BART).

    PubMed

    Sparapani, Rodney A; Logan, Brent R; McCulloch, Robert E; Laud, Purushottam W

    2016-07-20

    Bayesian additive regression trees (BART) provide a framework for flexible nonparametric modeling of relationships of covariates to outcomes. Recently, BART models have been shown to provide excellent predictive performance for both continuous and binary outcomes, often exceeding that of competing methods. Software is also readily available for such outcomes. In this article, we introduce modeling that extends the usefulness of BART in medical applications by addressing needs arising in survival analysis. Simulation studies of one-sample and two-sample scenarios, in comparison with long-standing traditional methods, establish face validity of the new approach. We then demonstrate the model's ability to accommodate data from complex regression models with a simulation study of a nonproportional hazards scenario with crossing survival functions, and survival function estimation in a scenario where hazards are multiplicatively modified by a highly nonlinear function of the covariates. Using data from a recently published study of patients undergoing hematopoietic stem cell transplantation, we illustrate the use and some advantages of the proposed method in medical investigations. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26854022

  9. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector worldwide, using keyword-based patent searching. An overview of patents related to nanotechnology in the automobile industry is provided. The work begins with a worldwide patent search to find patents on nanotechnology in the automobile industry and classifies them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, followed by an analysis of the patents within the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts, and full texts separately. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements and statutory bars to patenting such inventions. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting the related inventions. For example, the US patent with number US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the automobile-parts classification; after studying it, it was deduced that the patent solves the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix was created.

  10. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more
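
The "key terms and connections among them" step of such an analysis reduces, at its core, to term and co-occurrence counting over the tweet corpus. The toy corpus and stop-word list below are hypothetical stand-ins for the 669,136 collected tweets; real pipelines add tokenization, stemming, and network construction on top of exactly these counts.

```python
from collections import Counter
from itertools import combinations

# Toy corpus standing in for the collected tweets (hypothetical text)
tweets = [
    "measles outbreak shows why vaccination matters",
    "vaccination exemption rates are rising in oregon",
    "cdc reports new measles cases after outbreak",
    "school vaccination exemption debated in vermont",
]

stop = {"why", "are", "in", "new", "after", "shows", "matters"}
term_counts = Counter()
pair_counts = Counter()
for t in tweets:
    terms = sorted({w for w in t.split() if w not in stop})
    term_counts.update(terms)
    pair_counts.update(combinations(terms, 2))   # co-occurring term pairs

key_terms = [w for w, _ in term_counts.most_common(3)]
```

The pair counts define weighted edges of a term network, which is what exposes the structure of the narrative (for example, the antivaccination cluster and its terms of reference).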

  11. Salicylate Detection by Complexation with Iron(III) and Optical Absorbance Spectroscopy: An Undergraduate Quantitative Analysis Experiment

    ERIC Educational Resources Information Center

    Mitchell-Koch, Jeremy T.; Reid, Kendra R.; Meyerhoff, Mark E.

    2008-01-01

    An experiment for the undergraduate quantitative analysis laboratory involving applications of visible spectrophotometry is described. Salicylate, a component found in several medications, as well as the active by-product of aspirin decomposition, is quantified. The addition of excess iron(III) to a solution of salicylate generates a deeply…

  12. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  13. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  14. Comparison of multivariate calibration methods for quantitative spectral analysis

    SciTech Connect

    Thomas, E.V.; Haaland, D.M. )

    1990-05-15

    The quantitative prediction abilities of four multivariate calibration methods for spectral analyses are compared by using extensive Monte Carlo simulations. The calibration methods compared include inverse least-squares (ILS), classical least-squares (CLS), partial least-squares (PLS), and principal component regression (PCR) methods. ILS is a frequency-limited method while the latter three are capable of full-spectrum calibration. The simulations were performed assuming Beer's law holds and that spectral measurement errors and concentration errors associated with the reference method are normally distributed. Eight different factors that could affect the relative performance of the calibration methods were varied in a two-level, eight-factor experimental design in order to evaluate their effect on the prediction abilities of the four methods. It is found that each of the three full-spectrum methods has its range of superior performance. The frequency-limited ILS method was never the best method, although in the presence of relatively large concentration errors it sometimes yields comparable analysis precision to the full-spectrum methods for the major spectral component. The importance of each factor in the absolute and relative performances of the four methods is compared.
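
Of the four methods, classical least-squares (CLS) is the most compact to illustrate under the Beer's-law assumption stated above. The simulation below (two components, Gaussian pure-component spectra, small noise) is an illustrative sketch, not the paper's eight-factor Monte Carlo design: calibration estimates the pure-component spectra, and prediction solves for concentrations of an unknown spectrum.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated Beer's-law system: two components whose pure-component
# spectra (rows of K) are Gaussian bands over 50 frequencies.
freq = np.linspace(0, 1, 50)
K = np.vstack([np.exp(-((freq - 0.3) / 0.05) ** 2),
               np.exp(-((freq - 0.7) / 0.08) ** 2)])

# Calibration set: known concentrations, spectra A = C @ K + noise
C_cal = rng.uniform(0.1, 1.0, (20, 2))
A_cal = C_cal @ K + rng.normal(0, 1e-3, (20, 50))

# CLS step 1: estimate the pure-component spectra from calibration data
K_hat = np.linalg.lstsq(C_cal, A_cal, rcond=None)[0]

# CLS step 2: predict concentrations of an "unknown" spectrum
c_true = np.array([0.4, 0.8])
a_unknown = c_true @ K + rng.normal(0, 1e-3, 50)
c_pred = np.linalg.lstsq(K_hat.T, a_unknown, rcond=None)[0]
```

Because CLS fits the full spectrum, the prediction step averages noise over all 50 channels, which is the basic reason the paper's full-spectrum methods (CLS, PLS, PCR) can out-perform the frequency-limited ILS approach.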

  15. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit-motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  16. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  17. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. The assay could detect 2 pg/mL and quantitatively measure IL-6 over the range 0-32 pg/mL. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods showed a significant relationship (R2 = 0.9964). The assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It thus enables plasma IL-6 to be determined accurately, sensitively, and rapidly with low sample and reagent consumption, and will be applicable to clinical diagnosis.
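
    The standard curve underlying such an assay is typically fitted by least squares; a minimal sketch with invented absorbance readings (only the 0-32 pg/mL range and the R2 of 0.9964 come from the abstract):

```python
import numpy as np

# Hypothetical standard-curve readings (concentration in pg/mL vs. absorbance):
conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 32.0])
signal = np.array([0.02, 0.11, 0.20, 0.41, 0.79, 1.58])

# Ordinary least-squares line and its coefficient of determination
slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)

def il6_concentration(absorbance):
    """Back-calculate a sample concentration from the fitted line."""
    return (absorbance - intercept) / slope
```

With readings this close to linear, `r2` exceeds 0.99, matching the quality of fit reported for the real assay.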

  18. Quantitative produced water analysis using mobile 1H NMR

    NASA Astrophysics Data System (ADS)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
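
    The self-calibration amounts to ratioing peak areas against the chloroform reference of known effective concentration; a sketch with hypothetical integrals (the function name and numbers are illustrative only):

```python
def oil_concentration_ppm(area_oil, area_ref, ref_ppm_equivalent):
    """Internal-standard quantification: the chloroform 1H peak of
    known effective concentration serves as the reference, so the oil
    content follows from the peak-area ratio (all values hypothetical)."""
    return ref_ppm_equivalent * area_oil / area_ref

# Oil peak integrating to 1.5x a reference worth 10 ppm-equivalent:
print(oil_concentration_ppm(15.0, 10.0, 10.0))  # -> 15.0
```

Because the reference is carried in the solvent itself, no external calibration run is needed, which is the practical advantage claimed for the chloroform/tetrachloroethylene system.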

  19. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from the autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control. The concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  20. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may enable noninvasive detection of cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488
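
    Sensitivity and specificity of such a classifier follow directly from the confusion-matrix counts; the counts below are invented to yield figures close to those reported:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts of classified cancer/normal spectra:
sens, spec = sensitivity_specificity(tp=93, fn=7, tn=97, fp=3)
print(sens, spec)  # -> 0.93 0.97
```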

  1. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants such as PAHs (polycyclic aromatic hydrocarbons) at low concentration in aqueous medium, Surface-Enhanced Raman Scattering (SERS) offers an alternative that overcomes the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique owing to the excitation of the surface plasmon modes of the nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and the identification of the target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and to quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was carried out. Such SERS-active substrates tend towards an application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476

  2. Active contour approach for accurate quantitative airway analysis

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Slabaugh, Greg G.; Novak, Carol L.; Naidich, David P.; Lerallut, Jean-Francois

    2008-03-01

    Chronic airway disease causes structural changes in the lungs, including peribronchial thickening and airway dilatation. Multi-detector computed tomography (CT) yields detailed near-isotropic images of the lungs, and thus the potential to obtain quantitative measurements of lumen diameter and airway wall thickness. Such measurements would allow standardized assessment and would help physicians diagnose and locate airway abnormalities, adapt treatment, and monitor progress over time. However, due to the sheer number of airways per patient, systematic analysis is infeasible in routine clinical practice without automation. We have developed an automated and real-time method based on active contours to estimate both airway lumen and wall dimensions; the method does not require manual contour initialization but only a starting point on the targeted airway. While the lumen contour segmentation is purely region-based, the estimation of the outer diameter considers the inner wall segmentation as well as local intensity variation, in order to anticipate the presence of nearby arteries and exclude them. These properties make the method more robust than the Full-Width Half Maximum (FWHM) approach. Results are demonstrated on a phantom dataset with known dimensions and on a human dataset where the automated measurements are compared against two human operators. The average error on the phantom measurements was 0.10 mm and 0.14 mm for inner and outer diameters, showing sub-voxel accuracy. Similarly, the mean variation from the average manual measurement was 0.14 mm and 0.18 mm for inner and outer diameters, respectively.
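
    The FWHM baseline that the active-contour method is compared against measures wall width at half the peak intensity; a toy version for a 1-D ray profile (real implementations interpolate between samples, which this sketch omits):

```python
import numpy as np

def fwhm(profile, spacing=1.0):
    """Full-Width Half Maximum of a 1-D intensity profile: the extent
    of the region at or above half the peak value, in the units of
    `spacing` (no sub-sample interpolation in this sketch)."""
    half_max = profile.max() / 2.0
    above = np.where(profile >= half_max)[0]
    return (above[-1] - above[0]) * spacing

# Symmetric toy airway-wall profile sampled at unit spacing:
p = np.array([0, 10, 60, 100, 60, 10, 0], dtype=float)
print(fwhm(p))  # -> 2.0
```

A nearby artery raises the local intensity and widens this half-maximum region, which is why a purely intensity-based FWHM estimate is less robust than the contour-based approach described above.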

  3. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation that results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example the reproductive state of the male and female and the operational sex ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, the occurrence of female copulation calls, the sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time- or energy-allocation problems.
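
    AIC-based model selection of the kind used here simply penalizes the maximized log-likelihood by the parameter count; the fitted likelihood values below are invented for illustration:

```python
def aic(log_likelihood, k):
    """Akaike's information criterion: AIC = 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

# Hypothetical fits: swelling size + OMU size (k=3) vs. intercept-only (k=1)
full_model = aic(log_likelihood=-410.0, k=3)
null_model = aic(log_likelihood=-450.0, k=1)
print(full_model, null_model)  # the model with the lower AIC is preferred
```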

  4. Quantitative image analysis of cell colocalization in murine bone marrow.

    PubMed

    Mokhtari, Zeinab; Mech, Franziska; Zehentmeier, Sandra; Hauser, Anja E; Figge, Marc Thilo

    2015-06-01

    Long-term antibody production is a key property of humoral immunity and is accomplished by long-lived plasma cells. They mainly reside in the bone marrow, whose importance as an organ hosting immunological memory is becoming increasingly evident. Signals provided by stromal cells and eosinophils may play an important role for plasma cell maintenance, constituting a survival microenvironment. In this joint study of experiment and theory, we investigated the spatial colocalization of plasma cells, eosinophils and B cells by applying an image-based systems biology approach. To this end, we generated confocal fluorescence microscopy images of histological sections from murine bone marrow that were subsequently analyzed in an automated fashion. This quantitative analysis was combined with computer simulations of the experimental system for hypothesis testing. In particular, we tested the observed spatial colocalization of cells in the bone marrow against the hypothesis that cells are found within available areas at positions that were drawn from a uniform random number distribution. We find that B cells and plasma cells highly colocalize with stromal cells, to an extent larger than in the simulated random situation. While B cells are preferentially in contact with each other, i.e., form clusters among themselves, plasma cells seem to be solitary or organized in aggregates, i.e., loosely defined groups of cells that are not necessarily in direct contact. Our data suggest that the plasma cell bone marrow survival niche facilitates colocalization of plasma cells with stromal cells and eosinophils, respectively, promoting plasma cell longevity.
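
    Testing observed colocalization against a uniform-random null, as described, can be done by Monte Carlo simulation; this coarse sketch uses discrete sites as a stand-in for the "available areas" of the real analysis, and all counts are hypothetical:

```python
import random

def colocalization_pvalue(observed_pairs, n_cells, n_sites, n_sim=2000, seed=1):
    """Monte Carlo null model: scatter n_cells uniformly over n_sites
    and count how often the simulated number of co-occupancies
    reaches the observed value."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        sites = [rng.randrange(n_sites) for _ in range(n_cells)]
        pairs = n_cells - len(set(sites))  # cells sharing a site
        if pairs >= observed_pairs:
            hits += 1
    return hits / n_sim

# 8 observed co-occupancies among 20 cells on 100 sites is far more
# than the uniform null typically produces (about 2 on average):
p = colocalization_pvalue(observed_pairs=8, n_cells=20, n_sites=100)
print(p)
```

A small p-value here means the observed clustering exceeds what random placement within the available area would produce, which is the logic behind the stromal-cell colocalization result.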

  5. Quantitative genetic analysis of flowering time in tomato.

    PubMed

    Jiménez-Gómez, José M; Alonso-Blanco, Carlos; Borja, Alicia; Anastasio, Germán; Angosto, Trinidad; Lozano, Rafael; Martínez-Zapater, José M

    2007-03-01

    Artificial selection of cultivated tomato (Solanum lycopersicum L.) has resulted in the generation of early-flowering, day-length-insensitive cultivars, despite its close relationship to other Solanum species that need more time and specific photoperiods to flower. To investigate the genetic mechanisms controlling flowering time in tomato and related species, we performed a quantitative trait locus (QTL) analysis for flowering time in an F2 mapping population derived from S. lycopersicum and its late-flowering wild relative S. chmielewskii. Flowering time was scored as the number of days from sowing to the opening of the first flower (days to flowering), and as the number of leaves under the first inflorescence (leaf number). QTL analyses detected 2 QTLs affecting days to flowering, which explained 55.3% of the total phenotypic variance, and 6 QTLs for leaf number, accounting for 66.7% of the corresponding phenotypic variance. Four of the leaf number QTLs had not previously been detected for this trait in tomato. Colocation of some QTLs with flowering-time genes included in the genetic map suggests PHYB2, FALSIFLORA, and a tomato FLC-like sequence as candidate genes that might have been targets of selection during the domestication of tomato.

  6. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    PubMed

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  7. Limits of normality of quantitative thoracic CT analysis

    PubMed Central

    2013-01-01

    Introduction: Although computed tomography (CT) is widely used to investigate different pathologies, quantitative data from normal populations are scarce. Reference values may be useful to estimate the anatomical or physiological changes induced by various diseases. Methods: We analyzed 100 helical CT scans taken for clinical purposes and reported as nonpathological by the radiologist. Profiles were manually outlined on each CT scan slice and each voxel was classified according to its gas/tissue ratio. For regional analysis, the lungs were divided into 10 sterno-vertebral levels. Results: We studied 53 males and 47 females (age 64 ± 13 years); males had a greater total lung volume, lung gas volume and lung tissue. Noninflated tissue averaged 7 ± 4% of the total lung weight, poorly inflated tissue averaged 18 ± 3%, normally inflated tissue averaged 65 ± 8% and overinflated tissue averaged 11 ± 7%. We found a significant correlation between lung weight and subject's height (P <0.0001, r2 = 0.49); the total lung capacity in a supine position was 4,066 ± 1,190 ml, ~1,800 ml less than the predicted total lung capacity in a sitting position. Superimposed pressure averaged 2.6 ± 0.5 cmH2O. Conclusion: Subjects without lung disease present significant amounts of poorly inflated and overinflated tissue. Normal lung weight can be predicted from the patient's height with reasonable confidence. PMID:23706034
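
    The gas/tissue voxel classification can be sketched with the Hounsfield-unit cutoffs conventionally used in quantitative lung CT; the study's exact thresholds are not stated in the abstract, so the bounds below are illustrative:

```python
def gas_fraction(hu):
    """Gas fraction of a voxel from its CT number, assuming -1000 HU
    is pure gas and 0 HU is pure tissue."""
    return min(max(-hu / 1000.0, 0.0), 1.0)

def inflation_class(hu):
    """Conventional quantitative-CT inflation classes (illustrative
    HU cutoffs; verify against the study's definitions)."""
    if hu > -100:
        return "noninflated"
    if hu > -500:
        return "poorly inflated"
    if hu > -900:
        return "normally inflated"
    return "overinflated"

print(inflation_class(-700), gas_fraction(-700))  # -> normally inflated 0.7
```

Summing voxel tissue fractions times voxel volume and density then yields the lung weight figures reported above.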

  8. Precessing rotating flows with additional shear: Stability analysis

    NASA Astrophysics Data System (ADS)

    Salhi, A.; Cambon, C.

    2009-03-01

    We consider unbounded precessing rotating flows in which vertical or horizontal shear is induced by the interaction between the solid-body rotation (with angular velocity Ω0 ) and the additional “precessing” Coriolis force (with angular velocity -ɛΩ0 ), normal to it. A “weak” shear flow, with rate 2ɛ of the same order of the Poincaré “small” ratio ɛ , is needed for balancing the gyroscopic torque, so that the whole flow satisfies Euler’s equations in the precessing frame (the so-called admissibility conditions). The base flow case with vertical shear (its cross-gradient direction is aligned with the main angular velocity) corresponds to Mahalov’s [Phys. Fluids A 5, 891 (1993)] precessing infinite cylinder base flow (ignoring boundary conditions), while the base flow case with horizontal shear (its cross-gradient direction is normal to both main and precessing angular velocities) corresponds to the unbounded precessing rotating shear flow considered by Kerswell [Geophys. Astrophys. Fluid Dyn. 72, 107 (1993)]. We show that both these base flows satisfy the admissibility conditions and can support disturbances in terms of advected Fourier modes. Because the admissibility conditions cannot select one case with respect to the other, a more physical derivation is sought: Both flows are deduced from Poincaré’s [Bull. Astron. 27, 321 (1910)] basic state of a precessing spheroidal container, in the limit of small ɛ . A Rapid distortion theory (RDT) type of stability analysis is then performed for the previously mentioned disturbances, for both base flows. The stability analysis of the Kerswell base flow, using Floquet’s theory, is recovered, and its counterpart for the Mahalov base flow is presented. Typical growth rates are found to be the same for both flows at very small ɛ , but significant differences are obtained regarding growth rates and widths of instability bands, if larger ɛ values, up to 0.2, are considered. Finally, both flow cases

  9. Complete multipoint sib-pair analysis of qualitative and quantitative traits

    SciTech Connect

    Kruglyak, L.; Lander, E.S.

    1995-08-01

    Sib-pair analysis is an increasingly important tool for genetic dissection of complex traits. Current methods for sib-pair analysis are primarily based on studying individual genetic markers one at a time and thus fail to use the full inheritance information provided by multipoint linkage analysis. In this paper, we describe how to extract the complete multipoint inheritance information for each sib pair. We then describe methods that use this information to map loci affecting traits, thereby providing a unified approach to both qualitative and quantitative traits. Specifically, complete multipoint approaches are presented for (1) exclusion mapping of qualitative traits; (2) maximum-likelihood mapping of qualitative traits; (3) information-content mapping, showing the extent to which all inheritance information has been extracted at each location in the genome; and (4) quantitative-trait mapping, by two parametric methods and one nonparametric method. In addition, we explore the effects of marker density, marker polymorphism, and availability of parents on the information content of a study. We have implemented the analysis methods in a new computer package, MAPMAKER/SIBS. With this computer package, complete multipoint analysis with dozens of markers in hundreds of sib pairs can be carried out in minutes. 25 refs., 8 figs.

  10. Quantitative real-time single particle analysis of virions.

    PubMed

    Heider, Susanne; Metzner, Christoph

    2014-08-01

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed, or adapted from other fields such as nanotechnology, to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes and thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification.

  11. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out by liquid–liquid extraction with n-hexane:ethyl acetate and subsequent detection by high-performance liquid chromatography coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances was observed in the samples. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in a forensic toxicology laboratory. PMID:27635251
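
    The validation statistics quoted (CV < 10%, recoveries > 80%) are computed from replicate measurements; a sketch with invented replicate values for one standard:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%), as used for intra-/inter-day precision."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def recovery_percent(measured, spiked):
    """Extraction recovery (%) for a spiked standard."""
    return 100 * measured / spiked

# Hypothetical replicates of a 500 ng/mL diazepam standard (ng/mL):
replicates = [492, 505, 498, 510, 495]
print(round(cv_percent(replicates), 2), recovery_percent(430, 500))
```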

  13. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movement websites, blogs by naturopathic physicians, or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR", in order to identify the most frequently used websites. The 2 keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of a personal blog and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% a webmaster e-mail address, 28.6% indicated the date of the last update and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, which cannot be managed by private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements.

  14. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

    Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprised of invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8, 1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescent microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  15. Quantitative petrographic analysis of Cretaceous sandstones from southwest Montana

    SciTech Connect

    Dyman, T.S.; Krystinik, K.B.; Takahashi, K.I.

    1986-05-01

    The Albian Blackleaf Formation and the Cenomanian lower Frontier Formation in southwest Montana lie within or east of the fold and thrust belt in the Cretaceous foreland basin complex. The petrography of these strata records a complex interaction between source-area tectonism, basin subsidence, and sedimentation patterns associated with a cyclic sequence of transgressions and regressions. Because the petrographic data set was large (127 thin sections) and difficult to interpret subjectively, statistical techniques were used to establish sample and variable relationships. Theta-mode cluster and correspondence analysis were used to determine the contributing effect (total variance) of key framework grains. Monocrystalline quartz, plagioclase, potassium feldspar, and sandstone-, limestone-, and volcanic-lithic grain content contribute most to the variation in the framework-grain population. Theta-mode cluster and correspondence analysis were used to identify six petrofacies. Lower Blackleaf petrofacies (I-III) contain abundant monocrystalline quartz (55-90%) and sedimentary lithic grains (10-50%), which are distributed throughout the study area. Petrofacies I-III are differentiated by variable monocrystalline quartz and sedimentary lithic grain content. Upper Blackleaf and lower Frontier petrofacies (IV-VI) exhibit highly variable sedimentary and volcanic lithic ratios, and contain less monocrystalline quartz (20-50%) than lower Blackleaf petrofacies. Information from the quantitative analyses combined with available paleocurrent data indicates that Blackleaf and lower Frontier detritus was derived from variable source areas through time. Lower Blackleaf detritus was derived from Precambrian through Paleozoic sedimentary terranes to the west, north, and east, whereas upper Blackleaf and lower Frontier detritus was derived from both sedimentary and volcanic terranes to the south.

  16. Hybrid Additive Manufacturing Technologies - An Analysis Regarding Potentials and Applications

    NASA Astrophysics Data System (ADS)

    Merklein, Marion; Junker, Daniel; Schaub, Adam; Neubauer, Franziska

    Driven by the trend toward mass customization of lightweight construction in industry, conventional manufacturing processes such as forming and machining are pushed to their limits for economical manufacturing. More flexible processes are needed, a demand addressed by additive manufacturing technology. This toolless production principle offers high geometrical freedom and optimized utilization of the material used. Thus load-adjusted lightweight components can be produced economically in small lot sizes. To compensate for disadvantages such as inadequate accuracy and surface roughness, hybrid machines combining additive and subtractive manufacturing are being developed. Within this paper the principles of the most widely used additive manufacturing processes for metals, and their potential to be integrated into a hybrid production machine, are summarized. It is pointed out that in particular the integration of deposition processes into a CNC milling center shows high potential for manufacturing larger parts with high accuracy. Furthermore, the combination of additive and subtractive manufacturing allows the production of ready-to-use products within one single machine. Additionally, current research on the integration of additive manufacturing processes into the production chain is analyzed. Given the long manufacturing times of additive production processes, combination with conventional manufacturing processes such as sheet or bulk metal forming appears an effective solution. Large volumes in particular can be produced by conventional processes; in an additional production step, active elements can then be applied by additive manufacturing. This principle is also investigated for tool production, to reduce machining of the high-strength material used for forming tools. The aim is the addition of active elements onto a geometrically simple base by using Laser Metal Deposition, a process that allows the utilization of several powder materials during one process

  17. Investigating reference genes for quantitative real-time PCR analysis across four chicken tissues.

    PubMed

    Bagés, S; Estany, J; Tor, M; Pena, R N

    2015-04-25

    Accurate normalization of data is required to correct for different efficiencies and errors during the processing of samples in reverse transcription PCR analysis. The chicken is one of the main livestock species and its genome was one of the first reported and used in large scale transcriptomic analysis. Despite this, the chicken has not been investigated regarding the identification of reference genes suitable for the quantitative PCR analysis of growth and fattening genes. In this study, five candidate reference genes (B2M, RPL32, SDHA, TBP and YWHAZ) were evaluated to determine the most stable internal reference for quantitative PCR normalization in the two main commercial muscles (pectoralis major (breast) and biceps femoris (thigh)), liver and abdominal fat. Four statistical methods (geNorm, NormFinder, CV and BestKeeper) were used in the evaluation of the most suitable combination of reference genes. Additionally, a comprehensive ranking was established with the RefFinder tool. This analysis identified YWHAZ and TBP as the recommended combination for the analysis of biceps femoris and liver, YWHAZ and RPL32 for pectoralis major and RPL32 and B2M for abdominal fat and across-tissue studies. The final ranking for each tool changed slightly but overall the results, and most particularly the ability to discard the least robust candidates, were consistent between tools. The selection and number of reference genes were validated using SCD, a target gene related to fat metabolism. Overall, the results can be directly used to quantitate target gene expression in different tissues or in validation studies from larger transcriptomic experiments.
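
    geNorm, one of the four tools the study applied, ranks candidate reference genes by a stability measure M: the average standard deviation of the pairwise log-ratios between a gene and every other candidate across samples (lower M = more stable). A minimal sketch; the expression values are invented, and real analyses use efficiency-corrected relative quantities.

```python
import numpy as np

def genorm_stability(expr):
    """geNorm-style stability measure M for each candidate gene.
    `expr` is a (samples x genes) array of relative expression values."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        # SD of the pairwise log-ratio with each other candidate
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        M[j] = np.mean(sds)
    return M

# Hypothetical expression for 4 samples x 3 candidates;
# gene 2 varies independently of the others, so it should rank worst
expr = np.array([[1.0, 2.0, 1.0],
                 [1.1, 2.2, 4.0],
                 [0.9, 1.8, 0.5],
                 [1.0, 2.1, 2.0]])
M = genorm_stability(expr)
```

    Tools such as geNorm then iteratively discard the gene with the highest M, which is how the least robust candidates were eliminated in the study.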

  18. Quantitative analysis of cell-free DNA in ovarian cancer

    PubMed Central

    Shao, Xuefeng; He, Yan; Ji, Min; Chen, Xiaofang; Qi, Jing; Shi, Wei; Hao, Tianbo; Ju, Shaoqing

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were collected from 22 benign ovarian tumor cases and 19 healthy controls with non-cancerous ovaries. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with those of stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve of cf-DNA and the sensitivity were 0.917 and 88.9%, respectively, which were higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between the levels of serum cf-DNA and the occurrence and development of ovarian cancer in the patients evaluated. bDNA techniques possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients exhibiting ovarian cancer, and were more useful for detecting cf-DNA than the other markers evaluated. Thus, the present study demonstrated the potential value for the use of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
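
    The reported AUC of 0.917 summarizes how well cf-DNA separates cancer from control sera. AUC equals the Mann-Whitney probability that a randomly chosen case has a higher marker value than a randomly chosen control, which makes it easy to compute directly; the cf-DNA values below are invented for illustration.

```python
import numpy as np

def roc_auc(cases, controls):
    """AUC via the Mann-Whitney statistic: the probability that a random
    case has a higher marker value than a random control (ties count 1/2)."""
    cases = np.asarray(cases, dtype=float)
    controls = np.asarray(controls, dtype=float)
    wins = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return (wins + 0.5 * ties) / (len(cases) * len(controls))

# Hypothetical serum cf-DNA levels (arbitrary units)
cancer = [120, 95, 140, 88, 110]
benign = [60, 72, 55, 90]
auc = roc_auc(cancer, benign)
```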

  19. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU-intensive.

  20. Quantitative real-time single particle analysis of virions

    SciTech Connect

    Heider, Susanne; Metzner, Christoph

    2014-08-15

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while concomitantly supplying additional information such as particle diameter. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes and thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. - Highlights: • We introduce four methods for virus particle-based quantification of viruses. • They allow for quantification of a wide range of samples in under an hour. • The additional measurement of size and zeta potential is possible for some.

  1. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: (i) excess noise detection and (ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on the test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data; the algorithms are then tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different excess noise estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data via a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise detection, the
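
    The Kolmogorov-Smirnov idea behind one of the estimators can be sketched simply: periodogram bins of Gaussian noise follow an exponential distribution, so the KS distance between the observed normalized spectral powers and the exponential CDF flags excess noise. This toy version assumes independent bins and omits the correlation corrections the paper develops.

```python
import numpy as np

def ks_statistic_exponential(powers):
    """Kolmogorov-Smirnov distance between the empirical distribution of
    normalized spectral powers and the unit-mean exponential CDF expected
    for the periodogram of Gaussian noise."""
    x = np.sort(np.asarray(powers, dtype=float))
    n = len(x)
    cdf = 1.0 - np.exp(-x)                      # exponential CDF
    ecdf_hi = np.arange(1, n + 1) / n           # ECDF just after each point
    ecdf_lo = np.arange(0, n) / n               # ECDF just before each point
    return max(np.abs(ecdf_hi - cdf).max(), np.abs(ecdf_lo - cdf).max())

rng = np.random.default_rng(0)
d_quiet = ks_statistic_exponential(rng.exponential(1.0, 2000))    # on-model
d_excess = ks_statistic_exponential(rng.exponential(3.0, 2000))   # inflated power
```

    Comparing the statistic against its sampling distribution (or a critical value such as 1.36/√n at the 5% level) then yields an excess-noise decision.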

  2. GISH analysis of disomic Brassica napus-Crambe abyssinica chromosome addition lines produced by microspore culture from monosomic addition lines.

    PubMed

    Wang, Youping; Sonntag, Karin; Rudloff, Eicke; Wehling, Peter; Snowdon, Rod J

    2006-02-01

    Two Brassica napus-Crambe abyssinica monosomic addition lines (2n=39, AACC plus a single chromosome from C. abyssinica) were obtained from the F2 progeny of the asymmetric somatic hybrid. The alien chromosome from C. abyssinica in the addition line was clearly distinguished by genomic in situ hybridization (GISH). Twenty-seven microspore-derived plants from the addition lines were obtained. Fourteen seedlings were determined to be diploid plants (2n=38) arising from spontaneous chromosome doubling, while 13 seedlings were confirmed as haploid plants. Doubled haploid plants produced after treatment with colchicine and two disomic chromosome addition lines (2n=40, AACC plus a single pair of homologous chromosomes from C. abyssinica) could again be identified by GISH analysis. The lines are potentially useful for molecular genetic analysis of novel C. abyssinica genes or alleles contributing to traits relevant for oilseed rape (B. napus) breeding.

  3. Column precipitation chromatography: an approach to quantitative analysis of eigencolloids.

    PubMed

    Breynaert, E; Maes, A

    2005-08-01

    A new column precipitation chromatography (CPC) technique, capable of quantitatively measuring technetium eigencolloids in aqueous solutions, is presented. The CPC technique is based on the destabilization and precipitation of eigencolloids by polycations in a confined matrix. Tc(IV) colloids can be quantitatively determined from their precipitation onto the CPC column (separation step) and their subsequent elution upon oxidation to pertechnetate by peroxide (elution step). A clean-bed particle removal model was used to explain the experimental results. PMID:16053321

  4. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; Dalle Ore, Cristina M.; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and there are four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  5. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well-established. The objectives of the present study were, first, to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed mean measurements of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed but that the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiations in deformity progression were observed when the radiographic parameters were plotted against each other that would lead to a quantitative basis for defining "normal" versus "abnormal" measurements. PMID:26002682
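
    The correlations with body mass index quoted above (e.g., R = -0.2428 for the CIA) are Pearson product-moment coefficients. A minimal implementation; the paired BMI/angle values are invented, and the strength of the correlation they produce is purely illustrative.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))

# Hypothetical paired measurements: BMI vs calcaneal inclination angle (deg),
# chosen to show the negative direction of the reported association
bmi = [22, 25, 27, 30, 33, 36]
cia = [19, 18, 17, 16, 15, 14]
r = pearson_r(bmi, cia)
```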

  6. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Cruikshank, D. P.; Dalle Ore, C. M.; Pendleton, Y. J.; Clark, R. N.

    2012-12-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at ~3.28 μm (~3050 cm-1), and there are four blended bands of aliphatic -CH2- and -CH3 in the range ~3.36-3.52 μm (~2980-2840 cm-1). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph ~24; for Hyperion the value is ~12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 ~2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.
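
    Band ratios such as NAro:NAliph come from comparing integrated, continuum-removed band strengths. A toy version of that measurement on a synthetic spectrum: the Gaussian band positions match the aromatic (3.28 μm) and aliphatic (~3.42 μm) features quoted above, but the depths, widths, and integration windows are invented, and the continuum is a simple straight line between the band edges.

```python
import numpy as np

def band_area(wavelength_um, reflectance, lo, hi):
    """Continuum-removed absorption-band area over [lo, hi] micrometers,
    with the continuum taken as a straight line between the band edges."""
    m = (wavelength_um >= lo) & (wavelength_um <= hi)
    x, y = wavelength_um[m], reflectance[m]
    continuum = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])
    depth = continuum - y
    # trapezoidal integration of the band depth
    return float(np.sum((depth[1:] + depth[:-1]) * np.diff(x) / 2.0))

# Synthetic spectrum: Gaussian absorptions at 3.28 um (aromatic C-H) and
# 3.42 um (aliphatic C-H) on a flat unit continuum
wl = np.linspace(3.1, 3.7, 601)
spec = (1.0
        - 0.30 * np.exp(-0.5 * ((wl - 3.28) / 0.02) ** 2)
        - 0.05 * np.exp(-0.5 * ((wl - 3.42) / 0.03) ** 2))
ratio = band_area(wl, spec, 3.22, 3.34) / band_area(wl, spec, 3.34, 3.52)
```

    Converting such an area ratio into an abundance ratio additionally requires the relative band strengths of the aromatic and aliphatic modes, which the toy example ignores.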

  7. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
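
    The time-dependent wing-beat frequency signature described above rests on the analytic signal: the Hilbert transform yields an instantaneous phase whose time derivative is the instantaneous frequency. A minimal FFT-based sketch on a synthetic tone; the 600 Hz "wing beat" and the sampling rate are invented, and this is not the authors' processing chain.

```python
import numpy as np

def instantaneous_frequency(signal, fs):
    """Instantaneous frequency (Hz) from the analytic signal, built by
    zeroing negative frequencies in the FFT (a minimal Hilbert transform)."""
    n = len(signal)
    spec = np.fft.fft(signal)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0          # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0              # Nyquist bin for even n
    analytic = np.fft.ifft(spec * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2.0 * np.pi)

# Synthetic 'wing beat' tone: 600 Hz, sampled at 10 kHz for 0.5 s
fs = 10_000.0
t = np.arange(0, 0.5, 1.0 / fs)
tone = np.sin(2 * np.pi * 600.0 * t)
f_inst = instantaneous_frequency(tone, fs)
```

    Tracking f_inst for each mosquito over time, and comparing the traces (and their harmonics) between two individuals, is the kind of signal on which a convergence detector can then operate.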

  8. Quantitative analysis of mycoflora on commercial domestic fruits in Japan.

    PubMed

    Watanabe, Maiko; Tsutsumi, Fumiyuki; Konuma, Rumi; Lee, Ken-Ichi; Kawarada, Kensuke; Sugita-Konishi, Yoshiko; Kumagai, Susumu; Takatori, Kosuke; Konuma, Hirotaka; Hara-Kudo, Yukiko

    2011-09-01

    A comprehensive and quantitative analysis of the mycoflora on the surface of commercial fruit was performed. Nine kinds of fruits grown in Japan were tested. Overall fungal counts on the fruits ranged from 3.1 to 6.5 log CFU/g. The mean percentages of the total yeast counts were higher than those of molds in samples of apples, Japanese pears, and strawberries, ranging from 58.5 to 67.0%, and were lower than those of molds in samples of the other six fruits, ranging from 9.8 to 48.3%. Cladosporium was the most frequent fungus and was found in samples of all nine types of fruits, followed by Penicillium found in eight types of fruits. The fungi with the highest total counts in samples of the various fruits were Acremonium in cantaloupe melons (47.6% of the total fungal count), Aspergillus in grapes (32.2%), Aureobasidium in apples (21.3%), blueberries (63.6%), and peaches (33.6%), Cladosporium in strawberries (38.4%), Cryptococcus in Japanese pears (37.6%), Penicillium in mandarins (22.3%), and Sporobolomyces in lemons (26.9%). These results demonstrated that the mycoflora on the surfaces of these fruits mainly consists of common pre- and postharvest inhabitants of the plants or in the environment; fungi that produce mycotoxins or cause market diseases were not prominent in the mycoflora of healthy fruits. These findings suggest fruits should be handled carefully with consideration given to fungal contaminants, including nonpathogenic fungi, to control the quality of fruits and processed fruit products. PMID:21902918

  9. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  10. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    PubMed

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T
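
    The odds ratios above come from adjusted logistic regression; for a single dichotomized exposure the unadjusted version of the same quantity can be read directly off a 2x2 table, with a Wald confidence interval on the log scale. The counts below are invented, not the Parogene data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: high salivary bacterial burden (exposed) vs
# moderate-to-severe periodontitis (case)
or_, lo, hi = odds_ratio_ci(60, 40, 30, 50)
```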

  11. Altered resting-state functional activity in posttraumatic stress disorder: A quantitative meta-analysis

    PubMed Central

    Wang, Ting; Liu, Jia; Zhang, Junran; Zhan, Wang; Li, Lei; Wu, Min; Huang, Hua; Zhu, Hongyan; Kemp, Graham J.; Gong, Qiyong

    2016-01-01

    Many functional neuroimaging studies have reported differential patterns of spontaneous brain activity in posttraumatic stress disorder (PTSD), but the findings are inconsistent and have not so far been quantitatively reviewed. The present study set out to determine consistent, specific regional brain activity alterations in PTSD, using the Effect Size Signed Differential Mapping technique to conduct a quantitative meta-analysis of resting-state functional neuroimaging studies of PTSD that used either a non-trauma (NTC) or a trauma-exposed (TEC) comparison control group. Fifteen functional neuroimaging studies were included, comparing 286 PTSDs, 203 TECs and 155 NTCs. Compared with NTC, PTSD patients showed hyperactivity in the right anterior insula and bilateral cerebellum, and hypoactivity in the dorsal medial prefrontal cortex (mPFC); compared with TEC, PTSD showed hyperactivity in the ventral mPFC. The pooled meta-analysis showed hypoactivity in the posterior insula, superior temporal, and Heschl’s gyrus in PTSD. Additionally, subgroup meta-analysis (non-medicated subjects vs. NTC) identified abnormal activation in the prefrontal-limbic system. In meta-regression analyses, mean illness duration was positively associated with activity in the right cerebellum (PTSD vs. NTC), and illness severity was negatively associated with activity in the right lingual gyrus (PTSD vs. TEC). PMID:27251865

  12. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
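
    Internal-standard quantitation of this kind rests on the analyte-to-internal-standard peak-area ratio varying linearly with analyte concentration: fit a calibration line, then back-calculate the unknown. A sketch with ordinary least squares; the caffeine concentrations and area ratios are invented, not the article's data.

```python
def internal_standard_quant(ratios_std, concs_std, ratio_unknown):
    """Least-squares line through (concentration, area ratio) calibration
    points, then back-calculation of the unknown concentration."""
    n = len(concs_std)
    mean_x = sum(concs_std) / n
    mean_y = sum(ratios_std) / n
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(concs_std, ratios_std))
    sxx = sum((x - mean_x) ** 2 for x in concs_std)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return (ratio_unknown - intercept) / slope

# Hypothetical caffeine calibration: concentration (ug/mL) vs area ratio
concs = [10.0, 20.0, 40.0, 80.0]
ratios = [0.21, 0.40, 0.81, 1.60]
c_unknown = internal_standard_quant(ratios, concs, 1.00)
```

    Using a stable-isotope-labeled internal standard, as in isotopic dilution, makes the ratio robust to losses during extraction, since analyte and standard behave identically.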

  13. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications.

  14. Chemical fingerprint and quantitative analysis for quality control of polyphenols extracted from pomegranate peel by HPLC.

    PubMed

    Li, Jianke; He, Xiaoye; Li, Mengying; Zhao, Wei; Liu, Liu; Kong, Xianghong

    2015-06-01

    A simple and efficient HPLC fingerprint method was developed and validated for quality control of the polyphenols extracted from pomegranate peel (PPPs). Ten batches of pomegranate collected from different orchards in Shaanxi Lintong of China were used to establish the fingerprint. For the fingerprint analysis, 15 characteristic peaks were selected to evaluate the similarities of 10 batches of the PPPs. The similarities of the PPPs samples were all more than 0.968, indicating that the samples from different areas of Lintong were consistent. Additionally, simultaneous quantification of eight monophenols (including gallic acid, punicalagin, catechin, chlorogenic acid, caffeic acid, epicatechin, rutin, and ellagic acid) in the PPPs was conducted to interpret the consistency of the quality test. The results demonstrated that the HPLC fingerprint as a characteristic distinguishing method combining similarity evaluation and quantitative analysis can be successfully used to assess the quality and to identify the authenticity of the PPPs.
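
    Fingerprint similarity values such as the reported ≥0.968 are typically cosine (congruence) coefficients between chromatograms treated as vectors of characteristic-peak areas. A minimal sketch; the peak areas for the two batches are invented.

```python
import math

def fingerprint_similarity(peaks_a, peaks_b):
    """Cosine similarity between two chromatographic fingerprints,
    each a vector of characteristic-peak areas."""
    dot = sum(a * b for a, b in zip(peaks_a, peaks_b))
    na = math.sqrt(sum(a * a for a in peaks_a))
    nb = math.sqrt(sum(b * b for b in peaks_b))
    return dot / (na * nb)

# Hypothetical areas of 5 of the 15 characteristic peaks in two batches
batch1 = [10.2, 55.0, 8.1, 23.4, 4.4]
batch2 = [9.8, 57.1, 7.6, 22.0, 5.0]
sim = fingerprint_similarity(batch1, batch2)
```

    A batch whose similarity to the reference fingerprint falls below the chosen threshold would be flagged as inconsistent in a quality-control setting.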

  15. Quantitative Analysis of Autophagy using Advanced 3D Fluorescence Microscopy

    PubMed Central

    Changou, Chun A.; Wolfson, Deanna L.; Ahluwalia, Balpreet Singh; Bold, Richard J.; Kung, Hsing-Jien; Chuang, Frank Y.S.

    2013-01-01

    Prostate cancer is the most common malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stages, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine (1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI) (1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy) (1,2,3). Autophagy is an evolutionarily conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation (4,5). Although the essential components of this pathway are well characterized (6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of the autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy (11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early stages of
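
    The imaging-based quantification described here ultimately comes down to segmenting fluorescent puncta in 3D stacks and measuring the overlap between the autophagosome and lysosome channels. A minimal numpy/scipy sketch of that idea on synthetic data (the function names and thresholds are illustrative, not the authors' pipeline):

```python
import numpy as np
from scipy import ndimage

def count_puncta(stack, threshold):
    """Label connected bright regions (puncta) in a 3D image stack."""
    mask = stack > threshold
    labels, n = ndimage.label(mask)
    return labels, n

def colocalization_fraction(mask_a, mask_b):
    """Fraction of channel-A voxels that also belong to channel B."""
    a = mask_a.astype(bool)
    if a.sum() == 0:
        return 0.0
    return float(np.logical_and(a, mask_b).sum()) / float(a.sum())

# Synthetic two-channel stack: two autophagosome puncta, one of which
# overlaps a lysosome (mimicking autophagosome-lysosome fusion).
auto = np.zeros((20, 64, 64))
lyso = np.zeros((20, 64, 64))
auto[5:8, 10:14, 10:14] = 1.0    # punctum 1
auto[12:15, 40:44, 40:44] = 1.0  # punctum 2
lyso[5:8, 10:14, 10:14] = 1.0    # lysosome overlapping punctum 1

_, n_auto = count_puncta(auto, 0.5)
frac = colocalization_fraction(auto > 0.5, lyso > 0.5)
print(n_auto, frac)  # → 2 0.5
```

    On real stacks the thresholding step would be preceded by deconvolution or background subtraction, but the counting and overlap logic is the same.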

  16. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the most common malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stages, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of the autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  18. Quantitative analysis of real-time tissue elastography for evaluation of liver fibrosis

    PubMed Central

    Shi, Ying; Wang, Xing-Hua; Zhang, Huan-Hu; Zhang, Hai-Qing; Tu, Ji-Zheng; Wei, Kun; Li, Juan; Liu, Xiao-Li

    2014-01-01

    The present study aimed to investigate the feasibility of quantitative analysis of liver fibrosis using real-time tissue elastography (RTE) and its pathological and molecular biological basis. Methods: Fifty-four New Zealand rabbits were subcutaneously injected with thioacetamide (TAA) to induce liver fibrosis as the model group, and another eight New Zealand rabbits served as the normal control group. Four rabbits were randomly selected every two weeks for RTE and quantitative analysis of tissue diffusion. The twelve characteristic quantities obtained were relative mean value (MEAN), standard deviation (SD), blue area % (% AREA), complexity (COMP), kurtosis (KURT), skewness (SKEW), contrast (CONT), entropy (ENT), inverse difference moment (IDM), angular second moment (ASM), correlation (CORR) and liver fibrosis index (LF Index). Rabbits were then euthanized and liver tissues were taken for pathological staging of liver fibrosis (grouped by pathological stage into S0, S1, S2, S3 and S4 groups). In addition, the collagen I (Col I) and collagen III (Col III) expression levels in liver tissue were detected by Western blot. Results: Except for KURT, there were significant differences among the other eleven characteristic quantities (P < 0.05). LF Index, Col I and Col III expression levels showed a rising trend with increased pathological staging of liver fibrosis, presenting a positive correlation with the pathological staging of liver fibrosis (r = 0.718, r = 0.693, r = 0.611, P < 0.05). Conclusion: RTE quantitative analysis shows promise for noninvasive evaluation of the pathological staging of liver fibrosis. PMID:24955175
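
    Several of the texture quantities listed in this record (CONT, ENT, IDM, ASM) are classical Haralick features computed from a gray-level co-occurrence matrix (GLCM). A minimal numpy sketch, assuming a single horizontal pixel offset and a 4-level toy image (the paper's offsets and gray-level binning are not specified):

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def texture_features(p):
    """Contrast (CONT), entropy (ENT), inverse difference moment (IDM) and
    angular second moment (ASM) from a normalized GLCM p."""
    i, j = np.indices(p.shape)
    cont = float(np.sum(p * (i - j) ** 2))
    ent = float(-np.sum(p[p > 0] * np.log(p[p > 0])))
    idm = float(np.sum(p / (1.0 + (i - j) ** 2)))
    asm = float(np.sum(p ** 2))
    return cont, ent, idm, asm

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img, levels=4)
cont, ent, idm, asm = texture_features(p)
```

    Homogeneous images drive IDM and ASM toward 1 and CONT toward 0; strongly textured images do the opposite, which is why these features can track fibrosis-related changes in elastograms.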

  19. Portland Cement (KS and API Class G) and Relative Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    LEE, Seung-Woo; CHAE, Gi-Tak; KIM, Taehee

    2015-04-01

    Portland cement is a common component of sealing materials for wellbores in geological carbon storage, used to prevent vertical fluid migration and provide mechanical support. Portland cement was reacted with carbon dioxide (CO2) in supercritical, gaseous, and aqueous phases at various pressure and temperature conditions to simulate the cement-CO2 reaction along the wellbore from the carbon injection depth to the near surface. The reaction of the cement phase with CO2 can lead to important changes in its structure and properties. In this study, two types of cement were used: KS Portland cement and API Class G Portland cement. The hydrated cement sample columns (14 mm diameter x 90 mm long; water-to-cement ratio = 0.5) were reacted with CO2 in saturated and unsaturated conditions. Fly ash was used as an additive to promote carbonation. These conditions were maintained at high pressure (8 MPa) and temperature (40 degrees Celsius) for 10 and 100 days. To assess the degree of carbonation, a relative quantitative analysis (RQA) was proposed, and the Rietveld method was applied to evaluate the RQA with an aragonite-calcite equation. This method can be an alternative to general quantitative analysis methods for identifying the state of carbonation of Portland cement reacted with CO2. Based on an understanding of cement carbonation and its relative quantification, we propose that our method be used to select the optimal cement for CO2 storage. Using our method, KS (Korea Standard) Portland cement (type I) and API Class G Portland cement were compared with respect to the characterization and carbonation of each cement.

  20. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative, or real-time, PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi, but only about 100 are typically found indoors. Some pose a threat to human...

  1. Teaching Quantitative Research Methods: A Quasi-Experimental Analysis.

    ERIC Educational Resources Information Center

    Bridges, George S.; Gillmore, Gerald M.; Pershing, Jana L.; Bates, Kristin A.

    1998-01-01

    Describes an experiment designed to introduce aspects of quantitative reasoning to a large, substantively-focused class in the social sciences. Reveals that participating students' abilities to interpret and manipulate empirical data increased significantly, independent of baseline SAT verbal and mathematics scores. Discusses implications for…

  2. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  3. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  4. [Quantitative analysis of thiram by surface-enhanced raman spectroscopy combined with feature extraction Algorithms].

    PubMed

    Zhang, Bao-hua; Jiang, Yong-cheng; Sha, Wen; Zhang, Xian-yi; Cui, Zhi-feng

    2015-02-01

    Three feature extraction algorithms, principal component analysis (PCA), the discrete cosine transform (DCT) and non-negative matrix factorization (NMF), were used to extract the main information of the spectral data in order to weaken the influence of spectral fluctuation on the subsequent quantitative analysis results, based on the SERS spectra of the pesticide thiram. The extracted components were then respectively combined with a linear regression algorithm, partial least squares regression (PLSR), and a non-linear regression algorithm, support vector machine regression (SVR), to develop the quantitative analysis models. Finally, the effect of the different feature extraction algorithms on the different kinds of regression algorithms was evaluated using 5-fold cross-validation. The experiments demonstrate that the analysis results of SVR are better than those of PLSR because of the non-linear relationship between the intensity of the SERS spectrum and the concentration of the analyte. Further, the feature extraction algorithms significantly improve the analysis results regardless of the regression algorithm, mainly because they extract the main information of the source spectra and eliminate fluctuations. Additionally, PCA performs best with the linear regression model and NMF is best with the non-linear model, and the predictive error can be reduced nearly threefold in the best case. The root mean square error of cross-validation of the best regression model (NMF+SVR) is 0.0455 micromol x L(-1) (10(-6) mol x L(-1)), which attains the national detection limit for thiram, so this study provides a novel method for the fast detection of thiram. In conclusion, the study provides experimental reference points for selecting feature extraction algorithms in the analysis of SERS spectra, and some of its general findings on feature extraction may also help in processing other kinds of spectroscopic data.
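
    The pipeline in this record, extract features from the spectra, fit a regression, and score it by 5-fold cross-validation, can be sketched with numpy alone. Here plain principal component regression stands in for the paper's PCA/NMF + PLSR/SVR combinations, on synthetic single-band "spectra" whose intensity scales with concentration:

```python
import numpy as np

def pca_fit(X, k):
    """Mean and top-k principal axes of X (rows are spectra)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def rmse_cv(X, y, k=3, folds=5, seed=0):
    """K-fold cross-validated RMSE of PCA + linear regression (PCR)."""
    rng = np.random.default_rng(seed)
    splits = np.array_split(rng.permutation(len(y)), folds)
    sq_errs = []
    for f in range(folds):
        test = splits[f]
        train = np.hstack([splits[g] for g in range(folds) if g != f])
        mu, V = pca_fit(X[train], k)                      # fit PCA on train only
        Ztr, Zte = (X[train] - mu) @ V.T, (X[test] - mu) @ V.T
        A = np.column_stack([Ztr, np.ones(len(train))])   # add intercept term
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        pred = np.column_stack([Zte, np.ones(len(test))]) @ coef
        sq_errs.append((pred - y[test]) ** 2)
    return float(np.sqrt(np.concatenate(sq_errs).mean()))

# Synthetic spectra: one Gaussian band whose height tracks concentration.
rng = np.random.default_rng(1)
conc = rng.uniform(0.1, 1.0, 60)
band = np.exp(-((np.arange(100) - 50) / 10.0) ** 2)
X = np.outer(conc, band) + rng.normal(0, 0.01, (60, 100))
err = rmse_cv(X, conc)
print(err)  # small relative to the 0.1-1.0 concentration range
```

    Fitting the feature extraction inside each fold, as above, is what keeps the cross-validated error honest; extracting components from the full dataset first would leak information into the test folds.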

  5. Porosity Measurements and Analysis for Metal Additive Manufacturing Process Control.

    PubMed

    Slotwinski, John A; Garboczi, Edward J; Hebenstreit, Keith M

    2014-01-01

    Additive manufacturing techniques can produce complex, high-value metal parts, with potential applications as critical metal components such as those found in aerospace engines and as customized biomedical implants. Material porosity in these parts is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants - since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the build process. Here, we present efforts to develop an ultrasonic sensor for monitoring changes in the porosity of metal parts during fabrication on a metal powder bed fusion system. The development of well-characterized reference samples, measurements of the porosity of these samples with multiple techniques, and correlation of ultrasonic measurements with the degree of porosity are presented. A proposed sensor design, measurement strategy, and future experimental plans on a metal powder bed fusion system are also presented.

  7. Porosity Measurements and Analysis for Metal Additive Manufacturing Process Control

    PubMed Central

    Slotwinski, John A; Garboczi, Edward J; Hebenstreit, Keith M

    2014-01-01

    Additive manufacturing techniques can produce complex, high-value metal parts, with potential applications as critical metal components such as those found in aerospace engines and as customized biomedical implants. Material porosity in these parts is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants - since surface-breaking pores allow for better integration with biological tissue. Changes in a part’s porosity during an additive manufacturing build may also be an indication of an undesired change in the build process. Here, we present efforts to develop an ultrasonic sensor for monitoring changes in the porosity of metal parts during fabrication on a metal powder bed fusion system. The development of well-characterized reference samples, measurements of the porosity of these samples with multiple techniques, and correlation of ultrasonic measurements with the degree of porosity are presented. A proposed sensor design, measurement strategy, and future experimental plans on a metal powder bed fusion system are also presented. PMID:26601041

  8. Additional EIPC Study Analysis: Interim Report on High Priority Topics

    SciTech Connect

    Hadley, Stanton W

    2013-11-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission- focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 13 topics was developed for further analysis; this paper discusses the first five.

  9. Disclosure of hydraulic fracturing fluid chemical additives: analysis of regulations.

    PubMed

    Maule, Alexis L; Makey, Colleen M; Benson, Eugene B; Burrows, Isaac J; Scammell, Madeleine K

    2013-01-01

    Hydraulic fracturing is used to extract natural gas from shale formations. The process involves injecting fracturing fluids, which contain thousands of gallons of chemical additives, into the ground. Companies are not mandated by federal regulations to disclose the identities or quantities of chemicals used during hydraulic fracturing operations on private or public lands. States have begun to regulate hydraulic fracturing fluids by mandating chemical disclosure. These laws have shortcomings, including nondisclosure of proprietary or "trade secret" mixtures, insufficient penalties for reporting inaccurate or incomplete information, and timelines that allow for after-the-fact reporting. These limitations leave lawmakers, regulators, public safety officers, and the public uninformed and ill-prepared to anticipate and respond to possible environmental and human health hazards associated with hydraulic fracturing fluids. We explore hydraulic fracturing exemptions from federal regulations, as well as current and future efforts to mandate chemical disclosure at the federal and state level.

  11. Risk analysis of sulfites used as food additives in China.

    PubMed

    Zhang, Jian Bo; Zhang, Hong; Wang, Hua Li; Zhang, Ji Yue; Luo, Peng Jie; Zhu, Lei; Wang, Zhu Tian

    2014-02-01

    This study aimed to analyze the risk of sulfites in food consumed by the Chinese population and to assess the health protection afforded by the maximum-permitted level (MPL) of sulfites in GB 2760-2011. Sulfites as food additives are overused or abused in many food categories. When the MPL in GB 2760-2011 was used as the sulfite content of food, the intake of sulfites in most surveyed populations was lower than the acceptable daily intake (ADI). Excess intake of sulfites was found in all the surveyed groups when a high percentile of sulfite content in food was assumed. Moreover, children aged 1-6 years are at high risk of excess sulfite intake. The primary cause of excess sulfite intake in the Chinese population is the overuse and abuse of sulfites by the food industry. The current MPL of sulfites in GB 2760-2011 protects the health of most of the population.
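
    The comparison underlying this kind of assessment is a ratio of estimated daily intake to the ADI. A minimal sketch with invented consumption figures (the JECFA ADI of 0.7 mg per kg body weight per day for sulfites, expressed as SO2, is a published value; the diet data and function names below are hypothetical, not the paper's survey data):

```python
ADI_MG_PER_KG_BW = 0.7  # JECFA ADI for sulfites, expressed as SO2

def daily_intake(foods, body_weight_kg):
    """Sum of (consumption kg/day x sulfite residue mg/kg), per kg body weight."""
    total_mg = sum(kg_per_day * mg_per_kg for kg_per_day, mg_per_kg in foods)
    return total_mg / body_weight_kg

# Hypothetical diet of a 20 kg child: (consumption kg/day, residue mg/kg at an MPL)
diet = [(0.05, 100.0), (0.10, 50.0), (0.02, 350.0)]
intake = daily_intake(diet, body_weight_kg=20.0)
print(round(intake, 2), intake > ADI_MG_PER_KG_BW)  # → 0.85 True (exceeds the ADI)
```

    Because intake is normalized by body weight, children reach the ADI at much lower absolute consumption than adults, which is why the 1-6 year group stands out in such surveys.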

  12. Global quantitative analysis of phosphorylation underlying phencyclidine signaling and sensorimotor gating in the prefrontal cortex

    PubMed Central

    McClatchy, Daniel B.; Savas, Jeffrey N.; Martínez-Bartolomé, Salvador; Park, Sung Kyu; Maher, Pamela; Powell, Susan B.; Yates, John R.

    2015-01-01

    Prepulse inhibition (PPI) is an example of sensorimotor gating and deficits in PPI have been demonstrated in schizophrenia patients. Phencyclidine (PCP) suppression of PPI in animals has been studied to elucidate the pathological elements of schizophrenia. However, the molecular mechanisms underlying PCP treatment or PPI in the brain are still poorly understood. In this study, quantitative phosphoproteomic analysis was performed on the prefrontal cortex from rats that were subjected to PPI after being systemically injected with PCP or saline. PCP down-regulated phosphorylation events were significantly enriched in proteins associated with long-term potentiation (LTP). Importantly, this dataset identifies functionally novel phosphorylation sites on known LTP-associated signaling molecules. In addition, mutagenesis of a significantly altered phosphorylation site on xCT (SLC7A11), the light chain of system xc-, the cystine/glutamate antiporter, suggests that PCP also regulates the activity of this protein. Finally, new insights were also derived on PPI signaling independent of PCP treatment. This is the first quantitative phosphorylation proteomic analysis providing new molecular insights into sensorimotor gating. PMID:25869802

  13. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    PubMed

    Schaneberg, B T; Khan, I A

    2004-11-01

    An HPLC method for the qualitative and quantitative analysis of seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. After 45 min the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 microm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578

  14. Quantitative sectioning and noise analysis for structured illumination microscopy

    PubMed Central

    Hagen, Nathan; Gao, Liang; Tkaczyk, Tomasz S.

    2011-01-01

    Structured illumination (SI) has long been regarded as a nonquantitative technique for obtaining sectioned microscopic images. Its lack of quantitative results has restricted the use of SI sectioning to qualitative imaging experiments, and has also limited researchers’ ability to compare SI against competing sectioning methods such as confocal microscopy. We show how to modify the standard SI sectioning algorithm to make the technique quantitative, and provide formulas for calculating the noise in the sectioned images. The results indicate that, for an illumination source providing the same spatially-integrated photon flux at the object plane, and for the same effective slice thicknesses, SI sectioning can provide higher SNR images than confocal microscopy for an equivalent setup when the modulation contrast exceeds about 0.09. PMID:22274364
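
    The standard SI sectioning algorithm that this record modifies is the square-law demodulation of three grid-illuminated images with the grid phase shifted by 120 degrees (Neil-style SI). A numpy sketch on a synthetic 1D example shows the key property: the unmodulated out-of-focus background cancels:

```python
import numpy as np

def si_section(i1, i2, i3):
    """Square-law demodulation of three phase-shifted grid images."""
    return np.sqrt((i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)

# In-focus signal is modulated by the grid; out-of-focus background is not.
x = np.linspace(0, 2 * np.pi, 256)
infocus, background = 1.0, 5.0
imgs = [infocus * (1 + np.cos(x + phi)) + background
        for phi in (0, 2 * np.pi / 3, 4 * np.pi / 3)]
section = si_section(*imgs)
# The constant background drops out entirely: section equals
# 3/sqrt(2) times the in-focus amplitude at every pixel.
```

    The square root of summed squared differences is what makes the standard result non-quantitative in terms of photon statistics, which is the limitation the authors' modified algorithm addresses.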

  15. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-11-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, objective mapping procedures based on quantitative cytoarchitecture have been generated. As a result, new maps for various species including man were established. In our contribution, principles of quantitative cytoarchitecture and algorithm-based cortical mapping are described for a cytoarchitectural parcellation of the human auditory cortex. Defining cortical borders based on quantified changes in cortical lamination is the decisive step towards a novel, highly improved probabilistic brain atlas.

  16. Quantitative analysis of the human T cell palmitome

    PubMed Central

    Morrison, Eliot; Kuropka, Benno; Kliche, Stefanie; Brügger, Britta; Krause, Eberhard; Freund, Christian

    2015-01-01

    Palmitoylation is a reversible post-translational modification used to inducibly compartmentalize proteins in cellular membranes, affecting the function of receptors and intracellular signaling proteins. The identification of protein “palmitomes” in several cell lines raises the question to what extent this modification is conserved in primary cells. Here we use primary T cells with acyl-biotin exchange and quantitative mass spectrometry to identify a pool of proteins previously unreported as palmitoylated in vivo. PMID:26111759

  17. Comprehensive objective maps of macromolecular conformations by quantitative SAXS analysis

    PubMed Central

    Hura, Greg L.; Budworth, Helen; Dyer, Kevin N.; Rambo, Robert P.; Hammel, Michal

    2013-01-01

    Comprehensive perspectives of macromolecular conformations are required to connect structure to biology. Here we present a small angle X-ray scattering (SAXS) Structural Similarity Map (SSM) and Volatility of Ratio (VR) metric providing comprehensive, quantitative and objective (superposition-independent) perspectives on solution state conformations. We validate VR and SSM utility on human MutSβ, a key ABC ATPase and chemotherapeutic target, by revealing MutSβ DNA sculpting and identifying multiple conformational states for biological activity. PMID:23624664

  18. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over the last decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions with added nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method was modified to inhibit DNA division in gram-negative bacteria, allowing quantification of this bacterial group by fluorescent microscopy. The established approach detected 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in soil polymer destruction is traditionally considered dominant in comparison to that of the gram-negative bacterial group. However, quantification of gram-negative bacteria in chernozem and peatland suggests that the classical notion underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative bacterial population density in chernozem, but this nutrient produced fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ
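
    The bacterial densities behind such comparisons come from the classical direct-count conversion: scale the mean per-field count to the whole filter, then back through the dilution to the original soil mass. A sketch with invented parameter values (the function and its arguments are illustrative, not the authors' protocol):

```python
def cells_per_gram(mean_count_per_field, filter_area_mm2, field_area_mm2,
                   volume_filtered_ml, dilution, suspension_ml_per_g):
    """Classical epifluorescence direct-count conversion (generic form)."""
    per_filter = mean_count_per_field * (filter_area_mm2 / field_area_mm2)
    per_ml = per_filter / volume_filtered_ml          # cells per mL of dilution
    return per_ml * dilution * suspension_ml_per_g    # back to cells per gram

n = cells_per_gram(mean_count_per_field=25, filter_area_mm2=200.0,
                   field_area_mm2=0.01, volume_filtered_ml=1.0,
                   dilution=1000, suspension_ml_per_g=10.0)
print(f"{n:.1e}")  # → 5.0e+09 cells per gram of soil
```

    The large field-to-filter and dilution multipliers are why counting enough random fields matters: any per-field bias is amplified by several orders of magnitude in the final density.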

  19. Spectroscopic and Chemometric Analysis of Binary and Ternary Edible Oil Mixtures: Qualitative and Quantitative Study.

    PubMed

    Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica

    2016-04-19

    The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with the multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm(-1)). Factor scores in 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils being at the vertices, binary mixtures at the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of results, we applied several cross-validation methods. Quantitative analysis was performed by minimization of root-mean-square error of cross-validation values regarding the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R(2) > 0.99 in all cases). Additionally, experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
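    The chemometric core of the workflow above (mean-centering the spectra, then extracting a few principal components that capture nearly all the variance) can be sketched as follows. The synthetic Gaussian-peak "spectra" stand in for the authors' FTIR-ATR data; all peak positions, widths and mixture counts are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(900, 3100, 400)

def peak(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

# Three hypothetical "pure oil" spectra (synthetic, for illustration only)
pure = np.array([
    peak(2900, 40) + 0.5 * peak(1160, 15),
    peak(2930, 35) + 0.8 * peak(1110, 12),
    peak(2950, 45) + 0.3 * peak(1060, 18),
])

# Mixtures as convex combinations of the pure spectra, plus small noise
fractions = rng.dirichlet(np.ones(3), size=160)
spectra = fractions @ pure + rng.normal(0, 1e-3, (160, wavenumbers.size))

# PCA via SVD on the mean-centered data matrix
centered = spectra - spectra.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(f"variance explained by first 3 PCs: {explained[:3].sum():.4f}")
```

    Because convex mixtures of three components lie in a two-dimensional affine subspace, the first few principal components describe essentially all the variance; with four pure oils, as in the study, the factor scores span three dimensions, producing the tetrahedral arrangement reported above.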

  20. Automated quantitative characterization of alginate/hydroxyapatite bone tissue engineering scaffolds by means of micro-CT image analysis.

    PubMed

    Brun, Francesco; Turco, Gianluca; Accardo, Agostino; Paoletti, Sergio

    2011-12-01

    Accurate image acquisition techniques and analysis protocols for a reliable characterization of tissue engineering scaffolds are yet to be well defined. To this aim, the most promising imaging technique seems to be the X-ray computed microtomography (μ-CT). However critical issues of the analysis process deal with the representativeness of the selected Volume of Interest (VOI) and, most significantly, its segmentation. This article presents an image analysis protocol that computes a set of quantitative descriptors suitable for characterizing the morphology and the micro-architecture of alginate/hydroxyapatite bone tissue engineering scaffolds. Considering different VOIs extracted from different μ-CT datasets, an automated segmentation technique is suggested and compared against a manual segmentation. Variable sizes of VOIs are also considered in order to assess their representativeness. The resulting image analysis protocol is reproducible, parameter-free and it automatically provides accurate quantitative information in addition to the simple qualitative observation of the acquired images.
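    The abstract does not name the automated segmentation technique; Otsu's method is a common parameter-free choice for this kind of binarization and serves to illustrate the idea. The volume below is synthetic, standing in for a μ-CT VOI with bright scaffold material and dark pores:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic μ-CT-like volume: ~30% bright "material" voxels, the rest "pores"
shape = (40, 40, 40)
volume = np.where(rng.random(shape) < 0.3,
                  rng.normal(180, 10, shape),   # material
                  rng.normal(60, 10, shape))    # pores

def otsu_threshold(values, nbins=256):
    """Parameter-free threshold that maximizes between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    w = hist / hist.sum()
    omega = np.cumsum(w)            # class-0 probability
    mu = np.cumsum(w * centers)     # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return centers[np.nanargmax(sigma_b)]

t = otsu_threshold(volume.ravel())
material_fraction = (volume > t).mean()   # a basic morphometric descriptor
print(f"threshold={t:.1f}, material fraction={material_fraction:.3f}")
```

    Descriptors such as the material (or pore) volume fraction then follow directly from the binarized VOI, with no manually tuned parameter.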

  1. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…
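    The electrogravimetric determination rests on Faraday's law of electrolysis: the deposited mass follows from the charge passed. The current and deposition time below are hypothetical, not taken from the exercise:

```python
# Faraday's law: m = (Q / F) * (M / z)
# Hypothetical run: 0.250 A for 30 min depositing Ni from a Ni2+ solution
F = 96485.332    # C/mol, Faraday constant
current = 0.250  # A (assumed)
time_s = 30 * 60 # s (assumed)
M_Ni = 58.693    # g/mol, molar mass of nickel
z = 2            # electrons transferred per Ni2+ ion

charge = current * time_s        # coulombs passed
mass = charge / F * M_Ni / z     # grams of Ni expected on the electrode
print(f"expected Ni deposit: {mass * 1000:.1f} mg")
```

    Weighing the electrode before and after deposition and comparing to this predicted mass closes the quantitative determination.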

  2. Quantitative analysis of cardiovascular modulation in respiratory neural activity.

    PubMed

    Dick, Thomas E; Morris, Kendall F

    2004-05-01

    We propose the 'delta(2)-statistic' for assessing the magnitude and statistical significance of arterial pulse-modulated activity of single neurones and present the results of applying this tool to medullary respiratory-modulated units. This analytical tool is a modification of the eta(2)-statistic and, consequently, based on the analysis of variance. The eta(2)-statistic reflects the consistency of respiratory-modulated activity on a cycle-by-cycle basis. However, directly applying this test to activity during the cardiac cycle proved ineffective because subjects-by-treatments matrices did not contain enough 'information'. We increased information by dividing the cardiac cycle into fewer bins, excluding cycles without activity and summing activity over multiple cycles. The analysed neuronal activity came from an existing data set examining the neural control of respiration and cough. Neurones were recorded in the nuclei of the solitary tracts and in the rostral and caudal ventral respiratory groups of decerebrate, neuromuscularly blocked, ventilated cats (n = 19). Two hundred of 246 spike trains were respiratory modulated; of these, 53% were inspiratory (I), 36.5% expiratory (E), 6% IE phase spanning and 4.5% EI phase spanning and responsive to airway stimulation. Nearly half (96/200) of the respiratory-modulated units were significantly pulse modulated and 13 were highly modulated, with delta(2) values exceeding 0.3. In 10 of these highly modulated units, eta(2) values were greater than 0.3, and all 13 had at least a portion of their activity during expiration. We conclude that cardiorespiratory interaction is reciprocal; in addition to respiratory-modulated activity in a subset of neuronal activity patterns controlling the cardiovascular system, pulse-modulated activity exists in a subset of neuronal activity patterns controlling the respiratory system.
Thus, cardio-ventilatory coupling apparent in respiratory motor output is evident and, perhaps, derived from the
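    The eta(2)-statistic referred to above is the classic ANOVA effect size: the between-bin sum of squares of a cycles-by-bins matrix of spike counts, divided by the total sum of squares. A minimal sketch on synthetic spike counts (bin count, firing rates and cycle numbers are all assumed, not taken from the recordings):

```python
import numpy as np

rng = np.random.default_rng(2)
n_cycles, n_bins = 50, 10

# Synthetic cycle-by-bin spike counts: firing concentrated in bins 3-5
base_rate = np.full(n_bins, 1.0)
base_rate[3:6] = 6.0   # phase-locked peak
counts = rng.poisson(base_rate, size=(n_cycles, n_bins)).astype(float)

def eta_squared(matrix):
    """Between-bin sum of squares over total sum of squares (ANOVA)."""
    grand = matrix.mean()
    ss_total = ((matrix - grand) ** 2).sum()
    ss_between = matrix.shape[0] * ((matrix.mean(axis=0) - grand) ** 2).sum()
    return ss_between / ss_total

modulated = eta_squared(counts)
# Shuffling bins within each cycle destroys the phase locking
shuffled = eta_squared(rng.permuted(counts, axis=1))
print(f"eta2 modulated={modulated:.3f}, shuffled={shuffled:.3f}")
```

    A consistently phase-locked unit yields a large value, while shuffled (unmodulated) activity stays near zero; the delta(2) modification adapts the same ratio to the sparser cardiac-cycle matrices.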

  3. Inheritance and quantitative trait locus analysis of low-light tolerance in cucumber (Cucumis sativus L.).

    PubMed

    Li, D D; Qin, Z W; Lian, H; Yu, G B; Sheng, Y Y; Liu, F

    2015-09-09

    The low-light tolerance index was investigated in a set of 123 F2:3 lines during the seedling stage across 2 seasons, and the heredity of low-light tolerance was assessed via different genetic analysis methods. The results of the classical analysis showed that low-light tolerance is controlled by an additive-dominant polygene, and the polygenic inheritance rate of separate generations was >30%. In addition, 5 quantitative trait loci (QTLs) exhibited a low-light tolerance index across both seasons, including 2 QTLs (Llti1.1 and Llti1.2) on the 1st linkage group (variances of 6.0 and 9.5%) and 3 QTLs (Llti2.1, Llti2.1, and Llti2.1) on the 2nd linkage group (variances of 10.1-14.0%). The classical analysis method and QTL information on the heredity of low-light tolerance showed that it is controlled by several major genes and a mini-polygene. The results will facilitate the breeding of resistance to low-light stress in cucumber.

  4. Vervets revisited: A quantitative analysis of alarm call structure and context specificity

    PubMed Central

    Price, Tabitha; Wadewitz, Philip; Cheney, Dorothy; Seyfarth, Robert; Hammerschmidt, Kurt; Fischer, Julia

    2015-01-01

    The alarm calls of vervet monkeys (Chlorocebus pygerythrus) constitute the classic textbook example of semantic communication in nonhuman animals, as vervet monkeys give acoustically distinct calls to different predators and these calls elicit appropriate responses in conspecifics. They also give similar sounding calls in aggressive contexts, however. Despite the central role the vervet alarm calls have played for understanding the evolution of communication, a comprehensive, quantitative analysis of the acoustic structure of these calls was lacking. We used 2-step cluster analysis to identify objective call types and discriminant function analysis to assess context specificity. Alarm calls given in response to leopards, eagles, and snakes could be well distinguished, while the inclusion of calls given in aggressive contexts yielded some overlap, specifically between female calls given to snakes, eagles and during aggression, as well as between male vervet barks (additionally recorded in South Africa) in leopard and aggressive contexts. We suggest that both cognitive appraisal of the situation and internal state contribute to the variation in call usage and structure. While the semantic properties of vervet alarm calls bear little resemblance to human words, the existing acoustic variation, possibly together with additional contextual information, allows listeners to select appropriate responses. PMID:26286236

  5. Inheritance and quantitative trait locus analysis of low-light tolerance in cucumber (Cucumis sativus L.).

    PubMed

    Li, D D; Qin, Z W; Lian, H; Yu, G B; Sheng, Y Y; Liu, F

    2015-01-01

    The low-light tolerance index was investigated in a set of 123 F2:3 lines during the seedling stage across 2 seasons, and the heredity of low-light tolerance was assessed via different genetic analysis methods. The results of the classical analysis showed that low-light tolerance is controlled by an additive-dominant polygene, and the polygenic inheritance rate of separate generations was >30%. In addition, 5 quantitative trait loci (QTLs) exhibited a low-light tolerance index across both seasons, including 2 QTLs (Llti1.1 and Llti1.2) on the 1st linkage group (variances of 6.0 and 9.5%) and 3 QTLs (Llti2.1, Llti2.1, and Llti2.1) on the 2nd linkage group (variances of 10.1-14.0%). The classical analysis method and QTL information on the heredity of low-light tolerance showed that it is controlled by several major genes and a mini-polygene. The results will facilitate the breeding of resistance to low-light stress in cucumber. PMID:26400292

  6. Vervets revisited: A quantitative analysis of alarm call structure and context specificity.

    PubMed

    Price, Tabitha; Wadewitz, Philip; Cheney, Dorothy; Seyfarth, Robert; Hammerschmidt, Kurt; Fischer, Julia

    2015-01-01

    The alarm calls of vervet monkeys (Chlorocebus pygerythrus) constitute the classic textbook example of semantic communication in nonhuman animals, as vervet monkeys give acoustically distinct calls to different predators and these calls elicit appropriate responses in conspecifics. They also give similar sounding calls in aggressive contexts, however. Despite the central role the vervet alarm calls have played for understanding the evolution of communication, a comprehensive, quantitative analysis of the acoustic structure of these calls was lacking. We used 2-step cluster analysis to identify objective call types and discriminant function analysis to assess context specificity. Alarm calls given in response to leopards, eagles, and snakes could be well distinguished, while the inclusion of calls given in aggressive contexts yielded some overlap, specifically between female calls given to snakes, eagles and during aggression, as well as between male vervet barks (additionally recorded in South Africa) in leopard and aggressive contexts. We suggest that both cognitive appraisal of the situation and internal state contribute to the variation in call usage and structure. While the semantic properties of vervet alarm calls bear little resemblance to human words, the existing acoustic variation, possibly together with additional contextual information, allows listeners to select appropriate responses. PMID:26286236

  7. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. A qualification scheme for model use that can be linked to model validity is therefore required. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is also expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  8. Quantitative two-process analysis of avoidance conditioning in goldfish.

    PubMed

    Zhuikov, A Y; Couvillon, P A; Bitterman, M E

    1994-01-01

    The shuttlebox performance of goldfish was studied under standardized conditions in a variety of problems--with or without an avoidance contingency, a conditioned stimulus (CS)-termination contingency, and an escape contingency. The effects of CS-only, unconditioned stimulus (US)-only, and explicitly unpaired training were also examined. All the data could be simulated quantitatively with a version of O. H. Mowrer's (1947) 2-process theory expressed in 2 learning equations (1 classical, the other instrumental) and a performance equation. The good fit suggests that the theory is worth developing further with new experiments designed to challenge it.

  9. Meta-analysis of results from quantitative trait loci mapping studies on pig chromosome 4.

    PubMed

    Silva, K M; Bastiaansen, J W M; Knol, E F; Merks, J W M; Lopes, P S; Guimarães, S E F; van Arendonk, J A M

    2011-06-01

    Meta-analysis of results from multiple studies could lead to more precise quantitative trait loci (QTL) position estimates compared to the individual experiments. As the raw data from many different studies are not readily available, the use of results from published articles may be helpful. In this study, we performed a meta-analysis of QTL on chromosome 4 in pig, using data from 25 separate experiments. First, a meta-analysis was performed for individual traits: average daily gain and backfat thickness. Second, a meta-analysis was performed for the QTL of three traits affecting loin yield: loin eye area, carcass length and loin meat weight. Third, 78 QTL were selected from 20 traits that could be assigned to one of three broad categories: carcass, fatness or growth traits. For each analysis, the number of identified meta-QTL was smaller than the number of initial QTL. The reduction in the number of QTL ranged from 71% to 86% compared to the total number before the meta-analysis. In addition, the meta-analysis reduced the QTL confidence intervals by as much as 85% compared to individual QTL estimates. The reduction in the confidence interval was greater when a large number of independent QTL was included in the meta-analysis. Meta-QTL related to growth and fatness were found in the same region as the FAT1 region. Results indicate that the meta-analysis is an efficient strategy to estimate the number and refine the positions of QTL when QTL estimates are available from multiple populations and experiments. This strategy can be used to better target further studies such as the selection of candidate genes related to trait variation.

  10. Kinetic analysis of microbial respiratory response to substrate addition

    NASA Astrophysics Data System (ADS)

    Blagodatskaya, Evgenia; Blagodatsky, Sergey; Yuyukina, Tatayna; Kuzyakov, Yakov

    2010-05-01

    The heterotrophic component of CO2 emitted from soil is mainly due to the respiratory activity of soil microorganisms. Field measurements of microbial respiration can be used to estimate the soil C budget, while laboratory estimation of respiration kinetics allows the mechanisms of soil C sequestration to be elucidated. Physiological approaches based on (1) time-dependent or (2) substrate-dependent respiratory responses of the soil microorganisms decomposing organic substrates make it possible to relate the functional properties of the soil microbial community to the decomposition rates of soil organic matter. We used a novel methodology combining (i) microbial growth kinetics and (ii) enzyme affinity to the substrate to show the shift in functional properties of the soil microbial community after amendment with substrates of contrasting availability. We combined the application of 14C-labeled glucose as an easily available C source with natural isotope labeling of old and young SOM. The possible contribution of two processes, isotopic fractionation and preferential substrate utilization, to the shifts in δ13C during SOM decomposition after a C3-C4 vegetation change was evaluated. The specific growth rate (µ) of soil microorganisms was estimated by fitting the parameters of the equation v(t) = A + B * exp(µ*t) to the measured CO2 evolution rate v(t) after glucose addition, where A is the initial rate of non-growth respiration and B is the initial rate of the growing fraction of total respiration. The maximal mineralization rate (Vmax), substrate affinity of microbial enzymes (Ks) and substrate availability (Sn) were determined by Michaelis-Menten kinetics. To study the effect of plant-derived C on the δ13C signature of SOM we compared the changes in isotopic composition of different C pools in a C3 soil under grassland with a C3-C4 soil where the C4 plant Miscanthus giganteus had been grown for 12 years on a plot after grassland. The shift in δ13C caused by planting of M. giganteus
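    Fitting the growth equation quoted in the abstract, v(t) = A + B * exp(µ*t), can be sketched with nonlinear least squares on synthetic respiration data (all parameter values and the noise level are assumed for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic CO2 evolution rate after glucose addition, following the model
# in the abstract: v(t) = A + B * exp(mu * t)   (parameter values assumed)
rng = np.random.default_rng(3)
t = np.linspace(0, 20, 60)               # hours
A_true, B_true, mu_true = 2.0, 0.5, 0.25
v = A_true + B_true * np.exp(mu_true * t) + rng.normal(0, 0.2, t.size)

def model(t, A, B, mu):
    # A: non-growth respiration; B: growing fraction; mu: specific growth rate
    return A + B * np.exp(mu * t)

popt, _ = curve_fit(model, t, v, p0=(1.0, 1.0, 0.1))
A_fit, B_fit, mu_fit = popt
print(f"specific growth rate mu = {mu_fit:.3f} per hour")
```

    The recovered µ characterizes the growing fraction of the community; the Michaelis-Menten parameters (Vmax, Ks) mentioned above are obtained from a separate substrate-dependent fit in the same spirit.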

  11. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    PubMed

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 x 10(2) to 1 x 10(8) gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
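    Absolute quantification with such a system typically rests on a standard curve of threshold cycle (Ct) against log10 copy number. The record does not give the calibration data, so the numbers below are hypothetical:

```python
import numpy as np

# Hypothetical qPCR standard curve: Ct measured for known copy numbers,
# spanning the 1e2-1e8 range quoted in the abstract
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8])
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4, 13.0])

# Linear fit of Ct against log10(copies): Ct = slope*log10(N) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1 / slope) - 1   # amplification efficiency

# Quantify an unknown sample from its measured Ct
ct_unknown = 21.5
copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, "
      f"unknown ~ {copies_unknown:.3g} copies")
```

    A slope near −3.32 corresponds to 100% amplification efficiency; gene-to-gene comparisons like the LAMA3/LAMB3/LAMC2 ratios above presuppose curves of comparable efficiency.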

  12. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    Zygosaccharomyces rouxii complex comprises three yeasts clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliations. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and change in ploidy, but functional diversity in stress responses has not been explored yet. Here, we quantitatively analysed the stress response variation in seven strains of the Z. rouxii complex by modelling growth variables via model and model-free fitting methods. Based on the spline fit as most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration even in the presence of copper. μ-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The obtained phenotypic profiling contributes to provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmo-tolerant Zygosaccharomyces yeasts. PMID:24533625

  13. Quantitative analysis of radiation-induced changes in sperm morphology

    SciTech Connect

    Young, I.T.; Gledhill, B.L.; Lake, S.; Wyrobek, A.J.

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure.
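    The doubling-dose calculation reported above (the exposure at which the percentage of abnormal cells is twice the control value) can be illustrated with a linear dose-response fit. The data points below are hypothetical, chosen only so the result lands near the published ~39 rad:

```python
import numpy as np

# Hypothetical dose-response data: percent abnormal sperm vs testicular dose
dose = np.array([0, 30, 60, 90, 120])          # rad
pct_abnormal = np.array([5.0, 8.6, 12.8, 16.5, 20.4])

# Linear fit; the doubling dose is where abnormality reaches 2x the control
slope, intercept = np.polyfit(dose, pct_abnormal, 1)
doubling_dose = (2 * pct_abnormal[0] - intercept) / slope
print(f"doubling dose ~ {doubling_dose:.0f} rad")
```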

  14. [Multiple dependent variables LS-SVM regression algorithm and its application in NIR spectral quantitative analysis].

    PubMed

    An, Xin; Xu, Shuo; Zhang, Lu-Da; Su, Shi-Guang

    2009-01-01

    In the present paper, on the basis of the LS-SVM algorithm, we built a multiple dependent variables LS-SVM (MLS-SVM) regression model whose weights can be optimized, and gave the corresponding algorithm. Furthermore, we theoretically explained the relationship between MLS-SVM and LS-SVM. Sixty-four broomcorn samples were taken as experimental material, and the sample ratio of the modeling set to the predicting set was 51:13. We first selected five weight groups randomly and uniformly in the interval [0, 1], and then used the leave-one-out (LOO) rule to determine one appropriate weight group and the model parameters, including penalty and kernel parameters, according to the criterion of the minimum average relative error. A multiple dependent variables quantitative analysis model was then built from the NIR spectra, simultaneously analyzing three chemical constituents: protein, lysine and starch. Finally, the average relative errors between actual and predicted values for the three components in the predicting set were 1.65%, 6.47% and 1.37%, respectively, and the correlation coefficients were 0.9940, 0.8392 and 0.8825, respectively. For comparison, LS-SVM was also applied, giving average relative errors of 1.68%, 6.25% and 1.47% and correlation coefficients of 0.9941, 0.8310 and 0.8800, respectively. MLS-SVM is thus comparable to LS-SVM in modeling performance, and both give satisfying results. The result shows that the MLS-SVM model is capable of multi-component NIR quantitative analysis synchronously; MLS-SVM thus offers a new multiple dependent variables quantitative analysis approach for chemometrics. In addition, the weights have a certain effect on the prediction performance of the MLS-SVM model, which is consistent with intuition and was validated in this study. Therefore, it is necessary to optimize
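    The LS-SVM regression underlying the MLS-SVM model is attractive because it solves a single linear system in the dual variables rather than a quadratic program. A minimal single-output sketch on toy 1-D data (the broomcorn NIR spectra are not reproduced here; the kernel width and penalty are assumed):

```python
import numpy as np

rng = np.random.default_rng(4)

def rbf_kernel(X1, X2, sigma=0.5):
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

# Toy 1-D training data
X = np.linspace(0, 2 * np.pi, 40)
y = np.sin(X) + rng.normal(0, 0.05, X.size)

# LS-SVM dual system:  [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
gamma = 100.0          # penalty parameter (assumed)
K = rbf_kernel(X, X)
n = X.size
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
b, alpha = sol[0], sol[1:]

# Predict on a test grid and compare to the noise-free function
X_test = np.linspace(0, 2 * np.pi, 100)
y_pred = rbf_kernel(X_test, X) @ alpha + b
rmse = np.sqrt(np.mean((y_pred - np.sin(X_test)) ** 2))
print(f"test RMSE = {rmse:.4f}")
```

    The MLS-SVM extension described in the abstract couples several such outputs through a weight group; the single-output system here is the building block being weighted.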

  15. Compositional GC-FID analysis of the additives to PVC, focusing on the gaskets of lids for glass jars.

    PubMed

    Biedermann-Brem, Sandra; Biedermann, Maurus; Fiselier, Katell; Grob, Koni

    2005-12-01

    A gas chromatographic (FID) method is described which aims at the quantitative compositional analysis of the additives in plasticized PVC, particularly the plastisols used as gaskets for lids of glass jars. An extract of the PVC is analysed directly as well as after transesterification to ethyl esters. Transesterification enables the analysis of epoxidized soya bean and linseed oil (ESBO and ELO) as well as polyadipates. For most other additives, the shifts in the chromatogram resulting from transesterification is used to confirm the identifications made by direct analysis. In the gaskets of 69 lids from the European market used for packaging oily foods, a broad variety of plastisol compositions was found, many or possibly all of which do not comply with legal requirements. In 62% of these lids, ESBO was the principal plasticizer, whereas in 25% a phthalate had been used. PMID:16356892

  16. QUANTITATIVE CT ANALYSIS, AIRFLOW OBSTRUCTION AND LUNG CANCER IN THE PITTSBURGH LUNG SCREENING STUDY

    PubMed Central

    Wilson, David O; Leader, Joseph K; Fuhrman, Carl R; Reilly, John J; Sciurba, Frank C.; Weissfeld, Joel L

    2011-01-01

    Background To study the relationship between emphysema, airflow obstruction and lung cancer in a high risk population we performed quantitative analysis of screening computed tomography (CT) scans. Methods Subjects completed questionnaires, spirometry and low-dose helical chest CT. Analyses compared cases and controls according to automated quantitative analysis of lung parenchyma and airways measures. Results Our case-control study of 117 matched pairs of lung cancer cases and controls did not reveal any airway or lung parenchymal findings on quantitative analysis of screening CT scans that were associated with increased lung cancer risk. Airway measures including wall area %, lumen perimeter, lumen area and average wall HU, and parenchymal measures including lung fraction < −910 Hounsfield Units (HU), were not statistically different between cases and controls. Conclusions The relationship between visual assessment of emphysema and increased lung cancer risk could not be verified by quantitative analysis of low-dose screening CT scans in a high risk tobacco exposed population. PMID:21610523
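    The parenchymal measure named above, lung fraction below −910 HU, is simply the proportion of segmented lung voxels under that attenuation threshold. A sketch on synthetic Hounsfield values (the distribution parameters are invented, not from the study):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic lung attenuation values in Hounsfield Units (illustrative only):
# mostly normal parenchyma near -860 HU plus a small emphysematous component
normal = rng.normal(-860, 30, 9000)
emphysema = rng.normal(-950, 15, 1000)
lung_voxels = np.concatenate([normal, emphysema])

# Quantitative emphysema index: fraction of lung voxels below -910 HU
lf910 = np.mean(lung_voxels < -910)
print(f"lung fraction < -910 HU: {lf910:.1%}")
```

    In the study this index, computed over the automatically segmented lung, did not differ between lung cancer cases and matched controls.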

  17. Quantitative iTRAQ secretome analysis of Aspergillus niger reveals novel hydrolytic enzymes.

    PubMed

    Adav, Sunil S; Li, An A; Manavalan, Arulmani; Punt, Peter; Sze, Siu Kwan

    2010-08-01

    The natural lifestyle of Aspergillus niger makes it an effective secretor of hydrolytic proteins, a property that becomes critical when this species is exploited as a host for the commercial secretion of heterologous proteins. The protein secretion profiles of A. niger and its mutant at different pH were explored using an iTRAQ-based quantitative proteomics approach coupled with liquid chromatography-tandem mass spectrometry (LC-MS/MS). This study characterized 102 highly confident unique proteins in the secretome with zero false discovery rate based on a decoy strategy. The iTRAQ technique identified and relatively quantified many hydrolyzing enzymes such as cellulases, hemicellulases, glycoside hydrolases, proteases, peroxidases, and protein-translocating transporter proteins during fermentation. These enzymes have potential application in lignocellulosic biomass hydrolysis for biofuel production; for example, the cellulolytic and hemicellulolytic enzymes glucan 1,4-alpha-glucosidase, alpha-glucosidase C, endoglucanase, alpha-L-arabinofuranosidase, beta-mannosidase and glycosyl hydrolase, proteases such as tripeptidyl-peptidase and aspergillopepsin, and other enzymes including cytochrome c oxidase and glucose oxidase were highly expressed in the secretomes of A. niger and its mutant. In addition, specific enzyme production can be stimulated by controlling the pH of the culture medium. Our results show a comprehensive unique secretory protein profile of A. niger, its regulation at different pH, and the potential of iTRAQ-based quantitative proteomics for microbial secretome analysis.

  18. Quantitative laser-induced breakdown spectroscopy analysis of calcified tissue samples

    NASA Astrophysics Data System (ADS)

    Samek, O.; Beddows, D. C. S.; Telle, H. H.; Kaiser, J.; Liška, M.; Cáceres, J. O.; Gonzáles Ureña, A.

    2001-06-01

    We report on the application of laser-induced breakdown spectroscopy (LIBS) to the analysis of important minerals and the accumulation of potentially toxic elements in calcified tissue, to trace e.g. the influence of environmental exposure and other medical or biological factors. This theme was exemplified by quantitative detection and mapping of Al, Pb and Sr in representative samples, including teeth (first teeth of infants, second teeth of children and teeth of adults) and bones (tibia and femur). In addition to identifying and quantifying major and trace elements in the tissues, one- and two-dimensional profiles and maps were generated. Such maps (a) provide time/concentration relations, (b) make it possible to follow mineralisation of the hydroxyapatite matrix and the migration of elements within it and (c) enable identification of disease states, such as caries in teeth. To obtain quantitative calibration, reference samples in the form of pressed pellets of calcified tissue-equivalent material (the majority compound of the pellets being CaCO3) were used, whose physical properties closely resemble hydroxyapatite. Compounds of Al, Sr and Pb were added to the pellets at atomic concentrations in the range 100-10 000 ppm relative to the Ca content of the matrix. Analytical results for the trace elements under investigation, based on this calibration against artificial samples, agree with literature values and with our atomic absorption spectroscopy (AAS) cross-validation measurements.

  19. Quantitative Spectral Morphology Analysis of Unusually Red and Blue L Dwarfs

    NASA Astrophysics Data System (ADS)

    Camnasio, Sara; Alam, Munazza Khalida; Rice, Emily L.; Cruz, Kelle L.; Faherty, Jacqueline K.; Mace, Gregory N.; Martin, Emily; Logsdon, Sarah E.; McLean, Ian S.; Brown Dwarfs in New York City (BDNYC)

    2016-01-01

    In an effort to constrain the properties of photometric color outliers, we present a quantitative spectral morphology analysis of medium-resolution NIRSPEC (R~2,000), SpeX cross-dispersed (R~2,000), Palomar TripleSpec (R~2,600), and Magellan FIRE (R~6,000) J-band spectra for a sample of unusually red and blue L dwarfs. Some red L dwarfs are low surface gravity, young objects whose spectra present weak Na I doublets and FeH absorption bands but strong VO features (Cruz et al. 2009). Some blue L dwarfs are subdwarfs with low-metallicity spectral features such as greater H2 absorption, stronger metal hydride bands, and enhanced TiO absorption (Burgasser et al. 2008c). We fit 3rd-order polynomials to the pseudo-continuum in order to provide a quantitative comparison of spectral morphology with other peculiar L dwarfs, field standards, young L dwarfs, and L subdwarfs. The results indicate that the coefficients of the fit correlate with spectral type but are independent of color. This newly found trend provides a parameter that can be utilized as an additional tool for characterizing quantifiable differences in the spectra of brown dwarfs. Furthermore, this method can be applied to studying the atmospheric properties of exoplanets, given their similarities with brown dwarfs in mass and photospheric properties.
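    The pseudo-continuum fitting described above can be sketched with NumPy's polynomial fit on a synthetic J-band-like spectrum. The wavelength grid, continuum shape and absorption feature below are illustrative assumptions, not the sample data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic J-band-like spectrum: a smooth cubic pseudo-continuum plus a
# narrow absorption feature and noise (all values invented for illustration)
wavelength = np.linspace(1.15, 1.35, 500)   # microns
x = (wavelength - wavelength.mean()) / (np.ptp(wavelength) / 2)  # scale to [-1, 1]
continuum_true = 1.0 + 0.3 * x - 0.15 * x**2 + 0.05 * x**3
absorption = 0.2 * np.exp(-0.5 * ((wavelength - 1.25) / 0.002) ** 2)
flux = continuum_true - absorption + rng.normal(0, 0.005, wavelength.size)

# Fit a 3rd-order polynomial to the pseudo-continuum; the coefficients
# serve as quantitative morphology descriptors for comparing spectra
coeffs = np.polyfit(x, flux, 3)   # highest-degree coefficient first
print("cubic, quadratic, linear, constant terms:", np.round(coeffs, 3))
```

    Comparing these coefficients across objects is what allows trends with spectral type to be separated from color peculiarity, as reported above; rescaling the wavelength axis to [-1, 1] keeps the coefficients comparable between instruments with different wavelength coverage.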

  20. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment, driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in the chronic widening of economic and political gaps between the haves and the have-nots, with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with an increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots respectively. This paper explores these challenges quantitatively, as global sustainable development (SD), in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD, and many qualitative definitions have been offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations, similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus are higher on their hierarchy of needs, than nations such as Nigeria, Vietnam, Mexico and other

  1. Quantitative Selection Analysis of Bacteriophage φCbK Susceptibility in Caulobacter crescentus.

    PubMed

    Christen, Matthias; Beusch, Christian; Bösch, Yvonne; Cerletti, Dario; Flores-Tinoco, Carlos Eduardo; Del Medico, Luca; Tschan, Flavia; Christen, Beat

    2016-01-29

    Classical molecular genetics uses stringent selective conditions to identify mutants with distinct phenotypic responses. Mutations giving rise to less pronounced phenotypes are often missed. However, gaining systems-level insights into complex genetic interaction networks requires genome-wide assignment of quantitative phenotypic traits. In this paper, we present a quantitative selection approach coupled with transposon sequencing (QS-TnSeq) to globally identify the cellular components that orchestrate susceptibility of the cell cycle model bacterium Caulobacter crescentus toward bacteriophage φCbK infection. We found that 135 genes, representing 3.30% of the Caulobacter genome, exhibit significant accumulation of transposon insertions upon φCbK selection. More than 85% thereof are new factors not previously associated with phage φCbK susceptibility. Using hierarchical clustering of dose-dependent TnSeq datasets, we grouped these genes into functional modules that correlate with different stages of the φCbK infection process. We assign φCbK susceptibility to eight new genes that represent novel components of the pilus secretion machinery. Further, we demonstrate that, of 86 motility genes, only seven genes encoding structural and regulatory components of the flagellar hook increase phage resistance when disrupted by transposons, suggesting a link between flagellar hook assembly and pili biogenesis. In addition, we observe high recovery of Tn5 insertions within regulatory sequences of the genes encoding the essential NADH:ubiquinone oxidoreductase complex, indicating that an intact proton motive force is crucial for effective phage propagation. In sum, QS-TnSeq is broadly applicable to quantitative, genome-wide systems-genetics analysis of complex phenotypic traits. PMID:26593064
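    The grouping of genes into functional modules by hierarchical clustering of dose-dependent profiles can be illustrated with SciPy; the insertion-count profiles below are hypothetical stand-ins, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical dose-dependent TnSeq profiles: rows = genes, columns =
# normalized insertion counts at increasing phage doses.
profiles = np.array([
    [1.0, 0.9, 0.8, 0.7],   # gene A: mild depletion with dose
    [1.0, 0.8, 0.7, 0.6],   # gene B: similar response to A
    [1.0, 3.0, 6.0, 9.0],   # gene C: strong dose-dependent enrichment
    [1.0, 2.8, 5.5, 8.5],   # gene D: similar response to C
])

# Average-linkage clustering on correlation distance groups genes whose
# dose responses have the same shape into putative functional modules.
Z = linkage(profiles, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
```

    Correlation distance is a natural choice here because it groups genes by the shape of the dose response rather than its absolute magnitude.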

  3. Modeling of X-Ray Fluorescence for Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Zarkadas, Charalambos

    2010-03-01

    Quantitative XRF algorithms involve mathematical procedures intended to solve a set of equations expressing the total fluorescence intensity of selected X-ray element lines emitted after sample irradiation by a photon source. These equations [1] were derived under the assumptions of a parallel exciting beam and a perfectly flat and uniform sample, and have since been extended to describe composite cases such as multilayered samples and samples exhibiting particle size effects. State-of-the-art algorithms include most of the physical processes that can contribute to the measured fluorescence signal and make use of evaluated databases for the fundamental parameters included in the calculations. The accuracy of the results obtained depends to a great extent on the completeness of the model used to describe X-ray fluorescence intensities and on the compliance of the actual experimental conditions with the basic assumptions under which the mathematical formulas were derived.

  4. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    PubMed Central

    Peng, Xinsheng; Hu, Min; Ling, Yahao; Tian, Yuan; Zhou, Yanxing; Zhou, Yanfang

    2014-01-01

    A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystalline nanoparticles. The chromatographic method uses an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50 : 50 : 0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/vis detector; the detection wavelength was 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y = 10706x − 2959 (R2 = 1.0). The average recovery is 101.7%; RSD = 2.22% (n = 9). This method provides a simple and accurate strategy for determining matrine in liquid crystalline nanoparticles. PMID:24834359
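    Using the reported regression line, converting a measured peak area back to a matrine concentration is a one-line inversion; the helper names below are ours, and the example peak area is constructed from the line itself for illustration:

```python
# The reported calibration relates peak area (y) to matrine
# concentration x in ug/mL: y = 10706*x - 2959 (R^2 = 1.0).
SLOPE, INTERCEPT = 10706.0, -2959.0

def matrine_concentration(peak_area):
    """Invert the reported calibration to estimate concentration (ug/mL)."""
    return (peak_area - INTERCEPT) / SLOPE

def recovery_percent(measured, spiked):
    """Recovery as measured/spiked concentration, in percent."""
    return 100.0 * measured / spiked

# A peak area of 1,067,641 (= 10706*100 - 2959) corresponds to 100 ug/mL.
c = matrine_concentration(1067641.0)
```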

  5. Cross-bridge model of muscle contraction. Quantitative analysis.

    PubMed Central

    Eisenberg, E; Hill, T L; Chen, Y

    1980-01-01

    We recently presented, in a qualitative manner, a cross-bridge model of muscle contraction which was based on a biochemical kinetic cycle for the actomyosin ATPase activity. This cross-bridge model consisted of two cross-bridge states detached from actin and two cross-bridge states attached to actin. In the present paper, we attempt to fit this model quantitatively to both biochemical and physiological data. We find that the resulting complete cross-bridge model is able to account reasonably well for both the isometric transient data observed when a muscle is subjected to a sudden change in length and for the relationship between the velocity of muscle contraction in vivo and the actomyosin ATPase activity in vitro. This model also illustrates the interrelationship between biochemical and physiological data necessary for the development of a complete cross-bridge model of muscle contraction. PMID:6455168

  6. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated to some preoperative imagery. The operator-induced uncertainty in localizing patient features, the User Localization Error (ULE), is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system. Methods Active optical navigation was done in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square, RMS; probe tip calibration was 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm2, 0.32 mm2, and 0.42 mm2. The ULE was calculated from predicted application accuracy with isotropic and anisotropic models and from experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data of application accuracy could be tested against prediction models with iso- and anisotropic noise models and revealed some discrepancies. This could potentially be because navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials

  7. Quantitative and sensitive analysis of CN molecules using laser induced low pressure He plasma

    NASA Astrophysics Data System (ADS)

    Pardede, Marincan; Hedwig, Rinda; Abdulmadjid, Syahrun Nur; Lahna, Kurnia; Idris, Nasrullah; Jobiliong, Eric; Suyanto, Hery; Marpaung, Alion Mangasi; Suliyanti, Maria Margaretha; Ramli, Muliadi; Tjia, May On; Lie, Tjung Jie; Lie, Zener Sukra; Kurniawan, Davy Putra; Kurniawan, Koo Hendrik; Kagawa, Kiichiro

    2015-03-01

    We report the results of an experimental study of CN 388.3 nm and C I 247.8 nm emission characteristics using 40 mJ laser irradiation with He and N2 ambient gases. The results obtained with N2 ambient gas show an undesirable interference effect between the native CN emission and the emission of CN molecules arising from the recombination of native C ablated from the sample with N dissociated from the ambient gas. This problem is overcome by the use of He ambient gas at a low pressure of 2 kPa, which also offers the additional advantages of cleaner and stronger emission lines. Applying this favorable experimental condition to emission spectrochemical measurement of milk samples having various protein concentrations is shown to yield a close-to-linear calibration curve with a near-zero extrapolated intercept. Additionally, a low detection limit of 5 μg/g is found in this experiment, making the technique potentially applicable to quantitative and sensitive CN analysis. The viability of laser-induced breakdown spectroscopy with low-pressure He gas is also demonstrated by its application to spectrochemical analysis of fossil samples. Furthermore, with the use of CO2 ambient gas at 600 Pa, mimicking the Mars atmosphere, the technique also shows promise for exploration on Mars.

  8. Quantitative and sensitive analysis of CN molecules using laser induced low pressure He plasma

    SciTech Connect

    Pardede, Marincan; Hedwig, Rinda; Abdulmadjid, Syahrun Nur; Lahna, Kurnia; Idris, Nasrullah; Ramli, Muliadi; Jobiliong, Eric; Suyanto, Hery; Marpaung, Alion Mangasi; Suliyanti, Maria Margaretha; Tjia, May On

    2015-03-21

    We report the results of an experimental study of CN 388.3 nm and C I 247.8 nm emission characteristics using 40 mJ laser irradiation with He and N2 ambient gases. The results obtained with N2 ambient gas show an undesirable interference effect between the native CN emission and the emission of CN molecules arising from the recombination of native C ablated from the sample with N dissociated from the ambient gas. This problem is overcome by the use of He ambient gas at a low pressure of 2 kPa, which also offers the additional advantages of cleaner and stronger emission lines. Applying this favorable experimental condition to emission spectrochemical measurement of milk samples having various protein concentrations is shown to yield a close-to-linear calibration curve with a near-zero extrapolated intercept. Additionally, a low detection limit of 5 μg/g is found in this experiment, making the technique potentially applicable to quantitative and sensitive CN analysis. The viability of laser-induced breakdown spectroscopy with low-pressure He gas is also demonstrated by its application to spectrochemical analysis of fossil samples. Furthermore, with the use of CO2 ambient gas at 600 Pa, mimicking the Mars atmosphere, the technique also shows promise for exploration on Mars.

  9. [Research progress of quantitative analysis for respiratory sinus arrhythmia].

    PubMed

    Sun, Congcong; Zhang, Zhengbo; Wang, Buqing; Liu, Hongyun; Ang, Qing; Wang, Weidong

    2011-12-01

    Respiratory sinus arrhythmia (RSA) refers to fluctuations of heart rate associated with breathing. It has increasingly been used as a noninvasive index of cardiac vagal tone in recent psychophysiological research. Its analysis is often influenced or distorted by respiratory parameters, posture, action, etc. This paper reviews five methods of quantification: the root mean square of successive differences (RMSSD), peak-valley RSA (pvRSA), cosinor fitting, spectral analysis, and joint timing-frequency analysis (JTFA). Paced breathing, analysis of covariance, the residual method, and msRSA per liter of tidal volume are also discussed as adjustment strategies for the measurement and analysis of RSA. Finally, prospective solutions to open problems in RSA research are given.
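    Of the five quantification methods reviewed, RMSSD is the simplest to state in code; a minimal sketch, with an invented RR-interval series:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (ms),
    a common time-domain index of RSA / cardiac vagal tone."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Toy RR series (ms) with a respiration-linked oscillation.
rr = [800, 820, 840, 820, 800, 780, 800, 820]
value = rmssd(rr)  # successive differences are all +/-20 ms, so RMSSD = 20
```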

  10. Quantitative analysis of numerical solvers for oscillatory biomolecular system models

    PubMed Central

    Quo, Chang F; Wang, May D

    2008-01-01

    Background This article provides guidelines for selecting optimal numerical solvers for biomolecular system models. Because various parameters of the same system could have drastically different ranges from 10-15 to 1010, the ODEs can be stiff and ill-conditioned, resulting in non-unique, non-existing, or non-reproducible modeling solutions. Previous studies have not examined in depth how to best select numerical solvers for biomolecular system models, which makes it difficult to experimentally validate the modeling results. To address this problem, we have chosen one of the well-known stiff initial value problems with limit cycle behavior as a test-bed system model. Solving this model, we have illustrated that different answers may result from different numerical solvers. We use MATLAB numerical solvers because they are optimized and widely used by the modeling community. We have also conducted a systematic study of numerical solver performances by using qualitative and quantitative measures such as convergence, accuracy, and computational cost (i.e. in terms of function evaluation, partial derivative, LU decomposition, and "take-off" points). The results show that the modeling solutions can be drastically different using different numerical solvers. Thus, it is important to intelligently select numerical solvers when solving biomolecular system models. Results The classic Belousov-Zhabotinskii (BZ) reaction is described by the Oregonator model and is used as a case study. We report two guidelines in selecting optimal numerical solver(s) for stiff, complex oscillatory systems: (i) for problems with unknown parameters, ode45 is the optimal choice regardless of the relative error tolerance; (ii) for known stiff problems, both ode113 and ode15s are good choices under strict relative tolerance conditions. 
Conclusions For any given biomolecular model, by building a library of numerical solvers with quantitative performance assessment metric, we show that it is possible
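    The MATLAB-centric comparison above can be mirrored in Python: the sketch below integrates the Oregonator in its classic Field-Noyes parameterization with SciPy's BDF method (a rough analogue of ode15s); it is an illustrative example, not the paper's benchmark code, and the tolerances are our choices:

```python
import numpy as np
from scipy.integrate import solve_ivp

def oregonator(t, y):
    """Field-Noyes Oregonator model of the Belousov-Zhabotinskii reaction."""
    y1, y2, y3 = y
    return [
        77.27 * (y2 + y1 * (1.0 - 8.375e-6 * y1 - y2)),
        (y3 - (1.0 + y1) * y2) / 77.27,
        0.161 * (y1 - y3),
    ]

# A stiff solver (BDF) traverses the limit cycle efficiently; an explicit
# method such as RK45 (the counterpart of ode45) needs vastly more steps
# on this problem because of its stiffness.
sol = solve_ivp(oregonator, (0.0, 360.0), [1.0, 2.0, 3.0],
                method="BDF", rtol=1e-6, atol=1e-9)
```

    Re-running the same problem with `method="RK45"` and comparing step counts and wall time is exactly the kind of solver comparison the article advocates.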

  11. Quantitative analysis of agricultural land use change in China

    NASA Astrophysics Data System (ADS)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and the utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect potential sowing ability. The impacting mechanism, land use status and its surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land was greatest in amount in central China, followed by the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and the implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. Still, large remaining potential space is available, but the future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  12. Quantitative analysis of chromosome condensation in fission yeast.

    PubMed

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota; Haering, Christian H

    2013-03-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and deduce defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote.
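    The core measurement in such an assay, the time-resolved 3D distance between two marked loci, reduces to a per-frame Euclidean norm; the trajectories below are invented for illustration:

```python
import numpy as np

# Hypothetical 3D trajectories of two marked chromosome loci over three
# time points (columns: x, y, z, in microns).
locus_a = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.2, 0.1, 0.0]])
locus_b = np.array([[1.0, 0.0, 0.0], [0.9, 0.0, 0.0], [0.6, 0.1, 0.0]])

# Per-time-point Euclidean distance between the loci; its decrease over
# time is the raw signal from which condensation parameters are derived.
distances = np.linalg.norm(locus_a - locus_b, axis=1)
compaction = distances[0] - distances[-1]  # net shortening over the movie
```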

  13. Quantitative analysis of TALE-DNA interactions suggests polarity effects.

    PubMed

    Meckler, Joshua F; Bhakta, Mital S; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J; Baldwin, Enoch P

    2013-04-01

    Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ≈ NN > NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10³-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches 'standard' RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5'-T. Another surprising observation was that base mismatches at the 5' end of the target site had more disruptive effects on affinity than those at the 3' end, particularly in designed TALEs. These results provide evidence that TALE-DNA recognition exhibits a hitherto undescribed polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones.

  14. Quantitative analysis of TALE–DNA interactions suggests polarity effects

    PubMed Central

    Meckler, Joshua F.; Bhakta, Mital S.; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H.; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H.; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J.; Baldwin, Enoch P.

    2013-01-01

    Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ∼ NN ≫ NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10³-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches ‘standard’ RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5′-T. Another surprising observation was that base mismatches at the 5′ end of the target site had more disruptive effects on affinity than those at the 3′ end, particularly in designed TALEs. These results provide evidence that TALE–DNA recognition exhibits a hitherto undescribed polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones. PMID:23408851

  15. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  16. On the in vivo action of erythropoietin: a quantitative analysis.

    PubMed

    Papayannopoulou, T; Finch, C A

    1972-05-01

    The composite response of the erythron to exogenous erythropoietin has been studied in normal, splenectomized, and polycythemic mice. After stimulation the normal animal doubled its marrow nucleated red cells by the 3rd day with little further change by the 5th. Nucleated red cells within the spleen began to increase sharply on the 2nd day and, by the 5th, exceeded those in the marrow. The total nucleated erythroid response represented a fourfold increase. Reticulocytes lagged behind the expansion of the nucleated red cell mass, but by the 5th day the original ratio was re-established. Hemoglobin synthesis was increased, but the ratio of hemoglobin synthesized in nucleated red cells and reticulocytes was basically unchanged. Early displacement of marrow reticulocytes into circulation and the production of a larger red cell also occurred. No evidence of a change in the number of erythroid mitoses was found; only a slight decrease in the average cell cycle time was demonstrated. Thus, whereas erythropoietin stimulation induced several changes in erythropoiesis, the increased number of cells entering the maturing pool appeared to be of greatest quantitative significance. Splenectomy reduced the proliferative response of the erythron over 5 days of stimulation to three-fourths that found in the normal animal. This difference, also reflected in a proportionately lower reticulocyte response and increment in circulating red cell mass, suggests that erythropoiesis within the mouse marrow is spatially or otherwise restricted and that the spleen provided a supplemental area of erythroid expansion.

  17. On the in vivo action of erythropoietin: a quantitative analysis

    PubMed Central

    Papayannopoulou, Thalia; Finch, Clement A.

    1972-01-01

    The composite response of the erythron to exogenous erythropoietin has been studied in normal, splenectomized, and polycythemic mice. After stimulation the normal animal doubled its marrow nucleated red cells by the 3rd day with little further change by the 5th. Nucleated red cells within the spleen began to increase sharply on the 2nd day and, by the 5th, exceeded those in the marrow. The total nucleated erythroid response represented a fourfold increase. Reticulocytes lagged behind the expansion of the nucleated red cell mass, but by the 5th day the original ratio was re-established. Hemoglobin synthesis was increased, but the ratio of hemoglobin synthesized in nucleated red cells and reticulocytes was basically unchanged. Early displacement of marrow reticulocytes into circulation and the production of a larger red cell also occurred. No evidence of a change in the number of erythroid mitoses was found; only a slight decrease in the average cell cycle time was demonstrated. Thus, whereas erythropoietin stimulation induced several changes in erythropoiesis, the increased number of cells entering the maturing pool appeared to be of greatest quantitative significance. Splenectomy reduced the proliferative response of the erythron over 5 days of stimulation to three-fourths that found in the normal animal. This difference, also reflected in a proportionately lower reticulocyte response and increment in circulating red cell mass, suggests that erythropoiesis within the mouse marrow is spatially or otherwise restricted and that the spleen provided a supplemental area of erythroid expansion. PMID:5020431

  18. Quantitative Analysis of CME Deflections in the Corona

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Shen, Chenglong; Wang, Yuming; Ye, Pinzhong; Liu, Jiajia; Wang, Shui; Zhao, Xuepu

    2011-07-01

    In this paper, ten CME events viewed by the STEREO twin spacecraft are analyzed to study the deflections of CMEs during their propagation in the corona. Based on the three-dimensional information of the CMEs derived by the graduated cylindrical shell (GCS) model (Thernisien, Howard, and Vourlidas in Astrophys. J. 652, 1305, 2006), it is found that the propagation directions of eight CMEs had changed. By applying the theoretical method proposed by Shen et al. ( Solar Phys. 269, 389, 2011) to all the CMEs, we found that the deflections are consistent, in strength and direction, with the gradient of the magnetic energy density. There is a positive correlation between the deflection rate and the strength of the magnetic energy density gradient and a weak anti-correlation between the deflection rate and the CME speed. Our results suggest that the deflections of CMEs are mainly controlled by the background magnetic field and can be quantitatively described by the magnetic energy density gradient (MEDG) model.

  19. Quantitative proteomic analysis of amphotericin B resistance in Leishmania infantum

    PubMed Central

    Brotherton, Marie-Christine; Bourassa, Sylvie; Légaré, Danielle; Poirier, Guy G.; Droit, Arnaud; Ouellette, Marc

    2014-01-01

    Amphotericin B (AmB) in its liposomal form is now considered as either first- or second-line treatment against Leishmania infections in different parts of the world. Few cases of AmB resistance have been reported, and resistance mechanisms toward AmB are still poorly understood. This paper reports a large-scale comparative proteomic study in the context of AmB resistance. Quantitative proteomics using stable isotope labeling of amino acids in cell culture (SILAC) was used to better characterize the cytoplasmic and membrane-enriched (ME) proteomes of the in vitro-generated Leishmania infantum AmB-resistant mutant AmB1000.1. In total, 97 individual proteins were found to be differentially expressed between the mutant and its parental sensitive strain (WT). More than half of these proteins were either metabolic enzymes or involved in transcription or translation processes. Key energetic pathways such as glycolysis and the TCA cycle were up-regulated in the mutant. Interestingly, many proteins involved in reactive oxygen species (ROS) scavenging and heat-shock proteins were also up-regulated in the resistant mutant. This work provides a basis for further investigations to understand the roles of the proteins differentially expressed in relation to AmB resistance. PMID:25057462

  20. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  1. Quantitative Proteome Analysis of Leishmania donovani under Spermidine Starvation

    PubMed Central

    Singh, Shalini; Dubey, Vikash Kumar

    2016-01-01

    We have earlier reported antileishmanial activity of hypericin through spermidine starvation. In the current report, we used a label-free proteome quantitation approach to identify differentially modulated proteins after hypericin treatment. A total of 141 proteins were found to be differentially regulated (ANOVA P value less than 0.05) in hypericin-treated Leishmania promastigotes. Differentially modulated proteins were broadly classified under nine major categories. An increase in ribosomal protein S7 suggests repression of translation. Inhibition of proteins related to the ubiquitin-proteasome system, RNA-binding proteins and translation initiation factors also suggests altered translation. We also observed increased expression of Hsp 90, Hsp 83-1 and stress-inducible protein 1, together with a significantly decreased level of cyclophilin. These stress-related proteins could represent a cellular response of the parasite to hypericin-induced stress. Defective metabolism, biosynthesis and replication of nucleic acids, flagellar movement and signalling of the parasite were also observed, as indicated by altered expression of proteins involved in these pathways. The data were analyzed rigorously to gain further insight into hypericin-induced parasite death. PMID:27123864
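    The per-protein significance filter described above (ANOVA P < 0.05) can be sketched as follows; the triplicate intensities are invented, and scipy's one-way ANOVA stands in for whatever test the quantitation software actually applies:

```python
import numpy as np
from scipy import stats

# Hypothetical label-free intensities for a single protein measured in
# triplicate under control vs. hypericin treatment (arbitrary units).
control = np.array([1.00, 1.05, 0.95])
treated = np.array([1.80, 1.90, 1.70])

# One-way ANOVA across the two conditions; proteins with P < 0.05
# would be called differentially modulated, as in the abstract.
f_stat, p_value = stats.f_oneway(control, treated)
differentially_modulated = p_value < 0.05
print(f"F = {f_stat:.1f}, P = {p_value:.4f}, "
      f"differentially modulated: {differentially_modulated}")
```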

  2. Analysis of quantitative trait loci for behavioral laterality in mice.

    PubMed Central

    Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Améziane; Migliore-Samour, Danièle

    2003-01-01

    Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of left or right paws) and degree of laterality (absolute difference between the use of left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and a second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4 and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540

  4. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
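    As a rough sketch of subject-dependent task-pair selection, the snippet below picks the best of three imagery task pairs per subject and compares it with the best pair fixed across subjects; all accuracies, task names, and the 0.70 "control" threshold are invented for illustration:

```python
import numpy as np

# Invented cross-validated accuracies for three imagery task pairs
# (columns) across four subjects (rows); task names are hypothetical.
task_pairs = ["hands-vs-feet", "hands-vs-tongue", "feet-vs-tongue"]
acc = np.array([
    [0.62, 0.71, 0.58],
    [0.80, 0.64, 0.77],
    [0.55, 0.59, 0.73],
    [0.68, 0.66, 0.61],
])

# Best single pair fixed across subjects vs. per-subject selection.
fixed_idx = int(np.argmax(acc.mean(axis=0)))
best_fixed = task_pairs[fixed_idx]
per_subject = [task_pairs[i] for i in np.argmax(acc, axis=1)]

# Fraction of subjects above an assumed 0.70 "control" threshold.
fixed_ok = float(np.mean(acc[:, fixed_idx] >= 0.70))
select_ok = float(np.mean(acc.max(axis=1) >= 0.70))
print(f"fixed pair: {best_fixed}, controllable: {fixed_ok:.0%}")
print(f"per-subject pairs: {per_subject}, controllable: {select_ok:.0%}")
```

    With these toy numbers, per-subject selection raises the fraction of "controllable" subjects, mirroring the gain the abstract reports.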

  5. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255–65, 2008) are discussed. PMID:23121132
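    The linked-equilibria arithmetic above (transfer constant times the β-cyclodextrin–cVA dissociation constant) reduces to a one-line calculation; the two input constants below are assumed placeholders chosen only to reproduce the reported ~100 nM, not the fitted values:

```python
# Linked-equilibria estimate of the LUSH-cVA dissociation constant: the
# cyclodextrin-to-LUSH transfer constant multiplied by the
# beta-cyclodextrin / cVA dissociation constant.  Both inputs are
# assumed placeholders, not the values fitted in the paper.
K_transfer = 0.05      # dimensionless transfer equilibrium constant (assumed)
Kd_cd_cva = 2.0e-6     # beta-CD / cVA dissociation constant, M (assumed)

Kd_lush_cva = K_transfer * Kd_cd_cva
print(f"Kd(LUSH-cVA) ~ {Kd_lush_cva * 1e9:.0f} nM")
```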

  6. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

    Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining both theoretical and numerical approaches we study here single and several viruses and DNA particles trafficking in the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations of the delivery of plasmid or viral particles to a nuclear pore imposed by various types of direct and indirect hydrolysis activity inside the cytoplasm. By replacing the switching dynamics by a single steady state stochastic description, we obtain estimates for the probability and the mean time for the first one of many particles to go from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and estimate quantitatively the success of nuclear delivery.
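    A minimal 1D Monte Carlo sketch of the switching dynamics described above; the diffusion coefficient, transport speed, degradation rate, and geometry are all assumed values, and the simulation is far simpler than the paper's analytical model:

```python
import random

random.seed(1)

# A particle alternates between diffusive and active-transport epochs
# while moving toward the nuclear pore at x = 0, and can be degraded
# on the way.  All parameter values are assumed for illustration.
L = 10.0        # starting distance from the pore, um
D = 0.5         # diffusion coefficient, um^2/s
v = 1.0         # active transport speed toward the nucleus, um/s
p_active = 0.3  # chance of being in an active epoch in a time step
k = 0.001       # degradation rate, 1/s
dt = 0.1        # time step, s

def one_trajectory():
    """Return arrival time at the pore, or None if degraded first."""
    x, t = L, 0.0
    while x > 0:
        if random.random() < k * dt:
            return None                                   # degraded
        if random.random() < p_active:
            x -= v * dt                                   # active epoch
        else:
            x += random.gauss(0.0, (2 * D * dt) ** 0.5)   # diffusive epoch
        t += dt
    return t

times = [one_trajectory() for _ in range(2000)]
arrived = [x for x in times if x is not None]
p_success = len(arrived) / len(times)
mean_time = sum(arrived) / len(arrived)
print(f"arrival probability ~ {p_success:.2f}, "
      f"mean arrival time ~ {mean_time:.0f} s")
```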

  8. Depression in Parkinson's disease: a quantitative and qualitative analysis.

    PubMed Central

    Gotham, A M; Brown, R G; Marsden, C D

    1986-01-01

    Depression is a common feature of Parkinson's disease, a fact of both clinical and theoretical significance. Assessment of depression in Parkinson's disease is complicated by overlapping symptomatology in the two conditions, making global assessments based on observer or self-ratings of doubtful validity. The present study aimed to provide both a quantitative and qualitative description of the nature of the depressive changes found in Parkinson's disease as compared with normal elderly subjects and arthritis patients. As with previous studies, the patients with Parkinson's disease scored significantly higher than normal controls on various self-ratings of depression and anxiety but, in this study, did not differ from those with arthritis. Qualitatively, both the Parkinson's disease and the arthritis groups had depression characterised by pessimism and hopelessness, decreased motivation and drive, and increased concern with health. In contrast, the negative affective feelings of guilt, self-blame and worthlessness were absent in both patient groups. This pattern of depression was significantly associated with severity of illness and functional disability. However, these factors account for only a modest proportion of the variability in test scores. Probable unexplored factors are individual differences in coping style and availability of support. PMID:3701347

  9. Quantitative proteomic analysis of the Salmonella-lettuce interaction.

    PubMed

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-11-01

    Human pathogens can internalize into food crops through root and surface uptake and persist inside crop plants. The goal of the study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella internalizes into lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria Broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce, which normally respond to plant pathogens.

  10. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  11. An Introduction to Error Analysis for Quantitative Chemistry

    ERIC Educational Resources Information Center

    Neman, R. L.

    1972-01-01

    Describes two formulas for calculating errors due to instrument limitations which are usually found in gravimetric volumetric analysis and indicates their possible applications to other fields of science. (CC)

  12. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. Quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for clinical and translational science and attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results.

  13. Dependency, democracy, and infant mortality: a quantitative, cross-national analysis of less developed countries.

    PubMed

    Shandra, John M; Nobles, Jenna; London, Bruce; Williamson, John B

    2004-07-01

    This study presents quantitative, sociological models designed to account for cross-national variation in infant mortality rates. We consider variables linked to four different theoretical perspectives: the economic modernization, social modernization, political modernization, and dependency perspectives. The study is based on a panel regression analysis of a sample of 59 developing countries. Our preliminary analysis based on additive models replicates prior studies to the extent that we find that indicators linked to economic and social modernization have beneficial effects on infant mortality. We also find support for hypotheses derived from the dependency perspective suggesting that multinational corporate penetration fosters higher levels of infant mortality. Subsequent analysis incorporating interaction effects suggests that the level of political democracy conditions the effects of dependency relationships based upon exports, investments from multinational corporations, and international lending institutions. Transnational economic linkages associated with exports, multinational corporations, and international lending institutions adversely affect infant mortality more strongly at lower levels of democracy than at higher levels of democracy: intranational, political factors interact with the international, economic forces to affect infant mortality. We conclude with some brief policy recommendations and suggestions for the direction of future research. PMID:15110423
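    The conditioning effect described above is, in regression terms, an interaction between a dependency indicator and democracy. A sketch on simulated data (all variables, coefficients, and the evaluation points are invented):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative cross-national sample of n = 59 countries (all values
# simulated): infant mortality modeled on multinational corporate
# penetration (MNC), political democracy, and their interaction.
n = 59
mnc = rng.uniform(0, 1, n)        # dependency indicator
democracy = rng.uniform(0, 1, n)  # political democracy score

# Simulated truth: MNC raises mortality, but less so at high democracy.
mortality = (50 + 30 * mnc - 10 * democracy
             - 25 * mnc * democracy + rng.normal(0, 2, n))

# OLS with an interaction term: y = b0 + b1*MNC + b2*Dem + b3*MNC*Dem
X = np.column_stack([np.ones(n), mnc, democracy, mnc * democracy])
beta, *_ = np.linalg.lstsq(X, mortality, rcond=None)

# Marginal effect of MNC penetration at low vs. high democracy
effect_low = beta[1] + beta[3] * 0.1
effect_high = beta[1] + beta[3] * 0.9
print(f"MNC effect at low democracy:  {effect_low:+.1f}")
print(f"MNC effect at high democracy: {effect_high:+.1f}")
```

    The recovered marginal effect of MNC penetration is larger at low democracy, the pattern the abstract describes.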

  14. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 %, compared with the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis to enable replicability at regional and national scales.
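    Event-tree risk aggregation of the kind used here boils down to a probability-weighted sum of consequences over scenarios; the scenario probabilities and affected populations below are invented, and the 51 % reduction simply echoes the combined-measures figure reported in the abstract:

```python
# Societal flood risk as the probability-weighted sum of consequences
# over flood scenarios.  Scenario probabilities and populations are
# illustrative stand-ins, not the Oliva case-study values.
scenarios = [
    # (annual exceedance probability, affected population)
    (0.100,   500),   # frequent, minor flood
    (0.010,  5000),   # 100-year flood
    (0.001, 20000),   # extreme event
]

def expected_annual_affected(scenarios, reduction=0.0):
    """Expected annual affected population; `reduction` models the
    combined effect of structural and non-structural measures."""
    return sum(p * pop * (1 - reduction) for p, pop in scenarios)

baseline = expected_annual_affected(scenarios)
with_measures = expected_annual_affected(scenarios, reduction=0.51)
print(f"baseline: {baseline:.1f} people/yr, "
      f"with measures: {with_measures:.1f}")
```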

  16. GPR-Analyzer: a simple tool for quantitative analysis of hierarchical multispecies microarrays.

    PubMed

    Dittami, Simon M; Edvardsen, Bente

    2013-10-01

    Monitoring of marine microalgae is important to predict and manage harmful algae blooms. It currently relies mainly on light-microscopic identification and enumeration of algal cells, yet several molecular tools are currently being developed to complement traditional methods. MIcroarray Detection of Toxic ALgae (MIDTAL) is an FP7-funded EU project aiming to establish a hierarchical multispecies microarray as one of these tools. Prototype arrays are currently being tested with field samples, yet the analysis of the large quantities of data generated by these arrays presents a challenge as suitable analysis tools or protocols are scarce. This paper proposes a two-part protocol for the analysis of the MIDTAL and other hierarchical multispecies arrays: Signal-to-noise ratios can be used to determine the presence or absence of signals and to identify potential false-positives considering parallel and hierarchical probes. In addition, normalized total signal intensities are recommended for comparisons between microarrays and in order to relate signals for specific probes to cell concentrations using external calibration curves. Hybridization- and probe-specific detection limits can be calculated to help evaluate negative results. The suggested analyses were implemented in "GPR-Analyzer", a platform-independent and graphical user interface-based application, enabling non-specialist users to quickly and quantitatively analyze hierarchical multispecies microarrays. It is available online at http://folk.uio.no/edvardse/gpranalyzer . PMID:22767354
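    The two proposed analysis steps (signal-to-noise ratios for presence/absence calls, normalized total signal intensities for between-array comparison) can be sketched as follows; the spot values and the SNR threshold of 2 are invented for illustration:

```python
import numpy as np

# Made-up foreground and local-background intensities for four probes
# on a hierarchical multispecies array.
signal     = np.array([5200.0, 300.0, 8100.0, 150.0])  # probe foreground
background = np.array([ 120.0, 110.0,  130.0, 125.0])  # local background

# Step 1: signal-to-noise ratio per probe for presence/absence calls
# (threshold of 2 is an assumed cutoff).
snr = signal / background
present = snr >= 2.0

# Step 2: each probe's share of the summed array signal, making
# intensities comparable between arrays (and relatable to external
# calibration curves).
normalized = signal / signal.sum()

print("SNR:", np.round(snr, 1))
print("present:", present)
print("normalized:", np.round(normalized, 3))
```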

  17. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  18. Software applications toward quantitative metabolic flux analysis and modeling.

    PubMed

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

    Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we initially introduce the key theoretical concepts and different analysis steps in the modeling process before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes, pathways and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on specific software. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
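    The FBA step mentioned above can be written as a small linear program; the 2-metabolite network below is invented, and scipy's `linprog` stands in for the dedicated tools reviewed in the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis (FBA): maximize flux through a "biomass"
# reaction subject to steady-state mass balance S v = 0 and flux
# bounds.  Reactions: R1: -> A,  R2: A -> B,  R3 (biomass): B ->
S = np.array([
    [1, -1,  0],   # mass balance for metabolite A
    [0,  1, -1],   # mass balance for metabolite B
])
bounds = [(0, 10), (0, 10), (0, 10)]  # lower/upper flux bounds, R1..R3
c = [0, 0, -1]                        # linprog minimizes, so negate biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
fluxes = res.x
print("optimal fluxes:", np.round(fluxes, 3))
```

    With these bounds the whole chain carries the maximum flux of 10, exactly the behavior a full-scale FBA tool computes on genome-scale networks.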

  19. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
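    The quantitative evaluation SearchLight automates amounts to integrating products of component spectra over wavelength; the Gaussian and top-hat spectra below are crude stand-ins, not real component data:

```python
import numpy as np

# Detected signal is proportional to wavelength integrals of
# (source x excitation filter) and (emission x emission filter x
# detector).  All spectra here are invented stand-ins.
wl = np.linspace(400, 700, 601)      # wavelength grid, nm
dwl = wl[1] - wl[0]

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

source     = gaussian(470, 15)                      # e.g. a blue LED band
exc_filter = (np.abs(wl - 470) < 20).astype(float)  # top-hat passband
emission   = gaussian(520, 20)                      # fluorophore emission
em_filter  = (np.abs(wl - 525) < 25).astype(float)
detector   = np.ones_like(wl)                       # flat detector response

excitation_power = np.sum(source * exc_filter) * dwl
collected        = np.sum(emission * em_filter * detector) * dwl
signal = excitation_power * collected
print(f"relative signal: {signal:.1f} (arbitrary units)")
```

    Repeating the calculation with a different light source or shifted filter edges, as the abstract describes, is a matter of swapping in the corresponding spectra.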

  20. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

    Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using the Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  1. Quantitative analysis of wrist electrodermal activity during sleep

    PubMed Central

    Sano, Akane; Picard, Rosalind W.; Stickgold, Robert

    2015-01-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high frequency patterns of EDA called “storms” were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These EDA high frequency peaks and high amplitude were sometimes associated with higher skin temperature, but more work examining neurological and other EDA elicitors is needed to elucidate their complete behavior. PMID:25286449
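    A toy version of peak detection and storm flagging on an EDA trace; the amplitude threshold and the 5-peaks-per-60-s storm criterion are assumed for illustration, not the values established in the paper, and the trace itself is simulated:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated skin-conductance trace: baseline + sensor noise + a few
# Gaussian-shaped peaks, five of them clustered to form a "storm".
fs = 4                               # sampling rate, Hz
t = np.arange(0, 600, 1 / fs)        # 10 minutes of signal
eda = 2.0 + 0.05 * rng.normal(size=t.size)
for peak_time in [100, 112, 124, 136, 148, 400]:   # peak centers, s
    eda += 0.5 * np.exp(-0.5 * ((t - peak_time) / 1.5) ** 2)

# Peak detection: rising-edge crossings of an amplitude threshold
# above the median baseline (threshold value assumed).
threshold = 0.2
above = (eda - np.median(eda)) > threshold
onsets = t[1:][(~above[:-1]) & above[1:]]

# Storm criterion (assumed): at least 5 peak onsets in any 60 s window.
storm = any(np.sum((onsets >= s) & (onsets < s + 60)) >= 5 for s in onsets)
print(f"{len(onsets)} peak onsets detected; storm present: {storm}")
```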

  2. A comparative analysis of British and Taiwanese students' conceptual and procedural knowledge of fraction addition

    NASA Astrophysics Data System (ADS)

    Li, Hui-Chuan

    2014-10-01

    This study examines students' procedural and conceptual achievement in fraction addition in England and Taiwan. A total of 1209 participants (561 British students and 648 Taiwanese students) aged 12 and 13 took part in the study. A quantitative design centered on a purpose-built written test was adopted. The test has two major parts: a concept part, which probes students' conceptual knowledge of fraction addition, and a skill part, which assesses their procedural competence when adding fractions. There were statistically significant differences in both the concept and skill parts between the British and Taiwanese groups, with the Taiwanese group scoring higher. Analysis of the students' responses to the skill section indicates that the Taiwanese students' superior procedural achievement stems from their far more successful application of algorithms for adding fractions. Earlier, Hart [1] reported that around 30% of the British students in their study used an erroneous strategy (adding tops and bottoms, for example, 2/3 + 1/7 = 3/10) when adding fractions. This study finds that nearly the same percentage of the British group still used this erroneous strategy as Hart found in 1981. The study also provides evidence that students' understanding of fractions is confused and incomplete, even among those who can perform the operations successfully. More research is needed to help students make sense of the operations and eventually attain computational competence with meaningful grounding in the domain of fractions.
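The erroneous "tops and bottoms" strategy is easy to contrast with the correct common-denominator algorithm using Python's `fractions` module; the example reproduces the 2/3 + 1/7 case cited from Hart.

```python
from fractions import Fraction

def correct_add(a, b, c, d):
    """a/b + c/d via common denominators (what Fraction implements)."""
    return Fraction(a, b) + Fraction(c, d)

def tops_and_bottoms(a, b, c, d):
    """The erroneous strategy: add numerators and add denominators."""
    return Fraction(a + c, b + d)

print(correct_add(2, 3, 1, 7))      # 17/21
print(tops_and_bottoms(2, 3, 1, 7)) # 3/10
```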

  3. Balancing Yin and Yang: Teaching and Learning Qualitative Data Analysis Within an Undergraduate Quantitative Data Analysis Course.

    ERIC Educational Resources Information Center

    Clark, Roger; Lang, Angela

    2002-01-01

    Describes an undergraduate sociology course that taught qualitative and quantitative data analysis. Focuses on two students and how they dealt with and overcame anxiety issues, subsequently achieving higher levels of learning and new learning strategies. (KDR)

  4. BioMercator V3: an upgrade of genetic map compilation and quantitative trait loci meta-analysis algorithms

    PubMed Central

    Sosnowski, Olivier; Charcosset, Alain; Joets, Johann

    2012-01-01

    Summary: Compilation of genetic maps combined with quantitative trait loci (QTL) meta-analysis has proven to be a powerful approach contributing to the identification of candidate genes underlying quantitative traits. BioMercator was the first software offering a complete set of algorithms and visualization tools covering all steps required to perform QTL meta-analysis. Despite several limitations, the software is still widely used. We developed a new version offering additional up-to-date methods and improved graphical representation and exploration of large datasets. Availability and implementation: BioMercator V3 is implemented in JAVA and freely available (http://moulon.inra.fr/biomercator). Contact: joets@moulon.inra.fr PMID:22661647

  5. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions

    PubMed Central

    Shenoy, Shailesh M.

    2016-01-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software’s support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity. PMID:27516727

  6. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    PubMed

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus, using confocal laser scanning microscope images of fixed, immunofluorescence-stained cells as the data source. The ImageJ-based process includes segmentation of the nucleus, as well as measurements of total fluorescence intensity, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as the primary ROI, and the nuclear periphery as the secondary ROI.
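The per-ROI measurements in this pipeline can be sketched in NumPy as a stand-in for the ImageJ steps; the random image and the top-5% "brightest pixel" cutoff are illustrative assumptions, not the protocol's actual parameters.

```python
import numpy as np

# Sketch of the per-ROI measurements the pipeline computes: total
# intensity, a heterogeneity measure (coefficient of variation of the
# staining), and the frequency of the brightest pixels in the ROI.
# The image and the 95th-percentile cutoff are illustrative only.

rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in for a ROI

total_intensity = roi.sum()
heterogeneity = roi.std() / roi.mean()            # CV of the staining
cutoff = np.percentile(roi, 95)                   # top 5% of pixel values
bright_fraction = float((roi >= cutoff).mean())   # frequency of brightest pixels
```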

  7. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution…

  8. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  9. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)

  10. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  11. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis.

    PubMed

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-07-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulked segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED file and/or text file and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/.
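The raw signal that BSA tools such as EXPLoRA build on can be sketched as the smoothed difference in alternate-allele frequency between the two bulks. EXPLoRA itself fits a statistical model that exploits linkage disequilibrium; this toy scan shows only the underlying signal, with all counts invented.

```python
# Toy bulk-segregant scan: per-site allele-frequency difference between
# a high-phenotype bulk and a low-phenotype bulk, plus a moving average
# to smooth noise along the chromosome. Illustrative only; EXPLoRA's
# actual model (exploiting linkage disequilibrium) is far more involved.

def allele_freq(alt_counts, depths):
    return [a / d for a, d in zip(alt_counts, depths)]

def delta_snp_index(high_alt, high_depth, low_alt, low_depth):
    hi = allele_freq(high_alt, high_depth)
    lo = allele_freq(low_alt, low_depth)
    return [h - l for h, l in zip(hi, lo)]

def smooth(values, window=3):
    """Centered moving average (window should be odd)."""
    half = window // 2
    return [sum(values[max(0, i - half):i + half + 1]) /
            len(values[max(0, i - half):i + half + 1])
            for i in range(len(values))]
```

Peaks in the smoothed difference mark candidate regions linked to the trait.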

  12. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis

    PubMed Central

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-01-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulked segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED file and/or text file and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/. PMID:27105844

  14. Biomolecular computation with molecular beacons for quantitative analysis of target nucleic acids.

    PubMed

    Lim, Hee-Woong; Lee, Seung Hwan; Yang, Kyung-Ae; Yoo, Suk-In; Park, Tai Hyun; Zhang, Byoung-Tak

    2013-01-01

    Molecular beacons are efficient and useful tools for quantitative detection of specific target nucleic acids. Thanks to their simple protocol, molecular beacons have great potential as substrates for biomolecular computing. Here we present a molecular beacon-based biomolecular computing method for quantitative detection and analysis of target nucleic acids. Whereas the conventional quantitative assays using fluorescent dyes have been designed for single target detection or multiplexed detection, the proposed method enables us not only to detect multiple targets but also to compute their quantitative information by weighted-sum of the targets. The detection and computation are performed on a molecular level simultaneously, and the outputs are detected as fluorescence signals. Experimental results show the feasibility and effectiveness of our weighted detection and linear combination method using molecular beacons. Our method can serve as a primitive operation of molecular pattern analysis, and we demonstrate successful binary classifications of molecular patterns made of synthetic oligonucleotide DNA molecules.
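The weighted-sum computation the beacon mixture performs in solution amounts to a linear classifier: each target contributes fluorescence proportional to weight times amount, and a threshold on the summed signal yields a binary class. A minimal sketch, with invented weights and threshold:

```python
# Sketch of the weighted-sum classification the beacons compute in
# solution. Weights, amounts, and the decision threshold are arbitrary
# illustrative values, not the paper's experimental parameters.

def weighted_sum(amounts, weights):
    return sum(a * w for a, w in zip(amounts, weights))

def classify(amounts, weights, threshold):
    """Binary classification by thresholding the summed signal."""
    return 1 if weighted_sum(amounts, weights) >= threshold else 0

# Two targets, decision boundary 0.6*x1 + 0.4*x2 >= 0.5
print(classify([1.0, 0.0], [0.6, 0.4], 0.5))  # 1
print(classify([0.0, 1.0], [0.6, 0.4], 0.5))  # 0
```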

  15. On the quantitative analysis and evaluation of magnetic hysteresis data

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Solheid, Peter

    2010-04-01

    Magnetic hysteresis data are centrally important in pure and applied rock magnetism, but to date, no objective quantitative methods have been developed for assessment of data quality and of the uncertainty in parameters calculated from imperfect data. We propose several initial steps toward such assessment, using loop symmetry as an important key. With a few notable exceptions (e.g., related to field cooling and exchange bias), magnetic hysteresis loops possess a high degree of inversion symmetry (M(H) = -M(-H)). This property enables us to treat the upper and lower half-loops as replicate measurements for quantification of random noise, drift, and offsets. This, in turn, makes it possible to evaluate the statistical significance of nonlinearity, either in the high-field region (due to nonsaturation of the ferromagnetic moment) or over the complete range of applied fields (due to nonnegligible contribution of ferromagnetic phases to the total magnetic signal). It also allows us to quantify the significance of fitting errors for model loops constructed from analytical basis functions. When a statistically significant high-field nonlinearity is found, magnetic parameters must be calculated by approach-to-saturation fitting, e.g., by a model of the form M(H) = Ms + χ_HF·H + α·H^β. This nonlinear high-field inverse modeling problem is strongly ill conditioned, resulting in large and strongly covariant uncertainties in the fitted parameters, which we characterize through bootstrap analyses. For a variety of materials, including ferrihydrite and mid-ocean ridge basalts, measured in applied fields up to about 1.5 T, we find that the calculated value of the exponent β is extremely sensitive to small differences in the data or in the method of processing and that the overall uncertainty exceeds the range of physically reasonable values. The "unknowability" of β is accompanied by relatively large uncertainties in the other parameters, which can be characterized, if not
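An approach-to-saturation fit of the form M(H) = Ms + χ_HF·H + α·H^β can be sketched by scanning the nonlinear exponent β over a grid and solving the remaining parameters (Ms, χ_HF, α) by linear least squares at each candidate. This is one illustrative fitting strategy under assumed data and grid range, not the authors' implementation; with noisy data the β estimate is ill conditioned, as the abstract notes.

```python
import numpy as np

# Grid-scan fit of M(H) = Ms + chi_hf*H + alpha*H**beta: for each
# candidate beta, the model is linear in (Ms, chi_hf, alpha), so those
# are solved by least squares; the beta with smallest residual wins.
# Grid range and synthetic data are illustrative assumptions.

def fit_high_field(H, M, betas=np.linspace(-2.0, -0.5, 151)):
    best = None
    for beta in betas:
        A = np.column_stack([np.ones_like(H), H, H**beta])
        coef, *_ = np.linalg.lstsq(A, M, rcond=None)
        rss = float(np.sum((A @ coef - M) ** 2))
        if best is None or rss < best[0]:
            best = (rss, beta, coef)
    rss, beta, (Ms, chi_hf, alpha) = best
    return Ms, chi_hf, alpha, beta

# Synthetic noiseless check: recover known parameters
H = np.linspace(0.8, 1.5, 50)
M = 1.0 + 0.2 * H + 0.05 * H**-1.0
Ms, chi_hf, alpha, beta = fit_high_field(H, M)
```

With real, noisy loops the residual surface along β is nearly flat, which is exactly the covariance and "unknowability" problem the paper quantifies via bootstrap.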

  16. Optimized Protocol for Quantitative Multiple Reaction Monitoring-Based Proteomic Analysis of Formalin-Fixed, Paraffin-Embedded Tissues.

    PubMed

    Kennedy, Jacob J; Whiteaker, Jeffrey R; Schoenherr, Regine M; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N; Baird, Geoffrey Stuart; Paulovich, Amanda G

    2016-08-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin-embedded (FFPE) tissues. Although the feasibility of using targeted, multiple reaction monitoring mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope-labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e., nine processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R(2) = 0.94) and immuno-MRM (R(2) = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens.
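The reported precision summary (median %CV across analytes) can be reproduced schematically: compute a coefficient of variation per analyte across replicate measurements, then take the median over analytes. The replicate values below are invented for illustration.

```python
from statistics import mean, stdev, median

# Per-analyte %CV across replicates, then the median across analytes,
# mirroring the "median precision 11.4% (across 249 analytes)" style of
# summary. The peptide names and replicate values are made up.

def percent_cv(values):
    return 100.0 * stdev(values) / mean(values)

replicates = {
    "peptide_A": [100.0, 110.0, 90.0],
    "peptide_B": [50.0, 52.0, 48.0],
    "peptide_C": [10.0, 14.0, 6.0],
}
cvs = {name: percent_cv(v) for name, v in replicates.items()}
median_cv = median(cvs.values())
```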

  18. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  19. Probabilistic reliability analysis, quantitative safety goals, and nuclear licensing in the United Kingdom.

    PubMed

    Cannell, W

    1987-09-01

    Although unpublicized, the use of quantitative safety goals and probabilistic reliability analysis for licensing nuclear reactors has become a reality in the United Kingdom. This conclusion results from an examination of the process leading to the licensing of the Sizewell B PWR in England. The licensing process for this reactor has substantial implications for nuclear safety standards in Britain, and is examined in the context of the growing trend towards quantitative safety goals in the United States. PMID:3685540

  20. Concentration analysis: A quantitative assessment of student states

    NASA Astrophysics Data System (ADS)

    Bao, Lei; Redish, Edward F.

    2001-07-01

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution of wrong answers given by the class. In this paper we introduce a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. This information can be used to study if the students have common incorrect models or if the question is effective in detecting student models. When combined with information obtained from qualitative research, the method allows us to identify cleanly what FCI results are telling us about student knowledge.
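The concentration measure can be sketched as follows. The formula reproduces the concentration factor as defined by Bao and Redish (2001), scaled so that C = 0 when responses are spread evenly across the m choices and C = 1 when every student picks the same option; verify against the original paper before reuse.

```python
from math import sqrt

# Concentration factor for an m-choice question answered by N students,
# where counts[i] is the number of students choosing option i.
# C = 0: responses uniformly spread; C = 1: all students choose one option.

def concentration(counts):
    m = len(counts)
    N = sum(counts)
    return (sqrt(m) / (sqrt(m) - 1)) * (
        sqrt(sum(n * n for n in counts)) / N - 1 / sqrt(m)
    )

everyone_same = concentration([25, 0, 0, 0, 0])  # ~1.0
evenly_spread = concentration([5, 5, 5, 5, 5])   # ~0.0
```

A class with a high concentration on a *wrong* option signals a shared incorrect model, which is the diagnostic use the paper proposes.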

  2. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    NASA Astrophysics Data System (ADS)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient technological advancements of recent years. It allows us to acquire, store, and analyze pathological information from images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced, and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells, as well as in differentiating among different lung cancer cells.

  3. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-01-01

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
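The cumulative-effects modeling step described here (multiple linear regression of an aquatic condition score on landscape stressor gradients, then prediction under a future land use scenario) can be sketched as follows; the predictors, coefficients, and data are invented for illustration.

```python
import numpy as np

# Sketch of landscape-based cumulative effects modeling: fit a multiple
# linear regression of stream condition on watershed stressors, then
# predict condition under a hypothetical future scenario. All numbers
# are invented; real models would use many sites and validated metrics.

mining = np.array([0.0, 10.0, 20.0, 30.0, 40.0])     # % of watershed mined
developed = np.array([5.0, 5.0, 10.0, 15.0, 20.0])   # % developed land
condition = 90.0 - 1.2 * mining - 0.8 * developed    # noiseless toy response

X = np.column_stack([np.ones_like(mining), mining, developed])
coef, *_ = np.linalg.lstsq(X, condition, rcond=None)
intercept, b_mining, b_developed = coef

# Scenario analysis: predicted condition under a future land use scenario
future = intercept + b_mining * 50.0 + b_developed * 25.0
```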

  5. Quantitative Analysis of PMLA Nanoconjugate Components after Backbone Cleavage

    PubMed Central

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L.; Ljubimova, Julia Y.; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC methods. Taking advantage of the polyester nature of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the content of antibody and AON within nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions only degrade PMLA without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  6. Quantitative PCR analysis of CYP1A induction in Atlantic salmon (Salmo salar)

    USGS Publications Warehouse

    Rees, C.B.; McCormick, S.D.; Vanden, Heuvel J.P.; Li, W.

    2003-01-01

    Environmental pollutants are hypothesized to be one of the causes of recent declines in wild populations of Atlantic salmon (Salmo salar) across Eastern Canada and the United States. Some of these pollutants, such as polychlorinated biphenyls and dioxins, are known to induce expression of the CYP1A subfamily of genes. We applied a highly sensitive technique, quantitative reverse transcription-polymerase chain reaction (RT-PCR), for measuring the levels of CYP1A induction in Atlantic salmon. This assay was used to detect patterns of CYP1A mRNA levels, a direct measure of CYP1A expression, in Atlantic salmon exposed to pollutants under both laboratory and field conditions. Two groups of salmon were acclimated to 11 and 17 °C, respectively. Each subject then received an intraperitoneal injection (50 mg kg-1) of either β-naphthoflavone (BNF) in corn oil (10 mg BNF ml-1 corn oil) or corn oil alone. After 48 h, salmon gill, kidney, liver, and brain were collected for RNA isolation and analysis. All tissues showed induction of CYP1A by BNF. The highest base level of CYP1A expression (2.56 × 10^10 molecules/µg RNA) was found in gill tissue. Kidney had the highest mean induction at five orders of magnitude while gill tissue showed the lowest mean induction at two orders of magnitude. The quantitative RT-PCR was also applied to salmon sampled from two streams in Massachusetts, USA. Salmon liver and gill tissue sampled from Millers River (South Royalston, Worcester County), known to contain polychlorinated biphenyls (PCBs), showed on average a two orders of magnitude induction over those collected from a stream with no known contamination (Fourmile Brook, Northfield, Franklin County). Overall, the data show CYP1A exists and is inducible in Atlantic salmon gill, brain, kidney, and liver tissue. In addition, the results obtained demonstrate that quantitative PCR analysis of CYP1A expression is useful in studying ecotoxicity in populations of Atlantic salmon in the wild. © 2003
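
The "orders of magnitude" induction quoted above is simply the log10 ratio of transcript copy number in treated versus control fish. A minimal sketch of that arithmetic (the copy numbers below are hypothetical illustrations, not the study's measured values):

```python
import math

def fold_induction(treated_copies: float, control_copies: float) -> float:
    """Fold change in transcript copies (treated / control)."""
    return treated_copies / control_copies

def orders_of_magnitude(treated_copies: float, control_copies: float) -> float:
    """Induction expressed as log10 of the fold change."""
    return math.log10(fold_induction(treated_copies, control_copies))

# Example: a five-order-of-magnitude induction, as reported for kidney.
base = 2.0e5       # hypothetical copies/ug RNA in corn-oil controls
induced = 2.0e10   # hypothetical copies/ug RNA after BNF exposure
print(orders_of_magnitude(induced, base))  # 5.0
```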

  7. Quantitative analysis of nitrocellulose and pulp in gunpowder by using thermogravimetric analysis/FTIR

    NASA Astrophysics Data System (ADS)

    Johnson, David J.; Compton, David A.

    1989-12-01

    Thermogravimetric Analysis (TGA) has routinely been used to quantitatively determine the presence of a specific component within a material by direct measurement from the weight loss profile. This technique works well when it is known that the detected weight loss was caused only by that component. If more than one material evolves during a single weight loss, it is impossible to quantify the contribution of each individual component by using stand-alone TGA. However, by coupling an FT-IR to the TGA one may assign evolved gases to a detected weight loss and potentially isolate each individual material. Although a number of gases may evolve during one weight loss, the judicious selection of "Specific Gas Profiles" (SGPs) may allow the experimentalist to isolate each gas. The SGP is a measure of IR absorbance within specific frequency regions as a function of time. Through the use of standards, integration of these profiles allows the operator to quantitate the various components in an unknown. Data from this research will show that nitrocellulose and pulp content in gunpowder samples may be measured using the TGA/FT-IR technique.
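
The quantitation step described here reduces to integrating a Specific Gas Profile (band-integrated IR absorbance versus time) and scaling against a standard of known mass. A sketch under assumed, synthetic band data (the Gaussian profiles and masses are illustrative, not measured SGPs):

```python
import numpy as np

def sgp_area(time_s, absorbance):
    """Trapezoidal integral of a Specific Gas Profile
    (band-integrated IR absorbance versus time)."""
    t = np.asarray(time_s)
    a = np.asarray(absorbance)
    return float(np.sum(0.5 * (a[1:] + a[:-1]) * np.diff(t)))

def quantify(area_unknown, area_standard, mass_standard_mg):
    """Mass of a component in the unknown, calibrated against a
    standard of known mass run under identical TGA/FT-IR conditions."""
    return mass_standard_mg * area_unknown / area_standard

t = np.linspace(0.0, 100.0, 201)                  # evolution time, s
std = np.exp(-0.5 * ((t - 50.0) / 10.0) ** 2)     # standard's band SGP
unk = 0.4 * std                                   # unknown evolves 40% as much gas
mass = quantify(sgp_area(t, unk), sgp_area(t, std), mass_standard_mg=10.0)
print(round(mass, 3))  # 4.0
```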

  8. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  9. Integrating Data Analysis (IDA): Working with Sociology Departments to Address the Quantitative Literacy Gap

    ERIC Educational Resources Information Center

    Howery, Carla B.; Rodriguez, Havidan

    2006-01-01

    The NSF-funded Integrating Data Analysis (IDA) Project undertaken by the American Sociological Association (ASA) and the Social Science Data Analysis Network sought to close the quantitative literacy gap for sociology majors. Working with twelve departments, the project built on lessons learned from ASA's Minority Opportunities through School…

  10. A fully automated method for quantitative cerebral hemodynamic analysis using DSC-MRI.

    PubMed

    Bjørnerud, Atle; Emblem, Kyrre E

    2010-05-01

    Dynamic susceptibility contrast (DSC)-based perfusion analysis from MR images has become an established method for analysis of cerebral blood volume (CBV) in glioma patients. To date, little emphasis has, however, been placed on quantitative perfusion analysis of these patients, mainly due to the associated increased technical complexity and lack of sufficient stability in a clinical setting. The aim of our study was to develop a fully automated analysis framework for quantitative DSC-based perfusion analysis. The method presented here generates quantitative hemodynamic maps without user interaction, combined with automatic segmentation of normal-appearing cerebral tissue. Validation of 101 patients with confirmed glioma after surgery gave mean values for CBF, CBV, and MTT, extracted automatically from normal-appearing whole-brain white and gray matter, in good agreement with literature values. The measured age- and gender-related variations in the same parameters were also in agreement with those in the literature. Several established analysis methods were compared and the resulting perfusion metrics depended significantly on method and parameter choice. In conclusion, we present an accurate, fast, and automatic quantitative perfusion analysis method where all analysis steps are based on raw DSC data only. PMID:20087370
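
The hemodynamic quantities named in this abstract are linked by standard DSC relations: CBV is commonly estimated from the ratio of the integrated tissue concentration curve to the integrated arterial input function (AIF), and MTT follows from the central volume theorem. A schematic numpy sketch with synthetic curves and typical literature constants, not the authors' actual pipeline:

```python
import numpy as np

def cbv(tissue_conc, aif_conc, dt, rho=1.04, k_h=0.73):
    """CBV (ml/100 g) from integrated tissue and arterial curves.
    rho: brain density (g/ml); k_h: large-to-small vessel hematocrit
    correction. Both are typical literature constants."""
    ratio = (np.sum(tissue_conc) * dt) / (np.sum(aif_conc) * dt)  # dt cancels
    return 100.0 * (k_h / rho) * ratio

def mtt_seconds(cbv_val, cbf_val):
    """Central volume theorem: MTT = CBV / CBF (CBF in ml/100 g/min)."""
    return 60.0 * cbv_val / cbf_val

t = np.arange(0.0, 60.0, 1.0)                           # time, s
aif = np.exp(-0.5 * ((t - 20.0) / 3.0) ** 2)            # synthetic AIF
tissue = 0.04 * np.exp(-0.5 * ((t - 24.0) / 5.0) ** 2)  # synthetic tissue curve
v = cbv(tissue, aif, dt=1.0)
print(round(v, 2), round(mtt_seconds(v, cbf_val=60.0), 2))
```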

  11. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  12. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    A study conducted shows that if digital photography is combined with regular thin-layer chromatography (TLC), it could perform highly improved qualitative analysis as well as make accurate quantitative analysis possible for a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  13. A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean

    2010-01-01

    Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

  14. Revisiting the quantitative features of surface-assisted laser desorption/ionization mass spectrometric analysis.

    PubMed

    Wu, Ching-Yi; Lee, Kai-Chieh; Kuo, Yen-Ling; Chen, Yu-Chie

    2016-10-28

    Surface-assisted laser desorption/ionization (SALDI) coupled with mass spectrometry (MS) is frequently used to analyse small organics owing to its clean background. Inorganic materials can be used as energy absorbers and the transfer medium to facilitate the desorption/ionization of analytes; thus, they are used as SALDI-assisting materials. Many studies have demonstrated the usefulness of SALDI-MS in quantitative analysis of small organics. However, some characteristics occurring in SALDI-MS require certain attention to ensure the reliability of the quantitative analysis results. The appearance of a coffee-ring effect in SALDI sample preparation is the primary factor that can affect quantitative SALDI-MS analysis results. However, to the best of our knowledge, there are no reports relating to quantitative SALDI-MS analysis that discuss or consider this effect. In this study, the coffee-ring effect is discussed using nanoparticles and nanostructured substrates as SALDI-assisting materials to show how this effect influences SALDI-MS analysis results. Potential solutions for overcoming the existing problems are also suggested. This article is part of the themed issue 'Quantitative mass spectrometry'.

  16. Some remarks on the quantitative analysis of behavior.

    PubMed

    Marr, M J

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences-physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  19. Quantitative Trait Locus Analysis of Mating Behavior and Male Sex Pheromones in Nasonia Wasps

    PubMed Central

    Diao, Wenwen; Mousset, Mathilde; Horsburgh, Gavin J.; Vermeulen, Cornelis J.; Johannes, Frank; van de Zande, Louis; Ritchie, Michael G.; Schmitt, Thomas; Beukeboom, Leo W.

    2016-01-01

    A major focus in speciation genetics is to identify the chromosomal regions and genes that reduce hybridization and gene flow. We investigated the genetic architecture of mating behavior in the parasitoid wasp species pair Nasonia giraulti and Nasonia oneida that exhibit strong prezygotic isolation. Behavioral analysis showed that N. oneida females had consistently higher latency times, and broke off the mating sequence more often in the mounting stage when confronted with N. giraulti males compared with males of their own species. N. oneida males produce a lower quantity of the long-range male sex pheromone (4R,5S)-5-hydroxy-4-decanolide (RS-HDL). Crosses between the two species yielded hybrid males with various pheromone quantities, and these males were used in mating trials with females of either species to measure female mate discrimination rates. A quantitative trait locus (QTL) analysis involving 475 recombinant hybrid males (F2), 2148 reciprocally backcrossed females (F3), and a linkage map of 52 equally spaced neutral single nucleotide polymorphism (SNP) markers plus SNPs in 40 candidate mating behavior genes revealed four QTL for male pheromone amount, depending on partner species. Our results demonstrate that the RS-HDL pheromone plays a role in the mating system of N. giraulti and N. oneida, but also that additional communication cues are involved in mate choice. No QTL were found for female mate discrimination, which points at a polygenic architecture of female choice with strong environmental influences. PMID:27172207

  20. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuity of the skull; as a result, it can be applied in both pre- and postoperative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate the determination of the standard linear indices used for assessment of craniosynostosis (i.e. the cephalic index CI and head circumference HC) and allow for planar and volumetric analysis, which so far has not been reported. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. The results show that the proposed brain segmentation algorithm is characterized by high accuracy when applied in both pre- and postoperative craniosynostosis, while the introduced planar and volumetric indices for disease description may be helpful to distinguish between the types of the disease.
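
The cephalic index mentioned above is a simple ratio: maximum head width expressed as a percentage of maximum head length. A minimal sketch (the measurements and classification cut-offs below are illustrative textbook values, not the paper's):

```python
def cephalic_index(width_mm: float, length_mm: float) -> float:
    """CI = 100 * maximum head width / maximum head length."""
    return 100.0 * width_mm / length_mm

def classify_ci(ci: float) -> str:
    """Coarse textbook bands; exact cut-offs vary between references."""
    if ci < 75.0:
        return "dolichocephalic"   # long, narrow head
    if ci <= 83.0:
        return "mesocephalic"
    return "brachycephalic"        # short, wide head

ci = cephalic_index(130.0, 180.0)
print(round(ci, 1), classify_ci(ci))  # 72.2 dolichocephalic
```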

  2. Quantitative proteomic analysis of cold-responsive proteins in rice.

    PubMed

    Neilson, Karlie A; Mariani, Michael; Haynes, Paul A

    2011-05-01

    Rice is susceptible to cold stress and with a future of climatic instability we will be unable to produce enough rice to satisfy increasing demand. A thorough understanding of the molecular responses to thermal stress is imperative for engineering cultivars, which have greater resistance to low temperature stress. In this study we investigated the proteomic response of rice seedlings to 48, 72 and 96 h of cold stress at 12-14°C. The use of both label-free and iTRAQ approaches in the analysis of global protein expression enabled us to assess the complementarity of the two techniques for use in plant proteomics. The approaches yielded a similar biological response to cold stress despite a disparity in proteins identified. The label-free approach identified 236 cold-responsive proteins compared to 85 in iTRAQ results, with only 24 proteins in common. Functional analysis revealed differential expression of proteins involved in transport, photosynthesis, generation of precursor metabolites and energy; and, more specifically, histones and vitamin B biosynthetic proteins were observed to be affected by cold stress. PMID:21433000
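
The modest agreement between platforms reported here (24 proteins shared out of 236 label-free and 85 iTRAQ identifications) is a straightforward set comparison. A sketch with hypothetical protein IDs showing how such overlap statistics are computed:

```python
def overlap_stats(label_free: set, itraq: set) -> dict:
    """Counts of shared and platform-specific protein identifications."""
    common = label_free & itraq
    return {
        "label_free_only": len(label_free - itraq),
        "itraq_only": len(itraq - label_free),
        "common": len(common),
        "jaccard": len(common) / len(label_free | itraq),
    }

# Hypothetical rice protein accessions for illustration only.
lf = {f"OS{i:02d}" for i in range(10)}        # 10 label-free IDs
iq = {f"OS{i:02d}" for i in range(7, 13)}     # 6 iTRAQ IDs, 3 shared
stats = overlap_stats(lf, iq)
print(stats)
```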

  3. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
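
The HU-band association described above amounts to classifying voxels by Hounsfield ranges and reporting each class as a fraction of total muscle volume. A sketch with illustrative bands (the paper's exact thresholds may differ):

```python
import numpy as np

# Illustrative HU bands; the study's exact ranges may differ.
HU_BANDS = {
    "fat": (-200, -10),
    "loose_connective_or_atrophic": (-9, 30),
    "normal_muscle": (31, 100),
}

def composition(hu_voxels: np.ndarray) -> dict:
    """Fraction of muscle-ROI voxels falling in each HU band."""
    total = hu_voxels.size
    out = {}
    for name, (lo, hi) in HU_BANDS.items():
        mask = (hu_voxels >= lo) & (hu_voxels <= hi)
        out[name] = float(mask.sum()) / total
    return out

# Synthetic ROI: a handful of voxel HU values.
voxels = np.array([-80, -50, 5, 20, 45, 60, 70, 55, 40, 35])
comp = composition(voxels)
print(comp)
```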

  4. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along front, back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
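
Elliptic Fourier analysis expands a closed outline into harmonic coefficients that can then serve as shape variables for multivariate statistics. A minimal 2D illustration using complex FFT descriptors (a simplification, not the authors' 3D elliptic formulation): a circle needs only the +1 harmonic, while an ellipse acquires a -1 component whose magnitude encodes elongation.

```python
import numpy as np

def fourier_descriptors(x, y, n_harmonics=4):
    """Scale- and position-normalized complex Fourier descriptors of a
    closed 2D outline: harmonics +1..+n followed by -1..-n."""
    z = np.asarray(x, float) + 1j * np.asarray(y, float)
    coeffs = np.fft.fft(z) / len(z)
    pos = coeffs[1:n_harmonics + 1]            # harmonics +1 .. +n
    neg = coeffs[-1:-n_harmonics - 1:-1]       # harmonics -1 .. -n
    desc = np.concatenate([pos, neg])
    return desc / np.abs(desc[0])              # normalize out overall size

theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
circle = fourier_descriptors(np.cos(theta), np.sin(theta))
ellipse = fourier_descriptors(2.0 * np.cos(theta), np.sin(theta))
# The ellipse's -1 harmonic is nonzero; the circle's is (numerically) zero.
print(np.abs(circle).round(3), np.abs(ellipse).round(3))
```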

  5. Quantitive analysis of gully long profiles on Earth and Mars

    NASA Astrophysics Data System (ADS)

    Conway, Susan; Balme, Matthew; Murray, John; Towner, Martin

    2010-05-01

    We investigated the scale, slope and curvature properties of gully long profiles on Earth and Mars to ascertain whether gullies on Mars are formed by alluvial and/or debris flow processes. During this investigation we also compared generic slope profiles on Mars to those with gullies. To perform these analyses we used digital elevation models (DEMs) for Earth, the majority of which were derived from ~ 1 m resolution LiDAR data from the UK's NERC ARSF and the USA's NSF NCALM. For Mars we used a technique developed by Kreslavsky [1] to extract elevation data from pairs of RDR HiRISE images. We successfully validated this technique by comparing its results to those from HiRISE DEMs made using automated stereo photogrammetry techniques [2]. We found that gullies produced by debris flow have properties distinct from those formed by alluvial processes on Earth. In general, debris flow gullies are less concave than alluvial gullies, have a basal concavity and have higher slopes than alluvial gullies. We then compared our results from Earth to the gully profiles on Mars and found that properties of gullies on Mars overlap those of both debris flow and alluvial gullies on Earth; however, gullies on Mars are slightly more similar to terrestrial debris flow gullies. In addition, gully long profiles on Mars are distinct from generic slope profiles on Mars, with some overlap; this shows that the gully forming process has a marked morphological impact on martian slopes. Our observed latitudinal patterns in gully formation are in agreement with previous investigations [3, 4], with greater numbers of gullies found at ~40° and ~70° latitude north and south. In addition we found that gullies at mid-latitudes are more densely packed and occur across whole slope sections (rather than isolated patches), suggesting this region has preferential conditions for gully formation. Our morphological comparison with gullies on Earth suggests that the formation process shifts from pure water
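
The slope and curvature properties compared here can be computed as first and second derivatives of elevation along the profile, with the second derivative summarizing concavity. A sketch on a synthetic concave-up profile (the exponential profile is illustrative, not DEM data):

```python
import numpy as np

def slope_and_curvature(distance_m, elevation_m):
    """First and second derivatives of a long profile; curvature > 0
    here indicates a concave-up (alluvial-like) profile shape."""
    slope = np.gradient(elevation_m, distance_m)      # dz/dx
    curvature = np.gradient(slope, distance_m)        # d2z/dx2
    return slope, curvature

x = np.linspace(0.0, 500.0, 101)         # downslope distance, m
z = 100.0 * np.exp(-x / 150.0)           # synthetic concave-up long profile
slope, curv = slope_and_curvature(x, z)
print(round(float(slope.mean()), 4), bool(np.all(curv[1:-1] > 0.0)))
```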

  6. [Quantitative analysis of seven phenolic acids in eight Yinqiao Jiedu serial preparations by quantitative analysis of multi-components with single-marker].

    PubMed

    Wang, Jun-jun; Zhang, Li; Guo, Qing; Kou, Jun-ping; Yu, Bo-yang; Gu, Dan-hua

    2015-04-01

    The study aims to develop a unified method to determine seven phenolic acids (neochlorogenic acid, chlorogenic acid, 4-caffeoylquinic acid, caffeic acid, isochlorogenic acid B, isochlorogenic acid A and isochlorogenic acid C) contained in honeysuckle flower that is the monarch drug of all the eight Yinqiao Jiedu serial preparations using quantitative analysis of multi-components by single-marker (QAMS). Firstly, chlorogenic acid was used as a reference to get the average relative correction factors (RCFs) of the other phenolic acids in ratios to the reference; columns and instruments from different companies were used to validate the durability of the achieved RCFs at different levels of standard solutions; and honeysuckle flower extract was used as the reference substance to fix the positions of chromatographic peaks. Secondly, the contents of seven phenolic acids in eight different Yinqiao Jiedu serial preparation samples were calculated based on the RCF durability. Finally, the quantitative results were compared between QAMS and the external standard (ES) method. The results showed that the durability of the achieved RCFs is good (RSD during 0.80% - 2.56%), and there are no differences between the quantitative results of QAMS and ES (the relative average deviation < 0.93%). So it can be successfully used for the quantitative control of honeysuckle flower principally prescribed in Yinqiao Jiedu serial preparations. PMID:26223132
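
In QAMS, a relative correction factor (RCF) ties each analyte's detector response to the single marker's, so one quantified marker suffices for all components. One common formulation is f_i = (A_s/C_s)/(A_i/C_i) from a mixed standard, then C_i = f_i * A_i * C_s/A_s in the sample. A sketch with hypothetical peak areas and concentrations (conventions for defining the RCF vary between papers):

```python
def relative_correction_factor(area_ref, conc_ref, area_i, conc_i):
    """RCF f_i = (A_s/C_s) / (A_i/C_i), from a mixed standard solution."""
    return (area_ref / conc_ref) / (area_i / conc_i)

def concentration_by_qams(area_i, f_i, area_ref, conc_ref):
    """Analyte concentration using only the marker's measured
    concentration in the sample: C_i = f_i * A_i * C_s / A_s."""
    return f_i * area_i * conc_ref / area_ref

# Calibration: marker (e.g. chlorogenic acid) and a second phenolic acid.
f = relative_correction_factor(area_ref=1000.0, conc_ref=50.0,
                               area_i=800.0, conc_i=50.0)
# Sample: marker quantified externally at 40 ug/ml with peak area 900.
c = concentration_by_qams(area_i=450.0, f_i=f,
                          area_ref=900.0, conc_ref=40.0)
print(round(f, 3), round(c, 2))  # 1.25 25.0
```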

  7. Quantitative Analysis of Photoactivated Localization Microscopy (PALM) Datasets Using Pair-correlation Analysis

    PubMed Central

    Sengupta, Prabuddha; Lippincott-Schwartz, Jennifer

    2013-01-01

    Super-resolution techniques based on a pointillistic approach, such as photoactivated localization microscopy (PALM), involve multiple cycles of sequential activation, imaging and precise localization of single fluorescent molecules. A super-resolution image, having nanoscopic structural information, is then constructed by compiling all the image sequences. Because the final image resolution is determined by the localization precision of detected single molecules and their density, accurate image reconstruction requires imaging of biological structures labeled with fluorescent molecules at high density. In such image datasets, stochastic variations in photon emission and intervening dark states lead to uncertainties in identification of single molecules. This, in turn, prevents the proper utilization of the wealth of information on molecular distribution and quantity. A recent strategy for overcoming this problem is pair-correlation analysis applied to PALM. Using rigorous statistical algorithms to estimate the number of detected proteins, this approach allows the spatial organization of molecules to be quantitatively described. PMID:22447653
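
The pair-correlation function g(r) at the heart of this approach compares the observed density of point pairs at separation r with the density expected for complete spatial randomness, so g(r) ≈ 1 indicates no clustering. A simplified 2D sketch that ignores edge corrections (the box size and point pattern are synthetic, not PALM localizations):

```python
import numpy as np

def pair_correlation(points, r_edges, box_size):
    """Radial pair-correlation g(r) for 2D localizations in a square
    field of view (edge effects ignored for simplicity)."""
    n = len(points)
    density = n / box_size ** 2
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(n, k=1)]  # unique pairs
    counts, _ = np.histogram(d, bins=r_edges)
    shell_area = np.pi * (r_edges[1:] ** 2 - r_edges[:-1] ** 2)
    expected = 0.5 * n * density * shell_area   # pair count if fully random
    return counts / expected

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(2000, 2))    # spatially random "molecules"
g = pair_correlation(pts, np.linspace(0.05, 0.5, 8), box_size=10.0)
# Close to 1 at all radii for a random pattern; clustering raises g at small r.
print(g.round(2))
```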

  8. A quantitative analysis of electrolyte exchange in the salivary duct

    PubMed Central

    Catalán, Marcelo A.; Melvin, James E.; Yule, David I.; Crampin, Edmund J.; Sneyd, James

    2012-01-01

    A healthy salivary gland secretes saliva in two stages. First, acinar cells generate primary saliva, a plasma-like, isotonic fluid high in Na+ and Cl−. In the second stage, the ducts exchange Na+ and Cl− for K+ and HCO3−, producing a hypotonic final saliva with no apparent loss in volume. We have developed a tool that aims to understand how the ducts achieve this electrolyte exchange while maintaining the same volume. This tool is part of a larger multiscale model of the salivary gland and can be used at the duct or gland level to investigate the effects of genetic and chemical alterations. In this study, we construct a radially symmetric mathematical model of the mouse salivary gland duct, representing the lumen, the cell, and the interstitium. For a given flow and primary saliva composition, we predict the potential differences and the luminal and cytosolic concentrations along a duct. Our model accounts well for experimental data obtained in wild-type animals as well as knockouts and chemical inhibitors. Additionally, the luminal membrane potential of the duct cells is predicted to be very depolarized compared with acinar cells. We investigate the effects of an electrogenic vs. electroneutral anion exchanger in the luminal membrane on concentration and the potential difference across the luminal membrane as well as how impairing the cystic fibrosis transmembrane conductance regulator channel affects other ion transporting mechanisms. Our model suggests the electrogenicity of the anion exchanger has little effect in the submandibular duct. PMID:22899825

  9. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila.

    PubMed

    Itskov, Pavel M; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H; Ribeiro, Carlos

    2014-08-04

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake.
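
Detecting sips in a capacitive trace like the one described here reduces to finding threshold crossings: a contact event begins when the signal rises above a contact threshold and ends when it falls back below. A sketch on a synthetic square-pulse trace (threshold and pulse shapes are illustrative, not the published hardware's signal):

```python
import numpy as np

def detect_sips(signal, threshold):
    """Onset/offset sample indices where a capacitance trace crosses a
    contact threshold -- candidate proboscis-contact ('sip') events.
    Assumes the trace starts and ends below threshold."""
    above = (np.asarray(signal) > threshold).astype(int)
    edges = np.diff(above)
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    return list(zip(onsets.tolist(), offsets.tolist()))

# Synthetic trace: flat baseline with three square 'sips'.
sig = np.zeros(100)
for start, stop in [(10, 14), (40, 47), (80, 83)]:
    sig[start:stop] = 5.0
sips = detect_sips(sig, threshold=2.5)
durations = [stop - start for start, stop in sips]
print(sips, durations)  # [(10, 14), (40, 47), (80, 83)] [4, 7, 3]
```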

  11. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    SciTech Connect

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
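
The "about 70% to 75% of the total energy" figure comes from weighting the solar spectrum by the detector's 300-1100 nm response window. A crude sketch using a 5800 K blackbody as a stand-in spectrum (the paper uses SMARTS model spectra and a real photodiode response, so the number here is only indicative):

```python
import numpy as np

def in_band_fraction(wavelength_nm, irradiance, lo_nm=300.0, hi_nm=1100.0):
    """Fraction of total spectral energy inside a detector's band."""
    dlam = np.gradient(wavelength_nm)
    band = (wavelength_nm >= lo_nm) & (wavelength_nm <= hi_nm)
    return np.sum((irradiance * dlam)[band]) / np.sum(irradiance * dlam)

# Crude 5800 K blackbody over 300-4000 nm as a stand-in spectrum.
wl = np.linspace(300.0, 4000.0, 2000)     # nm
wl_m = wl * 1e-9
h, c, k, T = 6.626e-34, 2.998e8, 1.381e-23, 5800.0
planck = (2.0 * h * c ** 2 / wl_m ** 5) / np.expm1(h * c / (wl_m * k * T))
frac = in_band_fraction(wl, planck)
print(round(float(frac), 3))   # ~0.75-0.8 for this blackbody stand-in
```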

  12. Quantitative analysis by mid-infrared spectrometry in food and agro-industrial fields

    NASA Astrophysics Data System (ADS)

    Dupuy, Nathalie; Huvenne, J. P.; Sombret, B.; Legrand, P.

    1993-03-01

    Thanks to what has been achieved by the Fourier transform, infrared spectroscopy can now become a state-of-the-art device in quality control laboratories if we consider its precision and the gain in time it ensures compared to traditional analysis methods such as HPLC chromatography. Moreover, the increasing number of new mathematical regression methods, such as Partial Least Squares (PLS) regression, allows multicomponent quantitative analysis of mixtures. Nevertheless, the efficiency of infrared spectrometry as a quantitative analysis method often depends on the choice of an adequate presentation for the sample. In this document, we shall demonstrate several techniques, such as diffuse reflectance and Attenuated Total Reflectance (ATR), which can be chosen according to the various physical states of the mixtures. The quantitative analysis of real samples from the food industry enables us to estimate its precision. For instance, the analysis of the three main components (glucose, fructose and maltose) in glucose syrups can be done (using ATR) with a precision in the region of 3% whereas the time required to obtain an analysis report is about 5 minutes. Finally, multicomponent quantitative analysis is quite feasible by mid-IR spectroscopy.
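
The PLS regression mentioned above predicts component concentrations from full spectra by extracting a few latent variables that covary with the response. A minimal PLS1 (NIPALS) sketch on synthetic "mixture spectra" (the sugar-syrup data are simulated, and this hand-rolled implementation is illustrative, not the authors' chemometrics software):

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 (NIPALS): returns coefficients b such that
    y_hat = (X - X_mean) @ b + y_mean for a new spectrum X."""
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                     # weight vector
        w = w / np.linalg.norm(w)
        t = Xk @ w                        # scores
        tt = t @ t
        p = Xk.T @ t / tt                 # X loadings
        q = (yk @ t) / tt                 # y loading
        Xk = Xk - np.outer(t, p)          # deflate
        yk = yk - q * t
        W.append(w)
        P.append(p)
        Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)

# Synthetic calibration: mixture spectra = concentrations @ pure spectra.
rng = np.random.default_rng(1)
pure = rng.random((3, 50))                 # 3 sugars x 50 wavenumbers
conc = rng.random((30, 3))                 # 30 calibration mixtures
spectra = conc @ pure + 1e-4 * rng.standard_normal((30, 50))
b = pls1_fit(spectra, conc[:, 0], n_components=3)
pred = (spectra - spectra.mean(axis=0)) @ b + conc[:, 0].mean()
r = float(np.corrcoef(pred, conc[:, 0])[0, 1])
print(round(r, 3))   # near-perfect recovery on this linear system
```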

  13. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    NASA Astrophysics Data System (ADS)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2016-07-01

    Drought monitoring and early warning is an important measure to enhance resilience towards drought. While there are numerous operational systems using different drought indicators, there is no consensus on which indicator best represents drought impact occurrence for any given sector. Furthermore, thresholds are widely applied in these indicators but, to date, little empirical evidence exists as to which indicator thresholds trigger impacts on society, the economy, and ecosystems. The main obstacle for evaluating commonly used drought indicators is a lack of information on drought impacts. Our aim was therefore to exploit text-based data from the European Drought Impact report Inventory (EDII) to identify indicators that are meaningful for region-, sector-, and season-specific impact occurrence, and to empirically determine indicator thresholds. In addition, we tested the predictability of impact occurrence based on the best-performing indicators. To achieve these aims we applied a correlation analysis and an ensemble regression tree approach, using Germany and the UK (the most data-rich countries in the EDII) as test beds. As candidate indicators we chose two meteorological indicators (Standardized Precipitation Index, SPI, and Standardized Precipitation Evaporation Index, SPEI) and two hydrological indicators (streamflow and groundwater level percentiles). The analysis revealed that accumulation periods of SPI and SPEI best linked to impact occurrence are longer for the UK compared with Germany, but there is variability within each country, among impact categories and, to some degree, seasons. The median of regression tree splitting values, which we regard as estimates of thresholds of impact occurrence, was around -1 for SPI and SPEI in the UK; distinct differences between northern/northeastern vs. southern/central regions were found for Germany. 
Predictions with the ensemble regression tree approach yielded reasonable results for regions with good impact data coverage.
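
    The notion of regression-tree splitting values as empirical impact thresholds can be sketched with a single decision-stump search. The SPI values and impact flags below are invented, not EDII data:

```python
def best_split(values, impact):
    """Return the split value minimising misclassification when
    'impact occurs' is predicted for indicator values below the split."""
    best, best_err = None, len(values) + 1
    for c in sorted(set(values)):
        err = sum((v < c) != hit for v, hit in zip(values, impact))
        if err < best_err:
            best, best_err = c, err
    return best

# monthly SPI values and whether a drought impact was reported (invented)
spi = [-2.1, -1.6, -1.2, -0.9, -0.4, 0.2, 0.8, 1.3]
impact = [True, True, True, False, False, False, False, False]

print("estimated impact threshold (SPI):", best_split(spi, impact))
```

    A random forest repeats this kind of split search over many bootstrapped samples and predictors; the study's reported thresholds are medians of such splitting values.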

  14. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    NASA Astrophysics Data System (ADS)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2015-09-01

    Drought monitoring and early warning is an important measure to enhance resilience towards drought. While there are numerous operational systems using different drought indicators, there is no consensus on which indicator best represents drought impact occurrence for any given sector. Furthermore, thresholds are widely applied in these indicators but, to date, little empirical evidence exists as to which indicator thresholds trigger impacts on society, the economy, and ecosystems. The main obstacle for evaluating commonly used drought indicators is a lack of information on drought impacts. Our aim was therefore to exploit text-based data from the European Drought Impact report Inventory (EDII) to identify indicators which are meaningful for region-, sector-, and season-specific impact occurrence, and to empirically determine indicator thresholds. In addition, we tested the predictability of impact occurrence based on the best-performing indicators. To achieve these aims we applied a correlation analysis and an ensemble regression tree approach ("random forest"), using Germany and the UK (the most data-rich countries in the EDII) as a test bed. As candidate indicators we chose two meteorological indicators (Standardized Precipitation Index (SPI) and Standardized Precipitation Evaporation Index (SPEI)) and two hydrological indicators. The analysis revealed that accumulation periods of SPI and SPEI best linked to impact occurrence are longer for the UK compared with Germany, but there is variability within each country, among impact categories and, to some degree, seasons. The median of regression tree splitting values, which we regard as estimates of thresholds of impact occurrence, was around -1 for SPI and SPEI in the UK; distinct differences between northern/northeastern vs. southern/central regions were found for Germany. Predictions with the ensemble regression tree approach yielded reasonable results for regions with good impact data coverage.

  15. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high-performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD for a single-user license. As construction of the nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL, including major design decisions; a demonstration of several test cases illustrating currently available quantitative tools for the analysis and characterization of multidimensional porous media image data sets; and plans for their future development.

  16. Effects of atrazine in fish, amphibians, and reptiles: an analysis based on quantitative weight of evidence.

    PubMed

    Van Der Kraak, Glen J; Hosmer, Alan J; Hanson, Mark L; Kloas, Werner; Solomon, Keith R

    2014-12-01

    A quantitative weight of evidence (WoE) approach was developed to evaluate studies used for regulatory purposes, as well as those in the open literature, that report the effects of the herbicide atrazine on fish, amphibians, and reptiles. The methodology for WoE analysis incorporated a detailed assessment of the relevance of the responses observed to apical endpoints directly related to survival, growth, development, and reproduction, as well as the strength and appropriateness of the experimental methods employed. Numerical scores were assigned for strength and relevance. The means of the scores for relevance and strength were then used to summarize and weigh the evidence for atrazine contributing to ecologically significant responses in the organisms of interest. The summary was presented graphically in a two-dimensional graph which showed the distributions of all the reports for a response. Over 1290 individual responses from studies in 31 species of fish, 32 amphibians, and 8 reptiles were evaluated. Overall, the WoE showed that atrazine might affect biomarker-type responses, such as expression of genes and/or associated proteins, concentrations of hormones, and biochemical processes (e.g. induction of detoxification responses), at concentrations sometimes found in the environment. However, these effects were not translated to adverse outcomes in terms of apical endpoints. The WoE approach provided a quantitative, transparent, reproducible, and robust framework that can be used to assist the decision-making process when assessing environmental chemicals. In addition, the process allowed easy identification of uncertainty and inconsistency in observations, and thus clearly identified areas where future investigations can be best directed.

  17. Quantitative analysis of locomotor defects in neonatal mice lacking proprioceptive feedback

    PubMed Central

    Dallman, Marisela A.; Ladle, David R.

    2013-01-01

    Proprioceptive feedback derived from specialized receptors in skeletal muscle is critical in forming an accurate map of limb position in space, and is used by the central nervous system to plan future movements and to determine accuracy of executed movements. Knockout mouse strains for genes expressed by proprioceptive sensory neurons have been generated that result in generalized motor deficits, but these deficits have not been quantitatively characterized. Here we characterize a conditional knockout mouse model where proprioceptive sensory neuron synaptic transmission has been blocked by selective ablation of munc18-1, a synaptic vesicle associated protein required for fusion of synaptic vesicles with the plasma membrane. Proprioceptive munc18-1 conditional mutants are impaired in surface righting—a dynamic postural adjustment task—and display several specific deficits in pivoting, an early locomotor behavior. Before the emergence of forward locomotion during postnatal development, animals explore their surroundings through pivoting, or rotating the upper torso around the relatively immobile base of the hind limbs. 3-D kinematic analysis was used to quantitatively describe this pivoting behavior at postnatal days 5 and 8 in control and munc18-1 conditional mutants. Mutant animals also pivot, but demonstrate alterations in movement strategy and in postural placement of the forelimbs during pivoting when compared to controls. In addition, brief forelimb stepping movements associated with pivoting are altered in mutant animals. Step duration and step height are increased in mutant animals. These results underscore the importance of proprioceptive feedback even at early stages in postnatal development. PMID:23911806

  18. Quantitative Nutrient Limitation Analysis of Global Forests by Remote Sensing

    NASA Astrophysics Data System (ADS)

    Lopez, A. M.; Badgley, G. M.; Field, C. B.

    2015-12-01

    Nutrient availability in terrestrial ecosystems may be the primary determinant of the long-term carbon storage capacity of vegetation. Both nutrient availability and carbon storage capacity are highly uncertain and limit our ability to predict atmospheric CO2 concentrations. Terrestrial vegetation, especially forests, plays a critical role in regulating the global carbon cycle and Earth's climate by sequestering carbon from the atmosphere. The broad relationship between nutrient availability and increased biomass production can be captured using remotely-sensed spectral information. We develop an approach to estimate total nutrient availability in 848 global forest sites at 1-km spatial resolution by combining the ecological principle of functional convergence with MODIS gross primary productivity (GPP) and evapotranspiration (ET) products from 2000-2013. Convergence in the relationship between maximum GPP and ET of nutrient-rich forests indicates that any sites deviating from this upper limit are associated with a lower availability of nutrients. This method offers a way to examine the severity, as well as the spatial extent of nutrient limitation at the global scale. We find that the degree to which forests are nutrient limited ranges from 0% to 81%, with an average limitation of 16 ± 17%. Our method agrees with regional nutrient gradients (i.e., the SW-NE Amazon gradient), but does not tightly correspond with recently published nutrient limitation classification standards (Fernandez-Martinez et al., 2014). A global terrestrial nutrient limitation map can assist in diagnosing the health of vegetation while removing the necessity for extensive field sampling or local nutrient addition experiments. Further research will expand the study sites to obtain a complete global terrestrial nutrient limitation map.

  19. Dried blood spot analysis of creatinine with LC-MS/MS in addition to immunosuppressants analysis.

    PubMed

    Koster, Remco A; Greijdanus, Ben; Alffenaar, Jan-Willem C; Touw, Daan J

    2015-02-01

    In order to monitor creatinine levels or to adjust the dosage of renally excreted or nephrotoxic drugs, the analysis of creatinine in dried blood spots (DBS) could be a useful addition to DBS analysis. We developed a LC-MS/MS method for the analysis of creatinine in the same DBS extract that was used for the analysis of tacrolimus, sirolimus, everolimus, and cyclosporine A in transplant patients with the use of Whatman FTA DMPK-C cards. The method was validated using three different strategies: a seven-point calibration curve using the intercept of the calibration to correct for the natural presence of creatinine in reference samples, a one-point calibration curve at an extremely high concentration in order to diminish the contribution of the natural presence of creatinine, and the use of creatinine-[(2)H3] with an eight-point calibration curve. The validated range for creatinine was 120 to 480 μmol/L (seven-point calibration curve), 116 to 7000 μmol/L (1-point calibration curve), and 1.00 to 400.0 μmol/L for creatinine-[(2)H3] (eight-point calibration curve). The precision and accuracy results for all three validations showed a maximum CV of 14.0% and a maximum bias of -5.9%. Creatinine in DBS was found stable at ambient temperature and 32 °C for 1 week and at -20 °C for 29 weeks. Good correlations were observed between patient DBS samples and routine enzymatic plasma analysis and showed the capability of the DBS method to be used as an alternative for creatinine plasma measurement.

  20. Mammographic quantitative image analysis and biologic image composition for breast lesion characterization and classification

    SciTech Connect

    Drukker, Karen; Giger, Maryellen L.; Li, Hui; Duewer, Fred; Malkov, Serghei; Joe, Bonnie; Kerlikowske, Karla; Shepherd, John A.; Flowers, Chris I.; Drukteinis, Jennifer S.

    2014-03-15

    Purpose: To investigate whether biologic image composition of mammographic lesions can improve upon existing mammographic quantitative image analysis (QIA) in estimating the probability of malignancy. Methods: The study population consisted of 45 breast lesions imaged with dual-energy mammography prior to breast biopsy, with final diagnosis resulting in 10 invasive ductal carcinomas, 5 ductal carcinomas in situ, 11 fibroadenomas, and 19 other benign diagnoses. Analysis was threefold: (1) the raw low-energy mammographic images were analyzed with an established in-house QIA method, “QIA alone,” (2) the three-compartment breast (3CB) composition measures—derived from the dual-energy mammography—of water, lipid, and protein thickness were assessed, “3CB alone,” and (3) information from QIA and 3CB was combined, “QIA + 3CB.” Analysis was initiated from radiologist-indicated lesion centers and was otherwise fully automated. Steps of the QIA and 3CB methods were lesion segmentation, characterization, and subsequent classification for malignancy in leave-one-case-out cross-validation. Performance assessment included box plots, Bland–Altman plots, and Receiver Operating Characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) for distinguishing between benign and malignant lesions (invasive and DCIS) was 0.81 (standard error 0.07) for the “QIA alone” method, 0.72 (0.07) for the “3CB alone” method, and 0.86 (0.04) for “QIA + 3CB” combined. The difference in AUC was 0.043 between “QIA + 3CB” and “QIA alone” but failed to reach statistical significance (95% confidence interval [-0.17 to +0.26]). Conclusions: In this pilot study analyzing the new 3CB imaging modality, knowledge of the composition of breast lesions and their periphery appeared additive in combination with existing mammographic QIA methods for the distinction between different benign and malignant lesion types.
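
    The reported AUCs can be connected to classifier scores with a short sketch using the rank-sum identity for the area under the ROC curve. The lesion scores below are invented, not the study's outputs:

```python
def auc(scores_pos, scores_neg):
    """P(score of a malignant case > score of a benign case),
    counting ties as 1/2 (the rank-sum / Mann-Whitney identity)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

malignant = [0.9, 0.8, 0.6, 0.4]        # classifier output, malignant lesions
benign = [0.7, 0.5, 0.45, 0.3, 0.2]     # classifier output, benign lesions

print(f"AUC = {auc(malignant, benign):.2f}")
```

    An AUC of 1.0 means every malignant score exceeds every benign score; 0.5 is chance, which is the scale on which the 0.81 vs. 0.86 comparison above is made.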

  1. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  2. Interlake production established using quantitative hydrocarbon well-log analysis

    SciTech Connect

    Lancaster, J.; Atkinson, A.

    1988-07-01

    Production was established in a new pay zone of the basal Interlake Formation adjacent to production in Midway field in Williams County, North Dakota. Hydrocarbon saturation, which was computed using hydrocarbon well-log (mud-log) data, and computed permeability encouraged the operator to run casing and test this zone. By use of drilling rig parameters, drilling mud properties, hydrocarbon-show data from the mud log, drilled rock and porosity descriptions, and wireline log porosity, this new technique computes oil saturation (percent of porosity) and permeability to the invading filtrate, using the Darcy equation. The Leonardo Fee well was drilled to test the Devonian Duperow, the Silurian upper Interlake, and the Ordovician Red River. The upper two objectives were penetrated downdip from Midway production and there were no hydrocarbon shows. It was determined that the Red River was tight, based on sample examination by well site personnel. The basal Interlake, however, liberated hydrocarbon shows that were analyzed by this new technology. The results of this evaluation accurately predicted this well would be a commercial success when placed in production. Where geophysical log analysis might be questionable, this new evaluation technique may provide answers to anticipated oil saturation and producibility. The encouraging results of hydrocarbon saturation and permeability, produced by this technique, may be largely responsible for the well being in production today.

  3. Quantitative analysis of American woodcock nest and brood habitat

    USGS Publications Warehouse

    Bourgeois, A.; Keppie, Daniel M.; Owen, Ray B.

    1977-01-01

    Sixteen nest and 19 brood sites of American woodcock (Philohela minor) were examined in northern lower Michigan between 15 April and 15 June 1974 to determine habitat structure associated with these sites. Woodcock hens utilized young, second-growth forest stands which were similar in species composition for both nesting and brood rearing. A multivariate discriminant function analysis revealed a significant (P < 0.05) difference, however, in habitat structure. Nest habitat was characterized by lower tree density (2176 trees/ha) and basal area (8.6 m2/ha), by being close to forest openings (7 m), and by being situated on dry, relatively well-drained sites. In contrast, woodcock broods were located in sites that had nearly twice the tree density (3934 trees/ha) and basal area (16.5 m2/ha), were located over twice as far from forest openings (18 m), and generally occurred on damp sites, near (8 m) standing water. The importance of these habitat features to the species and possible management implications are discussed.

  4. Mechanistic insights from a quantitative analysis of pollen tube guidance

    PubMed Central

    2010-01-01

    Background Plant biologists have long speculated about the mechanisms that guide pollen tubes to ovules. Although there is now evidence that ovules emit a diffusible attractant, little is known about how this attractant mediates interactions between the pollen tube and the ovules. Results We employ a semi-in vitro assay, in which ovules dissected from Arabidopsis thaliana are arranged around a cut style on artificial medium, to elucidate how ovules release the attractant and how pollen tubes respond to it. Analysis of microscopy images of the semi-in vitro system shows that pollen tubes are more attracted to ovules that are incubated on the medium for longer times before pollen tubes emerge from the cut style. The responses of tubes are consistent with their sensing a gradient of an attractant at 100-150 μm, farther than previously reported. Our microscopy images also show that pollen tubes slow their growth near the micropyles of functional ovules with a spatial range that depends on ovule incubation time. Conclusions We propose a stochastic model that captures these dynamics. In the model, a pollen tube senses a difference in the fraction of receptors bound to an attractant and changes its direction of growth in response; the attractant is continuously released from ovules and spreads isotropically on the medium. The model suggests that the observed slowing greatly enhances the ability of pollen tubes to successfully target ovules. The relation of the results to guidance in vivo is discussed. PMID:20170550

  5. Quantitative analysis of phenol oxidase activity in insect hemolymph.

    PubMed

    Sorrentino, Richard Paul; Small, Chiyedza N; Govind, Shubha

    2002-04-01

    We describe a simple, inexpensive, and robust protocol for the quantification of phenol oxidase activity in insect hemolymph. Discrete volumes of hemolymph from Drosophila melanogaster larvae are applied to pieces of filter paper soaked in an L-3,4-dihydroxyphenylalanine (L-DOPA) solution. Phenol oxidase present in the samples catalyzes melanin synthesis from the L-DOPA precursor, resulting in the appearance of a roughly circular melanized spot on the filter paper. The filter paper is then scanned and analyzed with image-processing software. Each pixel in an image is assigned a grayscale value. The mean of the grayscale values for a circular region of pixels at the center of the image of each spot is used to compute a melanization index (MI) value; the computation is based on a comparison to an external standard (India ink). Numerical MI values for control and experimental larvae can then be pooled and subjected to statistical analysis. This protocol was used to evaluate phenol oxidase activity in larvae of different backgrounds: wild-type, lozenge, hopscotch(Tumorous-lethal) (which induces the formation of large melanotic tumors), and body-color mutations ebony and yellow. Our results demonstrate that this assay is sensitive enough for use in genetic screens with D. melanogaster and could conceivably be used for evaluation of MI from hemolymph of other insects.
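
    The melanization-index computation described above can be sketched as follows. The pixel values, spot geometry, and the normalization against the India ink standard are invented stand-ins, since the abstract does not give the exact formula:

```python
def mean_circle(img, cx, cy, radius):
    """Mean grayscale value of pixels within 'radius' of (cx, cy)."""
    vals = [img[y][x]
            for y in range(len(img))
            for x in range(len(img[0]))
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
    return sum(vals) / len(vals)

# 5x5 synthetic scan: darker (lower grayscale) towards the spot centre
img = [
    [250, 240, 230, 240, 250],
    [240, 120, 100, 120, 240],
    [230, 100,  60, 100, 230],
    [240, 120, 100, 120, 240],
    [250, 240, 230, 240, 250],
]
INK_STANDARD = 40.0  # hypothetical mean grayscale of the India ink reference

spot = mean_circle(img, 2, 2, 1)
# one plausible normalization: how close the spot is to the ink standard
mi = INK_STANDARD / spot
print(f"mean spot grayscale: {spot:.1f}, melanization index: {mi:.2f}")
```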

  6. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
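
    The prediction sub-task with k-Nearest Neighbours can be sketched in a few lines; the "spectra" and concentrations below are invented stand-ins for reduced Raman data:

```python
def knn_predict(train, query, k=3):
    """Average the concentrations of the k training spectra closest
    (Euclidean distance) to the query spectrum."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    return sum(conc for _, conc in nearest) / k

# (reduced spectrum, known cocaine concentration in %) training pairs
train = [
    ([0.1, 0.9, 0.2], 5.0),
    ([0.2, 0.8, 0.3], 10.0),
    ([0.5, 0.5, 0.5], 25.0),
    ([0.8, 0.2, 0.7], 40.0),
    ([0.9, 0.1, 0.8], 50.0),
]

query = [0.15, 0.85, 0.25]  # unseen mixture spectrum
print("predicted concentration:", knn_predict(train, query))
```

    In the paper's setting the feature vectors would first pass through the Genetic-Algorithm data-reduction step, precisely because raw spectra have far more attributes than there are samples.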

  7. Quantitative analysis of a transportable matter-wave gravimeter

    NASA Astrophysics Data System (ADS)

    Desruelle, B.; Le Moigne, N.; Bonvalot, S.; Menoret, V.; Vermeulen, P.; Merlet, S.

    2015-12-01

    This paper summarizes the latest results obtained with our second-generation Absolute Quantum Gravimeter (AQG). This instrument relies on the utilization of advanced matter-wave interferometry techniques, which allow us to precisely characterize the vertical acceleration experienced by a cloud of cold atoms over a free-fall of 10 cm. A significant research effort was conducted over the last months to optimize the instrument sensitivity as well as the rejection of ground vibrations, and we will present the technological solutions that were selected to meet our objectives. We will then present a detailed review of the characterizations performed with this instrument. These data show a very satisfactory sensitivity of the AQG (2 μGal standard deviation after 1000 s of data integration) and very robust behavior against ground vibrations. We will also present a detailed analysis of the long-term behavior of the instrument. These results clearly demonstrate the high potential of matter-wave gravimeters for high-performance absolute gravity measurements. Finally, we will discuss the research activities we are conducting to develop a field version of this instrument.

  8. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  9. Direct Quantitative Analysis of Arsenic in Coal Fly Ash

    PubMed Central

    Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi

    2012-01-01

    A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s−1, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836
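
    The linear calibration step can be sketched as an ordinary least-squares fit with a correlation coefficient; the absorbance readings below are invented, not the NIST1633b measurements:

```python
def linfit(x, y):
    """Least-squares slope, intercept, and correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5

arsenic_ng = [0.0, 2.0, 4.0, 6.0, 8.0]       # As mass introduced (hypothetical)
absorbance = [0.01, 0.11, 0.20, 0.32, 0.41]  # peak absorbance (hypothetical)

slope, intercept, r = linfit(arsenic_ng, absorbance)
print(f"slope={slope:.4f}, intercept={intercept:.3f}, r={r:.4f}")
```

    An unknown sample's arsenic content is then read off as (absorbance - intercept) / slope, which is how a calibration plot of this kind is used in practice.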

  10. Quantitative Analysis of the Microstructure of Auxetic Foams

    SciTech Connect

    Gaspar, N.; Smith, C.W.; Miller, E.A.; Seidler, G.T.; Evans, K.E.

    2008-07-28

    The auxetic foams first produced by Lakes have been modelled in a variety of ways, each model trying to reproduce some observed feature of the microscale of the foams. Such features include bent or broken ribs or inverted angles between ribs. These models can reproduce the Poisson's ratio or Poisson's function of auxetic foam if the model parameters are carefully chosen. However these model parameters may not actually reflect the internal structure of the foams. A big problem is that measurement of parameters such as lengths and angles is not straightforward within a 3-d sample. In this work a sample of auxetic foam has been imaged by 3-d X-ray computed tomography. The resulting image is translated to a form that emphasises the geometrical structure of connected ribs. This connected rib data are suitably analysed to describe both the microstructural construction of auxetic foams and the statistical spread of structure, that is, the heterogeneity of an auxetic foam. From the analysis of the microstructure, observations are made about the requirements for microstructural models and comparisons made to previous existing models. From the statistical data, measures of heterogeneity are made that will help with future modelling that includes the heterogeneous aspect of auxetic foams.

  11. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input was not transferred between images. But recently there has been increasing interest in using machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  12. The Quantitative Analysis of the Rotational Spectrum of NCNCS

    NASA Astrophysics Data System (ADS)

    Winnewisser, Manfred; Winnewisser, Brenda P.; Medvedev, Ivan R.; De Lucia, Frank C.; Ross, Stephen C.; Koput, Jacek

    2009-06-01

    The analysis of the rotational data which were the basis of our two previous publications about NCNCS as an example of quantum monodromy has been completed, and the data extended to include the 6th excited state of the quasilinear bending mode. This talk will present the results of fitting the data with the GSRB Hamiltonian, which provides structural and potential parameters. Ab initio calculations contributed some parameters that could not be determined from the data. The predicted variation of the expectation value of ρ, which is the complement of the CNC angle, and of the electric dipole transition moment, upon rovibrational excitation indicate the mapping of monodromy in the potential function into these properties of the molecule. B. P. Winnewisser, M. Winnewisser, I. R. Medvedev, M. Behnke, F. C. De Lucia, S. C. Ross, and J. Koput, Phys. Rev. Lett. 95, 243002 (2005). M. Winnewisser, B. P. Winnewisser, I. R. Medvedev, F. C. De Lucia, S. C. Ross, and L. M. Bates, J. Mol. Struct. 798, 1-26 (2006).

  13. Segmentation of vascular structures and hematopoietic cells in 3D microscopy images and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mu, Jian; Yang, Lin; Kamocka, Malgorzata M.; Zollman, Amy L.; Carlesso, Nadia; Chen, Danny Z.

    2015-03-01

    In this paper, we present image processing methods for quantitative study of changes in the bone marrow microenvironment (characterized by altered vascular structure and hematopoietic cell distribution) caused by diseases or various factors. We develop algorithms that automatically segment vascular structures and hematopoietic cells in 3-D microscopy images, perform quantitative analysis of the properties of the segmented vascular structures and cells, and examine how such properties change. In processing images, we apply local thresholding to segment vessels, and add post-processing steps to deal with imaging artifacts. We propose an improved watershed algorithm that relies on both intensity and shape information and can separate multiple overlapping cells better than common watershed methods. We then quantitatively compute various features of the vascular structures and hematopoietic cells, such as the branches and sizes of vessels and the distribution of cells. In analyzing vascular properties, we provide algorithms for pruning spurious vessel segments and branches based on vessel skeletons. Our algorithms can segment vascular structures and hematopoietic cells with good quality. We use our methods to quantitatively examine the changes in the bone marrow microenvironment caused by deletion of the Notch pathway. Our quantitative analysis reveals property changes in samples with a deleted Notch pathway. Our tool is useful for biologists to quantitatively measure changes in the bone marrow microenvironment and to develop possible therapeutic strategies that help the bone marrow microenvironment recover.

  14. Hemato-critical issues in quantitative analysis of dried blood spots: challenges and solutions.

    PubMed

    De Kesel, Pieter Mm; Sadones, Nele; Capiau, Sara; Lambert, Willy E; Stove, Christophe P

    2013-08-01

    Dried blood spot (DBS) sampling for quantitative determination of drugs in blood has entered the bioanalytical arena at a fast pace during the last decade, primarily owing to progress in analytical instrumentation. Despite the many advantages associated with this new sampling strategy, several issues remain, of which the hematocrit issue is undoubtedly the most widely discussed challenge, since strongly deviating hematocrit values may significantly impact DBS-based quantitation. In this review, an overview is given of the different aspects of the 'hematocrit problem' in quantitative DBS analysis. The different strategies that try to cope with this problem are discussed, along with their potential and limitations. Implementation of some of these strategies in practice may help to overcome this important hurdle in DBS assays, further allowing DBS to become an established part of routine quantitative bioanalysis.

  15. Combination of quantitative analysis and chemometric analysis for the quality evaluation of three different frankincenses by ultra high performance liquid chromatography and quadrupole time of flight mass spectrometry.

    PubMed

    Zhang, Chao; Sun, Lei; Tian, Run-tao; Jin, Hong-yu; Ma, Shuang-Cheng; Gu, Bing-ren

    2015-10-01

    Frankincense has gained increasing attention in the pharmaceutical industry because of its pharmacologically active components such as boswellic acids. However, the identification and overall quality evaluation of the three frankincense species listed in different Pharmacopeias and in the literature have rarely been reported. In this paper, quantitative analysis and chemometric evaluation were established and applied for the quality control of frankincense, with the quantitative and chemometric analyses conducted under the same analytical conditions. In total, 55 samples from four habitats (three species) of frankincense were collected, and six boswellic acids were chosen for quantitative analysis. Chemometric analyses such as similarity analysis, hierarchical cluster analysis, and principal component analysis were used to identify the three frankincense species and to reveal the correlation between components and species. In addition, 12 chromatographic peaks were tentatively identified using reference substances and quadrupole time-of-flight mass spectrometry. The results indicated that the total boswellic acid profiles of the three frankincense species are similar and that their fingerprints can be used to differentiate between them. PMID:26228790
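A minimal principal component analysis of the kind used to separate species from component profiles can be done with an SVD. The 6-column matrix below is synthetic, mimicking six boswellic-acid contents per sample for two hypothetical species:

```python
import numpy as np

# Synthetic component profiles: 10 samples each of two "species",
# six measured components per sample (made-up means, small noise).
rng = np.random.default_rng(42)
species_a = rng.normal([5, 3, 2, 1, 4, 2], 0.2, size=(10, 6))
species_b = rng.normal([2, 5, 4, 3, 1, 1], 0.2, size=(10, 6))
X = np.vstack([species_a, species_b])

Xc = X - X.mean(axis=0)                    # mean-center before PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                         # principal-component scores
explained = S**2 / np.sum(S**2)            # variance fraction per component
print(f"PC1 explains {explained[0]:.0%} of variance")
```

With well-separated group means, PC1 carries the between-species contrast, so the two groups fall on opposite sides of zero along it.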

  16. Quantitative flux analysis reveals folate-dependent NADPH production

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Ye, Jiangbin; Kamphorst, Jurre J.; Shlomi, Tomer; Thompson, Craig B.; Rabinowitz, Joshua D.

    2014-06-01

    ATP is the dominant energy source in animals for mechanical and electrical work (for example, muscle contraction or neuronal firing). For chemical work, there is an equally important role for NADPH, which powers redox defence and reductive biosynthesis. The most direct route to produce NADPH from glucose is the oxidative pentose phosphate pathway, with malic enzyme sometimes also important. Although the relative contribution of glycolysis and oxidative phosphorylation to ATP production has been extensively analysed, similar analysis of NADPH metabolism has been lacking. Here we demonstrate the ability to directly track, by liquid chromatography-mass spectrometry, the passage of deuterium from labelled substrates into NADPH, and combine this approach with carbon labelling and mathematical modelling to measure NADPH fluxes. In proliferating cells, the largest contributor to cytosolic NADPH is the oxidative pentose phosphate pathway. Surprisingly, a nearly comparable contribution comes from serine-driven one-carbon metabolism, in which oxidation of methylene tetrahydrofolate to 10-formyl-tetrahydrofolate is coupled to reduction of NADP+ to NADPH. Moreover, tracing of mitochondrial one-carbon metabolism revealed complete oxidation of 10-formyl-tetrahydrofolate to make NADPH. As folate metabolism has not previously been considered an NADPH producer, confirmation of its functional significance was undertaken through knockdown of methylenetetrahydrofolate dehydrogenase (MTHFD) genes. Depletion of either the cytosolic or mitochondrial MTHFD isozyme resulted in decreased cellular NADPH/NADP+ and reduced/oxidized glutathione ratios (GSH/GSSG) and increased cell sensitivity to oxidative stress. Thus, although the importance of folate metabolism for proliferating cells has been long recognized and attributed to its function of producing one-carbon units for nucleic acid synthesis, another crucial function of this pathway is generating reducing power.

  17. Quantitative ultrasound texture analysis for clinical decision making support

    NASA Astrophysics Data System (ADS)

    Wu, Jie Ying; Beland, Michael; Konrad, Joseph; Tuomi, Adam; Glidden, David; Grand, David; Merck, Derek

    2015-03-01

    We propose a general ultrasound (US) texture-analysis and machine-learning framework for detecting the presence of disease that is suitable for clinical application across clinicians, disease types, devices, and operators. Its stages are image selection, image filtering, ROI selection, feature parameterization, and classification. Each stage is modular and can be replaced with alternate methods. Thus, this framework is adaptable to a wide range of tasks. Our two preliminary clinical targets are hepatic steatosis and adenomyosis diagnosis. For steatosis, we collected US images from 288 patients and their pathology-determined values of steatosis (%) from biopsies. Two radiologists independently reviewed all images and identified the region of interest (ROI) most representative of the hepatic echotexture for each patient. To parameterize the images into comparable quantities, we filter the US images at multiple scales for various texture responses. For each response, we collect a histogram of pixel features within the ROI, and parameterize it as a Gaussian function using its mean, standard deviation, kurtosis, and skew to create a 36-feature vector. Our algorithm uses a support vector machine (SVM) for classification. Using a threshold of 10%, we achieved 72.81% overall accuracy, 76.18% sensitivity, and 65.96% specificity in identifying steatosis with leave-ten-out cross-validation (p<0.0001). Extending this framework to adenomyosis, we identified 38 patients with MR-confirmed findings of adenomyosis and previous US studies and 50 controls. A single rater picked the best US-image and ROI for each case. Using the same processing pipeline, we obtained 76.14% accuracy, 86.00% sensitivity, and 63.16% specificity with leave-one-out cross-validation (p<0.0001).
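The classification stage of such a pipeline can be sketched with scikit-learn. The 36-dimensional feature vectors below are synthetic stand-ins (random draws, not real texture features), and 12-fold cross-validation approximates the leave-ten-out scheme:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for 36-feature texture vectors (e.g. mean, std,
# kurtosis, skew over several filter responses); labels: disease yes/no.
X_healthy = rng.normal(0.0, 1.0, size=(60, 36))
X_disease = rng.normal(0.7, 1.0, size=(60, 36))
X = np.vstack([X_healthy, X_disease])
y = np.array([0] * 60 + [1] * 60)

# 12-fold CV on 120 samples leaves out 10 samples per fold.
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=12)
print(f"mean CV accuracy: {scores.mean():.2f}")
```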

  18. Quantitative proteomic analysis of A549 cells infected with human respiratory syncytial virus.

    PubMed

    Munday, Diane C; Emmott, Edward; Surtees, Rebecca; Lardeau, Charles-Hugues; Wu, Weining; Duprex, W Paul; Dove, Brian K; Barr, John N; Hiscox, Julian A

    2010-11-01

    Human respiratory syncytial virus (HRSV) is a major cause of pediatric lower respiratory tract disease to which there is no vaccine or efficacious chemotherapeutic strategy. Although RNA synthesis and virus assembly occur in the cytoplasm, HRSV is known to induce nuclear responses in the host cell as replication alters global gene expression. Quantitative proteomics was used to take an unbiased overview of the protein changes in transformed human alveolar basal epithelial cells infected with HRSV. Underpinning this was the use of stable isotope labeling with amino acids in cell culture coupled to LC-MS/MS, which allowed the direct and simultaneous identification and quantification of both cellular and viral proteins. To reduce sample complexity and increase data return on potential protein localization, cells were fractionated into nuclear and cytoplasmic extracts. This resulted in the identification of 1,140 cellular proteins and six viral proteins. The proteomics data were analyzed using Ingenuity Pathways Analysis to identify defined canonical pathways and functional groupings. Selected data were validated using Western blot, direct and indirect immunofluorescence confocal microscopy, and functional assays. The study served to validate and expand upon known HRSV-host cell interactions, including those associated with the antiviral response and alterations in subnuclear structures such as the nucleolus and ND10 (promyelocytic leukemia bodies). In addition, novel changes were observed in mitochondrial proteins and functions, cell cycle regulatory molecules, nuclear pore complex proteins and nucleocytoplasmic trafficking proteins. These data shed light into how the cell is potentially altered to create conditions more favorable for infection. Additionally, the study highlights the application and advantage of stable isotope labeling with amino acids in cell culture coupled to LC-MS/MS for the analysis of virus-host interactions.

  19. Quantitative Expression Analysis in Brassica napus by Northern Blot Analysis and Reverse Transcription-Quantitative PCR in a Complex Experimental Setting

    PubMed Central

    Rumlow, Annekathrin; Keunen, Els; Klein, Jan; Pallmann, Philip; Riemenschneider, Anja; Cuypers, Ann

    2016-01-01

    Analysis of gene expression is one of the major ways to better understand plant reactions to changes in environmental conditions. The comparison of many different factors influencing plant growth challenges gene expression analysis in specific gene-targeted experiments, especially with regard to the choice of suitable reference genes. The aim of this study is to compare expression results obtained by Northern blot, semi-quantitative PCR, and RT-qPCR, and to identify a reliable set of reference genes for oilseed rape (Brassica napus L.) suitable for comparing gene expression under complex experimental conditions. We investigated the influence of several factors, such as sulfur deficiency, different time points during the day, varying light conditions, and their interaction, on gene expression in oilseed rape plants. The expression of selected reference genes was indeed influenced under these conditions in different ways. Therefore, a recently developed algorithm, called GrayNorm, was applied to validate a set of reference genes for normalizing results obtained by Northern blot analysis. After careful comparison of the three methods mentioned above, Northern blot analysis seems to be a reliable and cost-effective alternative for gene expression analysis under a complex growth regime. To use this method quantitatively, a number of reference genes were validated, revealing that for our experiment a set of three references provides appropriate normalization. Semi-quantitative PCR was prone to many handling errors and difficult to control, while RT-qPCR was very sensitive to expression fluctuations of the reference genes. PMID:27685087
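The basic idea of multi-reference normalization (shown here as a plain geometric-mean scheme, not the GrayNorm algorithm itself) is to divide a target gene's signal by the geometric mean of several validated reference genes. The expression values below are arbitrary:

```python
import math

# Illustrative expression values in arbitrary units (invented numbers).
target = 1200.0
references = [800.0, 1000.0, 1250.0]   # three validated reference genes

# Normalization factor = geometric mean of the reference signals.
geo_mean = math.prod(references) ** (1.0 / len(references))
normalized = target / geo_mean
print(f"normalization factor: {geo_mean:.1f}, "
      f"normalized expression: {normalized:.3f}")
```

Using several references instead of one dampens the effect of any single reference gene fluctuating with the experimental conditions.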

  20. Analysis of liver connexin expression using reverse transcription quantitative real-time polymerase chain reaction

    PubMed Central

    Maes, Michaël; Willebrords, Joost; Crespo Yanguas, Sara; Cogliati, Bruno; Vinken, Mathieu

    2016-01-01

    Summary Although connexin production is mainly regulated at the protein level, altered connexin gene expression has been identified as the underlying mechanism of several pathologies. When studying the latter, appropriate methods to quantify connexin mRNA levels are required. The present chapter describes a well-established reverse transcription quantitative real-time polymerase chain reaction procedure optimized for analysis of hepatic connexins. The method includes RNA extraction and subsequent quantification, generation of complementary DNA, quantitative real-time polymerase chain reaction and data analysis. PMID:27207283

  1. Analysis of Liver Connexin Expression Using Reverse Transcription Quantitative Real-Time Polymerase Chain Reaction.

    PubMed

    Maes, Michaël; Willebrords, Joost; Crespo Yanguas, Sara; Cogliati, Bruno; Vinken, Mathieu

    2016-01-01

    Although connexin production is mainly regulated at the protein level, altered connexin gene expression has been identified as the underlying mechanism of several pathologies. When studying the latter, appropriate methods to quantify connexin RNA levels are required. The present chapter describes a well-established reverse transcription quantitative real-time polymerase chain reaction procedure optimized for analysis of hepatic connexins. The method includes RNA extraction and subsequent quantification, generation of complementary DNA, quantitative real-time polymerase chain reaction, and data analysis. PMID:27207283

  2. Scattering influences in quantitative fission neutron radiography for the in situ analysis of hydrogen distribution in metal hydrides

    NASA Astrophysics Data System (ADS)

    Börries, S.; Metz, O.; Pranzas, P. K.; Bücherl, T.; Söllradl, S.; Dornheim, M.; Klassen, T.; Schreyer, A.

    2015-10-01

    In situ neutron radiography allows for the time-resolved study of hydrogen distribution in metal hydrides. However, for a precise quantitative investigation of a time-dependent hydrogen content within a host material, an exact knowledge of the corresponding attenuation coefficient is necessary. Additionally, the effect of scattering has to be considered as it is known to violate Beer's law, which is used to determine the amount of hydrogen from a measured intensity distribution. Within this study, we used a metal hydride inside two different hydrogen storage tanks as host systems, consisting of steel and aluminum. The neutron beam attenuation by hydrogen was investigated in these two different setups during the hydrogen absorption process. A linear correlation to the amount of absorbed hydrogen was found, allowing for a readily quantitative investigation. Further, an analysis of scattering contributions on the measured intensity distributions was performed and is described in detail.
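The quantification step rests on Beer's law, I = I0 · exp(−μ_host·t_host − μ_H·t_H), inverted for the hydrogen term. The coefficients below are illustrative placeholders, not values from the study:

```python
import math

# Assumed attenuation parameters (illustrative, not from the paper).
mu_host = 0.1   # host-material attenuation coefficient (1/cm)
t_host = 1.0    # host thickness (cm)
mu_H = 3.0      # effective hydrogen attenuation coefficient (1/cm)

def hydrogen_thickness(I_over_I0):
    """Invert Beer's law for the hydrogen-equivalent thickness."""
    return (-math.log(I_over_I0) - mu_host * t_host) / mu_H

t_H = hydrogen_thickness(0.5)   # measured transmission of 50%
print(f"hydrogen-equivalent thickness: {t_H:.4f} cm")
```

Scattering, as the abstract notes, violates this simple exponential model, which is why the measured linear correlation to absorbed hydrogen had to be established empirically.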

  3. Persistence of Low Pathogenic Influenza A Virus in Water: A Systematic Review and Quantitative Meta-Analysis

    PubMed Central

    Dalziel, Antonia E.; Delean, Steven; Heinrich, Sarah; Cassey, Phillip

    2016-01-01

    Avian influenza viruses are able to persist in the environment, in-between the transmission of the virus among its natural hosts. Quantifying the environmental factors that affect the persistence of avian influenza virus is important for influencing our ability to predict future outbreaks and target surveillance and control methods. We conducted a systematic review and quantitative meta-analysis of the environmental factors that affect the decay of low pathogenic avian influenza virus (LPAIV) in water. Abiotic factors affecting the persistence of LPAIV have been investigated for nearly 40 years, yet published data was produced by only 26 quantitative studies. These studies have been conducted by a small number of principal authors (n = 17) and have investigated a narrow range of environmental conditions, all of which were based in laboratories with limited reflection of natural conditions. The use of quantitative meta-analytic techniques provided the opportunity to assess persistence across a greater range of conditions than each individual study can achieve, through the estimation of mean effect-sizes and relationships among multiple variables. Temperature was the most influential variable, for both the strength and magnitude of the effect-size. Moderator variables explained a large proportion of the heterogeneity among effect-sizes. Salinity and pH were important factors, although future work is required to broaden the range of abiotic factors examined, as well as including further diurnal variation and greater environmental realism generally. We were unable to extract a quantitative effect-size estimate for approximately half (50.4%) of the reported experimental outcomes and we strongly recommend a minimum set of quantitative reporting to be included in all studies, which will allow robust assimilation and analysis of future findings. In addition we suggest possible means of increasing the applicability of future studies to the natural environment, and
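The core calculation of such a meta-analysis, an inverse-variance weighted mean effect size, can be sketched as follows (fixed-effect form for brevity; the effect sizes and variances are invented):

```python
import numpy as np

# Invented per-study effect sizes (e.g. log decay rates) and variances.
effects = np.array([-0.12, -0.30, -0.22, -0.45, -0.18])
variances = np.array([0.02, 0.05, 0.03, 0.08, 0.02])

# Fixed-effect pooling: weight each study by the inverse of its variance.
w = 1.0 / variances
mean_effect = np.sum(w * effects) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled effect: {mean_effect:.3f} +/- {1.96 * se:.3f}")
```

A random-effects model, more appropriate when heterogeneity among studies is large (as reported here), adds a between-study variance component to each weight.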

  4. Analysis of mixed cell cultures with quantitative digital holographic phase microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Wibbeling, Jana; Ketelhut, Steffi

    2014-05-01

    In order to study, for example, the influence of pharmaceuticals or pathogens on different cell types under identical measurement conditions, and to analyze interactions between different cellular specimens, a minimally invasive quantitative observation of mixed cell cultures is of particular interest. Quantitative phase microscopy (QPM) provides high resolution detection of optical path length changes and is suitable for stain-free, minimally invasive live cell analysis. Owing to the low light intensities used for object illumination, QPM minimizes interaction with the sample and is particularly suitable for long-term time-lapse investigations, e.g., for the detection of cell morphology alterations due to drugs and toxins. Furthermore, QPM has been demonstrated to be a versatile tool for the quantification of cellular growth, the extraction of morphological parameters, and the assessment of cell motility. We studied the feasibility of QPM for the analysis of mixed cell cultures, exploring whether quantitative phase images provide sufficient information to distinguish between different cell types and to extract cell-specific parameters. For the experiments, quantitative phase imaging with digital holographic microscopy (DHM) was utilized. Mixed cell cultures with different types of human pancreatic tumor cells were observed with quantitative DHM phase contrast for up to 35 h. The obtained series of quantitative phase images were evaluated by adapted algorithms for image segmentation. From the segmented images, the cellular dry mass and the mean cell thickness were calculated and used in the further analysis as parameters to quantify the reliability of the measurement principle. The obtained results demonstrate that it is possible to separately characterize the growth of cell types with different morphologies in a mixed cell culture by considering specimen size and cell thickness in the evaluation of quantitative DHM phase images.
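Cellular dry mass is conventionally computed from a segmented quantitative phase image as DM = λ/(2π·α) · Σφ · A_pixel, with α the specific refraction increment (≈ 0.2 mL/g for typical cellular material). The phase map and pixel pitch below are synthetic:

```python
import numpy as np

# Assumed optical parameters (illustrative values).
wavelength = 532e-9          # m
alpha = 2.0e-4               # m^3/kg, i.e. 0.2 mL/g refraction increment
pixel_area = (0.1e-6) ** 2   # m^2, assuming a 0.1 um pixel pitch

# Synthetic phase map: 1.5 rad inside a segmented square "cell", 0 outside.
phase = np.zeros((64, 64))
phase[16:48, 16:48] = 1.5

# Dry mass = lambda / (2*pi*alpha) * integral of phase over the cell area.
dry_mass = wavelength / (2 * np.pi * alpha) * phase.sum() * pixel_area
print(f"dry mass: {dry_mass * 1e12:.2f} pg")
```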

  5. Quantitative Computed Tomography Protocols Affect Material Mapping and Quantitative Computed Tomography-Based Finite-Element Analysis Predicted Stiffness.

    PubMed

    Giambini, Hugo; Dragomir-Daescu, Dan; Nassr, Ahmad; Yaszemski, Michael J; Zhao, Chunfeng

    2016-09-01

    Quantitative computed tomography-based finite-element analysis (QCT/FEA) has become increasingly popular in an attempt to understand and possibly reduce vertebral fracture risk. It is known that scanning acquisition settings affect Hounsfield units (HU) of the CT voxels. Material properties assignments in QCT/FEA, relating HU to Young's modulus, are performed by applying empirical equations. The purpose of this study was to evaluate the effect of QCT scanning protocols on predicted stiffness values from finite-element models. One fresh frozen cadaveric torso and a QCT calibration phantom were scanned six times varying voltage and current and reconstructed to obtain a total of 12 sets of images. Five vertebrae from the torso were experimentally tested to obtain stiffness values. QCT/FEA models of the five vertebrae were developed for the 12 image data resulting in a total of 60 models. Predicted stiffness was compared to the experimental values. The highest percent difference in stiffness was approximately 480% (80 kVp, 110 mAs, U70), while the lowest outcome was ∼1% (80 kVp, 110 mAs, U30). There was a clear distinction between reconstruction kernels in predicted outcomes, whereas voltage did not present a clear influence on results. The potential of QCT/FEA as an improvement to conventional fracture risk prediction tools is well established. However, it is important to establish research protocols that can lead to results that can be translated to the clinical setting. PMID:27428281
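The material-mapping step the abstract refers to, HU to density via phantom calibration, then density to Young's modulus via an empirical power law, can be sketched as below. All coefficients are placeholders, not the study's values:

```python
# Hedged sketch of QCT/FEA material mapping; coefficients are assumed.
def hu_to_density(hu, slope=0.0007, intercept=0.0):
    """Phantom-calibrated linear map from HU to density (g/cm^3)."""
    return slope * hu + intercept

def density_to_modulus(rho, a=8500.0, b=2.3):
    """Empirical power law E = a * rho^b, in MPa."""
    return a * rho ** b

for hu in (200, 600, 1000):
    rho = hu_to_density(hu)
    print(f"HU={hu}: rho={rho:.3f} g/cm^3, "
          f"E={density_to_modulus(rho):.1f} MPa")
```

Because the HU values themselves shift with voltage and reconstruction kernel, the same empirical equations yield different moduli, which is the mechanism behind the stiffness discrepancies the study reports.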

  6. Quantitative underwater 3D motion analysis using submerged video cameras: accuracy analysis and trajectory reconstruction.

    PubMed

    Silvatti, Amanda P; Cerveri, Pietro; Telles, Thiago; Dias, Fábio A S; Baroni, Guido; Barros, Ricardo M L

    2013-01-01

    In this study we aim at investigating the applicability of underwater 3D motion capture based on submerged video cameras in terms of 3D accuracy analysis and trajectory reconstruction. Static points with the classical direct linear transform (DLT) solution, a moving wand with bundle adjustment, and a moving 2D plate with Zhang's method were considered for camera calibration. As an example of the final application, we reconstructed the hand motion trajectories in different swimming styles and qualitatively compared them with Maglischo's model. Four highly trained male swimmers performed butterfly, breaststroke and freestyle tasks. The middle fingertip trajectories of both hands in the underwater phase were considered. The accuracy (mean absolute error) of the two calibration approaches (wand: 0.96 mm; 2D plate: 0.73 mm) was comparable to out-of-water results and far superior to the classical DLT results (9.74 mm). Among all the swimmers, the hand trajectories of the expert swimmer in each style were almost symmetric and in good agreement with Maglischo's model. The kinematic results highlight symmetry or asymmetry between the two hand sides, intra- and inter-subject variability in the motion patterns, and agreement or disagreement with the model. The two outcomes, calibration results and trajectory reconstruction, both support quantitative 3D underwater motion analysis.
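The classical DLT calibration mentioned above recovers a 3x4 projection matrix from known 3D control points and their 2D image projections by solving a homogeneous linear system. A noise-free synthetic check:

```python
import numpy as np

# Synthetic camera and control points (invented for illustration).
rng = np.random.default_rng(1)
P_true = np.array([[800., 0., 320., 10.],
                   [0., 800., 240., 20.],
                   [0., 0., 1., 2.]])

X = rng.uniform(-1, 1, size=(8, 3))          # 8 known 3D control points
Xh = np.hstack([X, np.ones((8, 1))])         # homogeneous coordinates
uvw = Xh @ P_true.T
uv = uvw[:, :2] / uvw[:, 2:3]                # "observed" pixel projections

# Build the 2n x 12 DLT design matrix and solve via SVD (null space).
rows = []
for (x, y, z, w), (u, v) in zip(Xh, uv):
    rows.append([x, y, z, w, 0, 0, 0, 0, -u*x, -u*y, -u*z, -u*w])
    rows.append([0, 0, 0, 0, x, y, z, w, -v*x, -v*y, -v*z, -v*w])
A = np.array(rows)
P_est = np.linalg.svd(A)[2][-1].reshape(3, 4)  # P recovered up to scale

# Reprojection error should be ~0 for noise-free synthetic data.
uvw2 = Xh @ P_est.T
uv2 = uvw2[:, :2] / uvw2[:, 2:3]
print("max reprojection error:", np.abs(uv2 - uv).max())
```

With real underwater imagery, refraction at the housing port degrades this linear model, which is one reason the wand and 2D-plate calibrations outperform plain DLT in the study.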

  7. The correlation of contrast-enhanced ultrasound and MRI perfusion quantitative analysis in rabbit VX2 liver cancer.

    PubMed

    Xiang, Zhiming; Liang, Qianwen; Liang, Changhong; Zhong, Guimian

    2014-12-01

    Our objective is to explore the value of contrast-enhanced ultrasound (CEUS) and MRI perfusion quantitative analysis in liver cancer, and the correlation between these two analysis methods. A rabbit VX2 liver cancer model was established in this study and CEUS was applied: SonoVue was administered to the rabbits via the ear vein to dynamically observe and record blood perfusion and its changes in the VX2 liver cancer and surrounding tissue. MRI perfusion quantitative analysis was used to analyze the mean enhancement time (MTE) and the maximal slope increase (MSI), which were further compared with the pathological examination results. Quantitative indicators from CEUS and MRI perfusion quantitative analysis were compared, and the correlation between them was assessed by correlation analysis. The rabbit VX2 liver cancer model was successfully established. CEUS showed that the time-intensity curve of rabbit VX2 liver cancer followed a "fast in, fast out" pattern, while MRI perfusion quantitative analysis showed that the quantitative parameter MTE of tumor tissue increased and MSI decreased; the difference was statistically significant (P < 0.01). The diagnostic results of CEUS and MRI perfusion quantitative analysis were not significantly different (P > 0.05), but their quantitative parameters were significantly positively correlated (P < 0.05). CEUS and MRI perfusion quantitative analysis can both dynamically monitor the liver cancer lesion and the surrounding liver parenchyma, and their quantitative parameters are correlated. The combined application of both is of importance in the early diagnosis of liver cancer.
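The cross-modality correlation test reported here (significant positive correlation, P < 0.05) is a standard Pearson test on paired quantitative parameters. The paired values below are invented for illustration:

```python
import numpy as np
from scipy import stats

# Invented paired quantitative parameters from two imaging modalities.
ceus_param = np.array([1.2, 1.8, 2.1, 2.9, 3.4, 4.0, 4.8, 5.1])
mri_param = np.array([0.9, 1.5, 2.0, 2.4, 3.1, 3.8, 4.2, 5.0])

# Pearson correlation coefficient and two-sided p-value.
r, p = stats.pearsonr(ceus_param, mri_param)
print(f"r={r:.3f}, p={p:.4f}")
```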

  8. Analysis of 129I in Groundwater Samples: Direct and Quantitative Results below the Drinking Water Standard

    SciTech Connect

    Brown, Christopher F.; Geiszler, Keith N.; Lindberg, Michael J.

    2007-03-03

    Due to its long half-life (15.7 million years) and relatively unencumbered migration in subsurface environments, 129I has been recognized as a contaminant of concern at numerous federal, private, and international facilities. In order to understand the long-term risk associated with 129I at these locations, quantitative analysis of groundwater samples must be performed. However, the ability to quantitatively assess the 129I content in groundwater samples requires specialized extraction and sophisticated analytical techniques, which are complicated and not always available to the general scientific community. This paper highlights an analytical method capable of directly quantifying 129I in groundwater samples at concentrations below the MCL without the need for sample pre-concentration. Samples were analyzed on a Perkin Elmer ELAN DRC II ICP-MS after minimal dilution using O2 as the reaction gas. Analysis of continuing calibration verification standards indicated that the DRC mode could be used for quantitative analysis of 129I in samples below the drinking water standard (0.0057 ng/ml or 1 pCi/L). The low analytical detection limit of 129I analysis in the DRC mode coupled with minimal sample dilution (1.02x) resulted in a final sample limit of quantification of 0.0051 ng/ml. Subsequent analysis of three groundwater samples containing 129I resulted in fully quantitative results in the DRC mode, and spike recovery analyses performed on all three samples confirmed that the groundwater matrix did not adversely impact the analysis of 129I in the DRC mode. This analytical approach has been proven to be a cost-effective, high-throughput technique for the direct, quantitative analysis of 129I in groundwater samples at concentrations below the current MCL.
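The arithmetic behind the reported limit of quantification is simple: an instrument-level LOQ (assumed here to be 0.005 ng/ml) scaled by the minimal 1.02x dilution stays below the 0.0057 ng/ml drinking-water MCL:

```python
# Illustrative LOQ arithmetic; the instrument LOQ is an assumption.
instrument_loq = 0.005   # ng/ml, assumed instrument-level LOQ
dilution = 1.02          # minimal sample dilution factor (from the abstract)
mcl = 0.0057             # ng/ml, drinking-water MCL (1 pCi/L)

sample_loq = instrument_loq * dilution
print(f"sample LOQ = {sample_loq:.4f} ng/ml; below MCL: {sample_loq < mcl}")
```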

  9. Possibility of quantitative estimation of blood cell forms by the spatial-frequency spectrum analysis

    NASA Astrophysics Data System (ADS)

    Spiridonov, Igor N.; Safonova, Larisa P.; Samorodov, Andrey V.

    2000-05-01

    At present, hematology lacks quantitative estimates of parameters important for cell classification: cell form and nuclear form. Because of the absence of correlation between morphological parameters and the parameters measured by hemoanalyzers, neither flow cytometers nor computer recognition systems provide a complete clinical blood analysis. Analysis of the spatial-frequency spectra (SFS) of blood samples (smears and liquid probes) permits estimating these forms quantitatively. Based on theoretical and experimental research, an algorithm for quantitative form estimation from SFS parameters has been created, and criteria for the quality of these estimates have been proposed. A test bench based on coherent optical and digital processors was developed. The results could be applied to the automated classification of either normal or pathological blood cells in standard blood smears.

  10. A novel rapid quantitative analysis of drug migration on tablets using laser induced breakdown spectroscopy.

    PubMed

    Yokoyama, Makoto; Tourigny, Martine; Moroshima, Kenji; Suzuki, Junsuke; Sakai, Miyako; Iwamoto, Kiyoshi; Takeuchi, Hirofumi

    2010-11-01

To date, there have been few reports in which drug migration from the interior to the surface of a tablet has been analyzed quantitatively. In this paper, we propose a novel, rapid, quantitative analysis of drug migration in tablets using laser-induced breakdown spectroscopy (LIBS). To evaluate drug migration, model tablets containing nicardipine hydrochloride as the active pharmaceutical ingredient (API) were prepared by a conventional wet granulation method. Since the color of this API is pale yellow and all excipients are white, the degree of drug migration in these model tablets can be observed by visual inspection. In order to prepare tablets with different degrees of drug migration, the temperature of the drying process after tableting was varied between 50 and 80 °C. Using these tablets, visual inspection, Fourier transform (FT)-IR mapping and LIBS analysis were carried out to evaluate drug migration. While drug migration could be observed using all methods, only LIBS analysis provided quantitative results, in which the average LIBS intensity was correlated with the degree of drug migration obtained from the drying temperature. Moreover, we compared the sample preparation, data analysis process and measurement time of visual inspection, FT-IR mapping and LIBS analysis. This comparison demonstrated that LIBS analysis is the simplest and fastest method for migration monitoring. From the results obtained, we conclude that LIBS analysis is one of the most useful process analytical technology (PAT) tools for solving the universal migration problem.
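The correlation step described above (mean LIBS intensity vs. degree of migration, proxied here by drying temperature) can be sketched with a plain Pearson coefficient. The intensity values below are invented for illustration and are not the paper's measurements:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical mean surface LIBS intensities (arb. units) at each
# drying temperature (degC); illustrative numbers only.
temps = [50, 60, 70, 80]
mean_intensity = [120.0, 180.0, 260.0, 330.0]
print(f"r = {pearson_r(temps, mean_intensity):.3f}")
```

A strong positive r across drying temperatures is what makes the LIBS readout quantitative rather than merely visual.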

  11. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    SciTech Connect

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.; Sontag, Ryan L.; Sobreira, Tiago; Slysz, Gordon W.; Humphrys, Daniel R.; Skarina, Tatiana; Onoprienko, Olena; Di Leo, Rosa; Kaiser, Brooke L. Deatherage; Li, Jie; Ansong, Charles; Cambronne, Eric; Smith, Richard D.; Savchenko, Alexei; Adkins, Joshua N.

    2015-07-06

Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are set by the equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification, and global quantitative proteomic analysis. As a model system to identify substrates, we used SseL, a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into host cells. Using this approach, two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. Finally, this method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  12. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    DOE PAGES

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.; Sontag, Ryan L.; Sobreira, Tiago; Slysz, Gordon W.; Humphrys, Daniel R.; Skarina, Tatiana; Onoprienko, Olena; Di Leo, Rosa; et al

    2015-07-06

Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are set by the equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification, and global quantitative proteomic analysis. As a model system to identify substrates, we used SseL, a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into host cells. Using this approach, two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. Finally, this method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  13. Quantitative Analysis of Myelin and Axonal Remodeling in the Uninjured Motor Network After Stroke.

    PubMed

    Lin, Ying-Chia; Daducci, Alessandro; Meskaldji, Djalel Eddine; Thiran, Jean-Philippe; Michel, Patrik; Meuli, Reto; Krueger, Gunnar; Menegaz, Gloria; Granziera, Cristina

    2015-09-01

Contralesional brain connectivity plasticity has previously been reported after stroke. This study aims at disentangling the biological mechanisms underlying connectivity plasticity in the uninjured motor network after an ischemic lesion. In particular, we measured generalized fractional anisotropy (GFA) and magnetization transfer ratio (MTR) to assess whether poststroke connectivity remodeling depends on axonal and/or myelin changes. Diffusion-spectrum imaging and magnetization transfer MRI at 3T were performed in 10 patients in the acute phase and at 1 and 6 months after a stroke affecting motor cortical and/or subcortical areas. Ten age- and gender-matched healthy volunteers were scanned 1 month apart for longitudinal comparison. Clinical assessment was also performed in patients prior to MRI. In the contralesional hemisphere, average measures and tract-based quantitative analysis of GFA and MTR were performed to assess axonal integrity and myelination along motor connections, as well as their variations in time. Mean and tract-based measures of MTR and GFA showed significant changes in a number of contralesional motor connections, confirming both axonal and myelin plasticity in our cohort of patients. Moreover, density-derived features (peak height, standard deviation, and skewness) of GFA and MTR along the tracts correlated with clinical scores better than mean values did. These findings reveal the interplay between contralateral myelin and axonal remodeling after stroke.
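Two of the density-derived features mentioned above, standard deviation and skewness, are straightforward to compute from values sampled along a tract. A minimal sketch (peak height is omitted because it requires a density estimate; the values are illustrative, not patient data):

```python
import math

def density_features(values):
    """Standard deviation and (population) skewness of a metric
    sampled along a tract: skew = E[(x - mu)^3] / sigma^3."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    skewness = sum((v - mean) ** 3 for v in values) / (n * sd ** 3)
    return sd, skewness

# A symmetric and a right-skewed illustrative profile
sd_sym, skew_sym = density_features([1, 2, 3, 4, 5])
sd_skw, skew_skw = density_features([1, 1, 1, 10])
print(skew_sym, skew_skw)  # symmetric -> 0, right-skewed -> positive
```

Such shape descriptors capture focal changes along a tract that a simple mean would average away, which is presumably why they tracked clinical scores better.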

  14. Quantitation and diversity analysis of ruminal methanogenic populations in response to the antimethanogenic compound bromochloromethane.

    PubMed

    Denman, Stuart E; Tomkins, Nigel W; McSweeney, Christopher S

    2007-12-01

Methyl coenzyme-M reductase A (mcrA) clone libraries were generated from microbial DNA extracted from the rumen of cattle fed a roughage diet with and without supplementation of the antimethanogenic compound bromochloromethane. Bromochloromethane reduced total methane emissions by c. 30%, with a resultant increase in propionate and branched-chain fatty acids. The mcrA clone libraries revealed that Methanobrevibacter spp. were the dominant species identified. A decrease in the incidence of Methanobrevibacter spp. was observed in the clone library generated from the bromochloromethane treatment. In addition, a more diverse methanogenic population, with representatives from the orders Methanococcales, Methanomicrobiales and Methanosarcinales, was observed for the bromochloromethane library. Sequence data generated from these libraries aided in the design of an mcrA-targeted quantitative PCR (qPCR) assay. The reduction in methane production by bromochloromethane was associated with an average decrease of 34% in the number of methanogenic Archaea when monitored with this qPCR assay. Dissociation curve analysis of mcrA amplicons showed a clear difference in melting temperatures between Methanobrevibacter spp. (80-82 °C) and all other methanogens (84-86 °C). A decrease in the intensity of the Methanobrevibacter spp.-specific peak and an increase in the other peak in the bromochloromethane-treated animals corresponded with the changes within the clone libraries.
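The dissociation-curve readout described above reduces to binning each amplicon's melt-peak temperature into the two reported Tm windows. A minimal sketch (the function name and the "unresolved" fallback are ours, not part of the assay):

```python
def classify_melt_peak(tm_celsius):
    """Assign an mcrA amplicon melt peak to a methanogen group by Tm.

    Windows follow the ranges reported above; peaks outside both
    windows are flagged for manual review.
    """
    if 80.0 <= tm_celsius <= 82.0:
        return "Methanobrevibacter spp."
    if 84.0 <= tm_celsius <= 86.0:
        return "other methanogens"
    return "unresolved"

peaks = [80.9, 85.2, 81.4, 83.1]
print([classify_melt_peak(t) for t in peaks])
```

Tracking the relative heights of the two peak classes over treatment then mirrors the population shift seen in the clone libraries.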

  15. Quantitative time-course proteome analysis of Mesorhizobium loti during nodule maturation.

    PubMed

    Nambu, Mami; Tatsukami, Yohei; Morisaka, Hironobu; Kuroda, Kouichi; Ueda, Mitsuyoshi

    2015-07-01

Rhizobia are nitrogen-fixing bacteria that establish a symbiotic relationship with leguminous plants. To understand the mechanism by which rhizobia alter their metabolism to establish a successful nitrogen-fixing symbiosis with their hosts, Lotus japonicus plants were inoculated with Mesorhizobium loti. Bacteroids were isolated from nodules harvested at 2 weeks (the early stage of nodule development) and at 3 and 4 weeks (the intermediate stage of nodule development) post-inoculation. Using a quantitative time-course proteome analysis, we quantified the variations in the production of 537 proteins in M. loti bacteroids during the course of nodule maturation. The results revealed significant changes in carbon and amino acid metabolism by M. loti upon differentiation into bacteroids. Furthermore, our findings suggested that M. loti experiences a nitrogen-deficient condition during the early stages of nodule development, and then a nitrogen-rich condition during the intermediate stages. In addition, our data indicated that M. loti assimilates ammonia during the intermediate stages of nodule development. Our results provide new insights into the physiological transitions undergone by M. loti during nodule maturation.

  16. Elution profile analysis of SDS-induced subcomplexes by quantitative mass spectrometry.

    PubMed

    Texier, Yves; Toedt, Grischa; Gorza, Matteo; Mans, Dorus A; van Reeuwijk, Jeroen; Horn, Nicola; Willer, Jason; Katsanis, Nicholas; Roepman, Ronald; Gibson, Toby J; Ueffing, Marius; Boldt, Karsten

    2014-05-01

Analyzing the molecular architecture of native multiprotein complexes via biochemical methods has so far been difficult and error prone. Protein complex isolation by affinity purification can define the protein repertoire of a given complex, yet it remains difficult to gain knowledge of its substructure or modular composition. Here, we introduce SDS-concentration-gradient-induced decomposition of protein complexes coupled to quantitative mass spectrometry and in silico elution profile distance analysis (EPASIS). By applying this new method to a cellular transport module, the IFT/lebercilin complex, we demonstrate its ability to determine modular composition as well as to sensitively detect known and novel complex components. We show that the IFT/lebercilin complex can be separated into at least five submodules: the IFT complex A, the IFT complex B, the 14-3-3 protein complex, the CTLH complex, and the dynein light chain complex. Furthermore, we identify the protein TULP3 as a potential new member of the IFT complex A and show that several proteins classified as IFT complex B-associated are integral parts of this complex. To further demonstrate the general applicability of EPASIS, we analyzed the modular substructure of two additional complexes, those of B-RAF and of 14-3-3-ε. The results show that EPASIS provides a robust as well as sensitive strategy to dissect the substructure of large multiprotein complexes in a highly time- and cost-effective manner. PMID:24563533
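The in silico elution-profile distance step can be illustrated as comparing normalized abundance profiles across the SDS concentration gradient: proteins belonging to the same submodule should decompose together and therefore lie close. All intensities below are invented for illustration:

```python
import math

def profile_distance(p, q):
    """Euclidean distance between two elution profiles after
    normalizing each to unit total intensity."""
    sp, sq = sum(p), sum(q)
    p = [v / sp for v in p]
    q = [v / sq for v in q]
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical quantitative-MS intensities across five SDS concentrations
ift_b1 = [100, 90, 60, 20, 5]
ift_b2 = [110, 95, 55, 25, 6]   # decomposes like ift_b1 -> same module
ift_a  = [100, 30, 10, 5, 2]    # falls apart earlier -> different module
print(profile_distance(ift_b1, ift_b2) < profile_distance(ift_b1, ift_a))
```

Clustering proteins on such pairwise distances is one plausible way the modular composition could be read out.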

  17. Quantitative assessment of chemical artefacts produced by propionylation of histones prior to mass spectrometry analysis.

    PubMed

    Soldi, Monica; Cuomo, Alessandro; Bonaldi, Tiziana

    2016-07-01

Histone PTMs play a crucial role in regulating chromatin structure and function, with an impact on gene expression. MS is nowadays widely applied to study histone PTMs systematically. Because histones are rich in arginine and lysine, classical shotgun approaches based on trypsin digestion are typically not employed for mapping histone modifications. Instead, different protocols of chemical derivatization of lysines in combination with trypsin have been implemented to obtain "Arg-C-like" digestion products that are more suitable for LC-MS/MS analysis. Although widespread, these strategies have recently been described to cause various side reactions that result in chemical modifications prone to be misinterpreted as native histone marks. These artefacts can also interfere with the quantification process, causing errors in histone PTM profiling. The work of Paternoster et al. is a quantitative assessment of methyl-esterification and other side reactions occurring on histones after chemical derivatization of lysines with propionic anhydride [Proteomics 2016, 16, 2059-2063]. The authors estimate the effect of different solvents, incubation times, and pH on the extent of these side reactions. The results indicate that replacing methanol with isopropanol or ACN not only blocks methyl-esterification but also significantly reduces other undesired unspecific reactions. Carefully titrating the pH after propionic anhydride addition is another way to keep methyl-esterification under control. Overall, the authors describe a set of experimental conditions that reduce the generation of artefacts during histone propionylation. PMID:27373704

  18. Perspective - synthetic DEMs: a vital underpinning for the quantitative future of landform analysis?

    NASA Astrophysics Data System (ADS)

    Hillier, J. K.; Sofia, G.; Conway, S. J.

    2015-07-01

Physical processes, including anthropogenic feedbacks, sculpt planetary surfaces (e.g., Earth's). A fundamental tenet of Geomorphology is that the shapes created, when combined with other measurements, can be used to understand those processes. Artificial or synthetic Digital Elevation Models (DEMs) might be vital in progressing further with this endeavour. Morphological data, including metrics and mapping (manual and automated), are a key resource, but at present their quality is typically weakly constrained (e.g., by mapper inter-comparison). In addition to examining inaccuracies caused by noise, relatively rare examples illustrate how synthetic DEMs containing a priori known, idealised morphologies can be used to perform "synthetic tests" that make strong "absolute" statements about landform detection and quantification; e.g., 84% of valley heads in the real landscape are identified correctly. From our perspective, it is vital to verify such statistics, as ultimately they link physics-driven models of processes to morphological observations, allowing quantitative hypotheses to be formulated and tested. Synthetic DEMs built directly from governing equations that encapsulate processes are another key part of forming this link. Thus, this note introduces synthetic tests and DEMs, then outlines a typology of synthetic DEMs along with their benefits, challenges and future potential to provide constraints and insights. The aim is to discuss how we best proceed with uncertainty-aware landscape analysis to examine physical processes.

  19. Quantitative phosphokinome analysis of the Met pathway activated by the invasin internalin B from Listeria monocytogenes.

    PubMed

    Reinl, Tobias; Nimtz, Manfred; Hundertmark, Claudia; Johl, Thorsten; Kéri, György; Wehland, Jürgen; Daub, Henrik; Jänsch, Lothar

    2009-12-01

    Stimulated by its physiological ligand, hepatocyte growth factor, the transmembrane receptor tyrosine kinase Met activates a signaling machinery that leads to mitogenic, motogenic, and morphogenic responses. Remarkably, the food-borne human pathogen Listeria monocytogenes also promotes autophosphorylation of Met through its virulence factor internalin B (InlB) and subsequently exploits Met signaling to induce phagocytosis into a broad range of host cells. Although the interaction between InlB and Met has been studied in detail, the signaling specificity of components involved in InlB-triggered cellular responses remains poorly characterized. The analysis of regulated phosphorylation events on protein kinases is therefore of particular relevance, although this could not as yet be characterized systematically by proteomics. Here, we implemented a new pyridopyrimidine-based strategy that enabled the efficient capture of a considerable subset of the human kinome in a robust one-step affinity chromatographic procedure. Additionally, and to gain functional insights into the InlB/Met-induced bacterial invasion process, a quantitative survey of the phosphorylation pattern of these protein kinases was accomplished. In total, the experimental design of this study comprises affinity chromatographic procedures for the systematic enrichment of kinases, as well as phosphopeptides; the quantification of all peptides based on the iTRAQ reporter system; and a rational statistical strategy to evaluate the quality of phosphosite regulations. With this improved chemical proteomics strategy, we determined and relatively quantified 143 phosphorylation sites detected on 94 human protein kinases. Interestingly, InlB-mediated signaling shows striking similarities compared with the natural ligand hepatocyte growth factor that was intensively studied in the past. In addition, this systematic approach suggests a new subset of protein kinases including Nek9, which are differentially

  20. anNET: a tool for network-embedded thermodynamic analysis of quantitative metabolome data

    PubMed Central

    Zamboni, Nicola; Kümmel, Anne; Heinemann, Matthias

    2008-01-01

Background Compared to other omics techniques, quantitative metabolomics is still in its infancy. Complex sample preparation and analytical procedures render exact quantification extremely difficult. Furthermore, not only the actual measurement but also the subsequent interpretation of quantitative metabolome data to obtain mechanistic insights still lags behind current expectations. Recently, the method of network-embedded thermodynamic (NET) analysis was introduced to address some of these open issues. Building upon principles of thermodynamics, this method allows for a quality check of measured metabolite concentrations and makes it possible to spot metabolic reactions where active regulation potentially controls metabolic flux. So far, however, widespread application of NET analysis in metabolomics labs was hindered by the absence of suitable software. Results We have developed in Matlab a generalized software tool called 'anNET' that affords a user-friendly implementation of the NET analysis algorithm. anNET supports the analysis of any metabolic network for which a stoichiometric model can be compiled. The model size can span from a single reaction to a complete genome-wide network reconstruction including compartments. anNET can (i) test quantitative data sets for thermodynamic consistency, (ii) predict metabolite concentrations beyond the actually measured data, (iii) identify putative sites of active regulation in the metabolic reaction network, and (iv) help in localizing errors in data sets that were found to be thermodynamically infeasible. We demonstrate the application of anNET with three published Escherichia coli metabolome data sets. Conclusion Our user-friendly and generalized implementation of the NET analysis method in the software anNET allows users to rapidly integrate quantitative metabolome data obtained from virtually any organism. We envision that use of anNET in labs working on quantitative metabolomics will provide the systems biology and
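The thermodynamic consistency check at the heart of NET analysis rests on the relation DrG' = DrG'0 + RT * sum(nu_i * ln c_i): a reaction can carry flux in its assigned direction only if DrG' < 0 there. A minimal sketch of that feasibility test for a single reaction (the standard DrG'0 and the concentrations are illustrative, not from the E. coli data sets; anNET itself solves this over a whole stoichiometric model):

```python
import math

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 298.15     # temperature, K

def reaction_dg(dg0_kj, stoich, conc_molar):
    """DrG' = DrG'0 + RT * sum(nu_i * ln c_i), concentrations in mol/L.

    stoich maps metabolite -> stoichiometric coefficient
    (products positive, substrates negative).
    """
    return dg0_kj + R * T * sum(
        nu * math.log(conc_molar[m]) for m, nu in stoich.items()
    )

# Hypothetical example: glucose-6-phosphate isomerase, G6P <-> F6P
# (DrG'0 ~ +2.5 kJ/mol; the measured concentrations are invented)
dg = reaction_dg(2.5, {"G6P": -1, "F6P": 1}, {"G6P": 1e-3, "F6P": 2e-4})
print(f"DrG' = {dg:.2f} kJ/mol; forward flux feasible: {dg < 0}")
```

A measured concentration set for which no feasible DrG' assignment exists across the network is exactly what the tool flags as thermodynamically inconsistent.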

  1. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  2. Quantitative Intersectionality: A Critical Race Analysis of the Chicana/o Educational Pipeline

    ERIC Educational Resources Information Center

    Covarrubias, Alejandro

    2011-01-01

    Utilizing the critical race framework of intersectionality, this research reexamines the Chicana/o educational pipeline through a quantitative intersectional analysis. This approach disaggregates data along the intersection of race, class, gender, and citizenship status to provide a detailed portrait of the educational trajectory of Mexican-origin…

  3. Qualitative and quantitative analysis of mixtures of compounds containing both hydrogen and deuterium

    NASA Technical Reports Server (NTRS)

    Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.

    1969-01-01

    Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.

  4. A Computer Program for Calculation of Calibration Curves for Quantitative X-Ray Diffraction Analysis.

    ERIC Educational Resources Information Center

    Blanchard, Frank N.

    1980-01-01

    Describes a FORTRAN IV program written to supplement a laboratory exercise dealing with quantitative x-ray diffraction analysis of mixtures of polycrystalline phases in an introductory course in x-ray diffraction. Gives an example of the use of the program and compares calculated and observed calibration data. (Author/GS)

  5. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of low-cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  6. A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations

    ERIC Educational Resources Information Center

    Suto, W. M. Irenka; Greatorex, Jackie

    2008-01-01

    Diverse strategies for marking GCSE examinations have been identified, ranging from simple automatic judgements to complex cognitive operations requiring considerable expertise. However, little is known about patterns of strategy usage or how such information could be utilised by examiners. We conducted a quantitative analysis of previous verbal…

  7. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    ERIC Educational Resources Information Center

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  8. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    SciTech Connect

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  9. Gas chromatograph-mass spectrometer (GC/MS) system for quantitative analysis of reactive chemical compounds

    DOEpatents

    Grindstaff, Quirinus G.

    1992-01-01

    Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.

  10. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A Complementary Perspective

    ERIC Educational Resources Information Center

    Kahveci, Ajda

    2010-01-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses.…

  11. A Quantitative Categorical Analysis of Metadata Elements in Image-Applicable Metadata Schemas.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    2001-01-01

    Reports on a quantitative categorical analysis of metadata elements in the Dublin Core, VRA (Visual Resource Association) Core, REACH (Record Export for Art and Cultural Heritage), and EAD (Encoded Archival Description) metadata schemas, all of which can be used for organizing and describing images. Introduces a new schema comparison methodology…

  12. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    ERIC Educational Resources Information Center

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

  13. Whose American Government? A Quantitative Analysis of Gender and Authorship in American Politics Texts

    ERIC Educational Resources Information Center

    Cassese, Erin C.; Bos, Angela L.; Schneider, Monica C.

    2014-01-01

    American government textbooks signal to students the kinds of topics that are important and, by omission, the kinds of topics that are not important to the discipline of political science. This article examines portrayals of women in introductory American politics textbooks through a quantitative content analysis of 22 widely used texts. We find…

  14. QUANTITATIVE PCR ANALYSIS OF MOLDS IN THE DUST FROM HOMES OF ASTHMATIC CHILDREN IN NORTH CAROLINA

    EPA Science Inventory

The vacuum bag (VB) dust was analyzed by mold-specific quantitative PCR. These results were compared to the analysis survey calculated for each of the homes. The mean and standard deviation (SD) of the ERMI values in the homes of the NC asthmatic children was 16.4 (6.77), compa...

  15. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  16. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  17. A Colorimetric Analysis Experiment Not Requiring a Spectrophotometer: Quantitative Determination of Albumin in Powdered Egg White

    ERIC Educational Resources Information Center

    Charlton, Amanda K.; Sevcik, Richard S.; Tucker, Dorie A.; Schultz, Linda D.

    2007-01-01

    A general science experiment for high school chemistry students might serve as an excellent review of the concepts of solution preparation, solubility, pH, and qualitative and quantitative analysis of a common food product. The students could learn to use safe laboratory techniques, collect and analyze data using proper scientific methodology and…

  18. Quantitative Analysis of Organic Compounds: A Simple and Rapid Method for Use in Schools

    ERIC Educational Resources Information Center

    Schmidt, Hans-Jurgen

    1973-01-01

    Describes the procedure for making a quantitative analysis of organic compounds suitable for secondary school chemistry classes. Using the Schoniger procedure, the organic compound, such as PVC, is decomposed in a conical flask with oxygen. The products are absorbed in a suitable liquid and analyzed by titration. (JR)

  19. [Development, optimization and application of the expression analysis platform based on multiplex quantitative RT-PCR using fluorescent universal primers].

    PubMed

    Wang, Qin-Xi; Li, Kai; Zhou, Yu-Xun; Xiao, Jun-Hua

    2009-05-01

A multiplex quantitative RT-PCR technology with a universal fluorescent primer was established. This technology employs a chimeric-primer-induced universal-primer amplification method that ensures target genes are amplified in a constant ratio. The technique is cost-effective, moderate-throughput, and reliable for quantification of gene expression. It complements cDNA chips, which have low quantitative accuracy, and real-time quantitative PCR, which has low throughput, by improving the entire expression-profiling workflow. Eleven genes within a QTL segment regulating mouse puberty onset on chromosome X were investigated to construct and optimize the method. The sensitivity of detection (10² copies) was determined, the concentration ratio of universal primer to chimeric forward primers (1:1) was optimized, and the accuracy and repeatability were validated. Touchdown PCR with the addition of universal primers significantly improved amplification of genes expressed at low abundance. After profiling the expression of the 11 genes in the hypothalamus and testis of two mouse strains, C3H/HeJ and C57BL/6J, at the age of 15 d, one gene, PHF6, was found to be differentially expressed and was selected for further functional analysis.

  20. Quantitative analysis of ciliary beating in primary ciliary dyskinesia: a pilot study

    PubMed Central

    2012-01-01

    Background Primary ciliary dyskinesia (PCD) is a rare congenital respiratory disorder characterized by abnormal ciliary motility leading to chronic airway infections. Qualitative evaluation of ciliary beat pattern based on digital high-speed videomicroscopy analysis has been proposed in the diagnosis process of PCD. Although this evaluation is easy in typical cases, it becomes difficult when ciliary beating is partially maintained. We postulated that a quantitative analysis of beat pattern would improve PCD diagnosis. We compared quantitative parameters with the qualitative evaluation of ciliary beat pattern in patients in whom the diagnosis of PCD was confirmed or excluded. Methods Nasal nitric oxide measurement, nasal brushings and biopsies were performed prospectively in 34 patients with suspected PCD. In combination with qualitative analysis, 12 quantitative parameters of ciliary beat pattern were determined on high-speed videomicroscopy recordings of beating ciliated edges. The combination of ciliary ultrastructural abnormalities on transmission electron microscopy analysis with low nasal nitric oxide levels was the “gold standard” used to establish the diagnosis of PCD. Results This “gold standard” excluded PCD in 15 patients (non-PCD patients), confirmed PCD in 10 patients (PCD patients) and was inconclusive in 9 patients. Among the 12 parameters, the distance traveled by the cilium tip weighted by the percentage of beating ciliated edges presented 96% sensitivity and 95% specificity. Qualitative evaluation and quantitative analysis were concordant in non-PCD patients. In 9/10 PCD patients, quantitative analysis was concordant with the “gold standard”, while the qualitative evaluation was discordant with the “gold standard” in 3/10 cases. Among the patients with an inconclusive “gold standard”, the use of quantitative parameters supported PCD diagnosis in 4/9 patients (confirmed by the identification of disease-causing mutations in one

  1. Distance measures and optimization spaces in quantitative fatty acid signature analysis.

    PubMed

    Bromaghin, Jeffrey F; Rode, Karyn D; Budge, Suzanne M; Thiemann, Gregory W

    2015-03-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted. PMID:25859330
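The two distance measures compared in this study can be made concrete. The sketch below computes the Aitchison distance (Euclidean distance between centered log-ratio transforms) and the Kullback-Leibler divergence for two invented fatty acid signatures; the full QFASA estimator, which minimizes such a distance over candidate diet mixtures, is not reproduced here:

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition (all parts > 0)."""
    lx = np.log(np.asarray(x, float))
    return lx - lx.mean()

def aitchison(x, y):
    """Aitchison distance: Euclidean distance between clr-transformed signatures."""
    return float(np.linalg.norm(clr(x) - clr(y)))

def kl_distance(x, y):
    """Kullback-Leibler divergence of signature x from y (note: not symmetric)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.sum(x * np.log(x / y)))

# Two made-up three-part fatty acid signatures (proportions summing to 1):
sig_a = [0.50, 0.30, 0.20]
sig_b = [0.45, 0.35, 0.20]
d_ait = aitchison(sig_a, sig_b)
d_kl = kl_distance(sig_a, sig_b)
```

Unlike the Aitchison distance, the KL measure is not symmetric in its arguments, which is one structural difference between the estimator variants compared in the study.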

  2. Quantitative analysis of glycerol in dicarboxylic acid-rich cutins provides insights into Arabidopsis cutin structure.

    PubMed

    Yang, Weili; Pollard, Mike; Li-Beisson, Yonghua; Ohlrogge, John

    2016-10-01

    Cutin is an extracellular lipid polymer that contributes to protective cuticle barrier functions against biotic and abiotic stresses in land plants. Glycerol has been reported as a component of cutin, contributing up to 14% by weight of total released monomers. Previous studies using partial hydrolysis of cuticle-enriched preparations established the presence of oligomers with glycerol-aliphatic ester links. Furthermore, glycerol-3-phosphate 2-O-acyltransferases (sn-2-GPATs) are essential for cutin biosynthesis. However, precise roles of glycerol in cutin assembly and structure remain uncertain. Here, a stable isotope-dilution assay was developed for the quantitative analysis of glycerol by GC/MS of triacetin with simultaneous determination of aliphatic monomers. To provide clues about the role of glycerol in dicarboxylic acid (DCA)-rich cutins, this methodology was applied to compare wild-type (WT) Arabidopsis cutin with a series of mutants that are defective in cutin synthesis. The molar ratio of glycerol to total DCAs in WT cutins was 2:1. Even when allowing for a small additional contribution from hydroxy fatty acids, this is a substantially higher glycerol to aliphatic monomer ratio than previously reported for any cutin. Glycerol content was strongly reduced in both stem and leaf cutin from all Arabidopsis mutants analyzed (gpat4/gpat8, att1-2 and lacs2-3). In addition, the molar reduction of glycerol was proportional to the molar reduction of total DCAs. These results suggest "glycerol-DCA-glycerol" may be the dominant motif in DCA-rich cutins. The ramifications and caveats for this hypothesis are presented. PMID:27211345

  3. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Rode, Karyn D.; Budge, Suzanne M.; Thiemann, Gregory W.

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted.

  4. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    PubMed Central

    Bromaghin, Jeffrey F; Rode, Karyn D; Budge, Suzanne M; Thiemann, Gregory W

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback–Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted. PMID:25859330

  5. Qualitative and quantitative proteomic analysis of formalin-fixed paraffin-embedded (FFPE) tissue.

    PubMed

    Azimzadeh, Omid; Atkinson, Michael J; Tapio, Soile

    2015-01-01

    Formalin-fixed, paraffin-embedded (FFPE) tissue has recently gained interest as an alternative to fresh/frozen tissue for retrospective protein biomarker discovery. However, during the formalin fixation proteins undergo degradation and cross-linking, making conventional protein analysis technologies challenging. Cross-linking is even more challenging when quantitative proteome analysis of FFPE tissue is planned. The use of conventional protein labeling technologies on FFPE tissue has turned out to be problematic as the lysine residue labeling targets are frequently blocked by the formalin treatment. We have established a qualitative and quantitative proteomics analysis technique for FFPE tissues that combines label-free proteomic analysis with optimized protein extraction and separation conditions.

  6. Quantitative analysis of defects in silicon. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Bruce, T.; Oidwai, H. A.

    1980-01-01

One hundred seventy-four silicon sheet samples were analyzed for twin boundary density, dislocation pit density, and grain boundary length. Procedures were developed for the quantitative analysis of the twin boundary and dislocation pit densities using a QTM-720 Quantitative Image Analyzing system. The QTM-720 system was upgraded with the addition of a PDP-11/03 minicomputer with dual floppy disc drive, a Digital Equipment DECwriter high-speed printer, and a field-image feature interface module. Three versions of a computer program that controls data acquisition and analysis on the QTM-720 were written. Procedures for chemical polishing and etching were also developed.

  7. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
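The paired t-test used to compare the two methods can be sketched as follows, with invented γ-oryzanol values for five oil samples each measured by both methods:

```python
import numpy as np

def paired_t(a, b):
    """Paired t statistic (and degrees of freedom) for matched measurements."""
    d = np.asarray(a, float) - np.asarray(b, float)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
    return float(t), d.size - 1

# Hypothetical gamma-oryzanol results (mg/g) for the same five oil samples:
densitometric = [2.10, 1.95, 2.30, 2.05, 2.18]
image_based   = [2.08, 1.99, 2.28, 2.07, 2.15]
t, df = paired_t(densitometric, image_based)
# |t| below the critical value (about 2.78 at alpha = 0.05, df = 4) means
# no statistically significant difference between the two methods.
```

With these invented data, |t| is far below the critical value, mirroring the study's conclusion that the two methods agree.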

  8. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

A method was proposed to quantitatively inspect mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, common chemicals (benzophenone, anthraquinone, pyridoxine hydrochloride, and L-ascorbic acid) were used first in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for the experiment. Experimental results were in close agreement with the actual content, suggesting that this could be an effective method for the quantitative identification of illicit drugs.
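The linear-regression step can be illustrated with a toy example: if the pure-component absorption spectra are known, the mixture spectrum is modeled as a weighted sum and the weights are recovered by least squares. The spectra and fractions below are invented:

```python
import numpy as np

# Each column of A is the (known) absorption spectrum of one pure component,
# sampled at three frequencies; b is the measured mixture spectrum.
A = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.4, 0.5]])          # 3 frequencies x 2 components (made up)
true_w = np.array([0.7, 0.3])       # true mass fractions
b = A @ true_w                      # noiseless mixture spectrum

w, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(w, 3))               # recovers the true fractions [0.7, 0.3]
```

In practice, non-negative least squares and noise handling would be needed, but the principle is the same.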

  9. The other half of the story: effect size analysis in quantitative research.

    PubMed

    Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
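One of the most common effect size indices paired with the independent-samples t-test is Cohen's d. A minimal sketch with invented exam scores:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = a.size, b.size
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return float((a.mean() - b.mean()) / pooled_sd)

# Invented exam scores for treatment and control sections:
treatment = [78, 82, 85, 90, 88, 84]
control   = [74, 79, 81, 85, 80, 77]
d = cohens_d(treatment, control)
# By Cohen's conventional benchmarks, d >= 0.8 is a "large" effect.
```

Reporting d alongside the p-value conveys the practical magnitude of the difference, which is the article's central point.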

  10. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-01

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products. PMID:23998855

  11. Quantitative hopanoid analysis enables robust pattern detection and comparison between laboratories.

    PubMed

    Wu, C-H; Kong, L; Bialecka-Fornal, M; Park, S; Thompson, A L; Kulkarni, G; Conway, S J; Newman, D K

    2015-07-01

    Hopanoids are steroid-like lipids from the isoprenoid family that are produced primarily by bacteria. Hopanes, molecular fossils of hopanoids, offer the potential to provide insight into environmental transitions on the early Earth, if their sources and biological functions can be constrained. Semiquantitative methods for mass spectrometric analysis of hopanoids from cultures and environmental samples have been developed in the last two decades. However, the structural diversity of hopanoids, and possible variability in their ionization efficiencies on different instruments, have thus far precluded robust quantification and hindered comparison of results between laboratories. These ionization inconsistencies give rise to the need to calibrate individual instruments with purified hopanoids to reliably quantify hopanoids. Here, we present new approaches to obtain both purified and synthetic quantification standards. We optimized 2-methylhopanoid production in Rhodopseudomonas palustris TIE-1 and purified 2Me-diplopterol, 2Me-bacteriohopanetetrol (2Me-BHT), and their unmethylated species (diplopterol and BHT). We found that 2-methylation decreases the signal intensity of diplopterol between 2 and 34% depending on the instrument used to detect it, but decreases the BHT signal less than 5%. In addition, 2Me-diplopterol produces 10× higher ion counts than equivalent quantities of 2Me-BHT. Similar deviations were also observed using a flame ionization detector for signal quantification in GC. In LC-MS, however, 2Me-BHT produces 11× higher ion counts than 2Me-diplopterol but only 1.2× higher ion counts than the sterol standard pregnane acetate. To further improve quantification, we synthesized tetradeuterated (D4) diplopterol, a precursor for a variety of hopanoids. LC-MS analysis on a mixture of (D4)-diplopterol and phospholipids showed that under the influence of co-eluted phospholipids, the D4-diplopterol internal standard quantifies diplopterol more accurately than
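The internal-standard arithmetic that motivates a deuterated standard can be sketched generically. The peak areas and amounts below are hypothetical, and the response factor stands in for the instrument-specific calibration the authors describe:

```python
# Internal-standard quantification sketch (all peak areas and amounts are
# hypothetical). A response factor from a calibration run relates analyte
# signal to internal-standard signal; sample runs are corrected by it.

def response_factor(area_analyte, amt_analyte, area_istd, amt_istd):
    """Relative response of analyte vs. internal standard at known amounts."""
    return (area_analyte / amt_analyte) / (area_istd / amt_istd)

def quantify(area_analyte, area_istd, amt_istd, rf):
    """Amount of analyte in a sample spiked with a known amount of standard."""
    return (area_analyte / area_istd) * amt_istd / rf

rf = response_factor(area_analyte=5.0e6, amt_analyte=10.0,   # calibration run
                     area_istd=4.0e6, amt_istd=10.0)
amt = quantify(area_analyte=2.5e6, area_istd=4.0e6, amt_istd=10.0, rf=rf)
print(round(amt, 2))  # prints 5.0 (ng of analyte in the sample extract)
```

Because a co-eluting isotopologue like D4-diplopterol suffers the same matrix effects as the analyte, the area ratio stays valid even when absolute signals are suppressed.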

  12. Quantitative trace analysis of fullerenes in river sediment from Spain and soils from Saudi Arabia.

    PubMed

    Sanchís, Josep; Božović, Dalibor; Al-Harbi, Naif A; Silva, Luis F; Farré, Marinella; Barceló, Damià

    2013-07-01

A quantitative method based on ultrasound-assisted toluene extraction followed by liquid chromatography-electrospray ionization-tandem mass spectrometry for the analysis of C60 and C70 fullerenes, N-methylfulleropyrrolidine, [6,6]-phenyl-C61-butyric acid methyl ester and [6,6]-thienyl-C61-butyric acid methyl ester has been developed. The method was validated using fortified blank river sediments according to the criteria of Commission Decision 2002/657/EC. The method limits of detection ranged from 14 to 290 pg/g, making it suitable for application in environmental analysis. The method has been applied to investigate fullerene content in 58 soil samples collected from different urban and industrial areas in Saudi Arabia and in river sediment from six different sites in the Llobregat River Basin. In addition, in the case of the Llobregat River, superficial water samples from the same sites as the sediments were collected and analysed using a previously developed method. In soils from Saudi Arabia, C60-fullerene was the only compound detected, and it was quantified in 19% of samples. In the sediments of the Llobregat River, C60-fullerene was also the only one detected (33% of the samples), while in river water, C70-fullerene was the most frequent compound, quantified in 67% of the samples. C60-fullerene was present in only two of the six water samples, but at higher concentrations than C70-fullerene, ranging from 0.9 to 7.8 ng/L. PMID:23545859

  13. Quantitative hopanoid analysis enables robust pattern detection and comparison between laboratories

    PubMed Central

    Wu, C-H; Kong, L; Bialecka-Fornal, M; Park, S; Thompson, A L; Kulkarni, G; Conway, S J; Newman, D K

    2015-01-01

    Hopanoids are steroid-like lipids from the isoprenoid family that are produced primarily by bacteria. Hopanes, molecular fossils of hopanoids, offer the potential to provide insight into environmental transitions on the early Earth, if their sources and biological functions can be constrained. Semiquantitative methods for mass spectrometric analysis of hopanoids from cultures and environmental samples have been developed in the last two decades. However, the structural diversity of hopanoids, and possible variability in their ionization efficiencies on different instruments, have thus far precluded robust quantification and hindered comparison of results between laboratories. These ionization inconsistencies give rise to the need to calibrate individual instruments with purified hopanoids to reliably quantify hopanoids. Here, we present new approaches to obtain both purified and synthetic quantification standards. We optimized 2-methylhopanoid production in Rhodopseudomonas palustris TIE-1 and purified 2Me-diplopterol, 2Me-bacteriohopanetetrol (2Me-BHT), and their unmethylated species (diplopterol and BHT). We found that 2-methylation decreases the signal intensity of diplopterol between 2 and 34% depending on the instrument used to detect it, but decreases the BHT signal less than 5%. In addition, 2Me-diplopterol produces 10× higher ion counts than equivalent quantities of 2Me-BHT. Similar deviations were also observed using a flame ionization detector for signal quantification in GC. In LC-MS, however, 2Me-BHT produces 11× higher ion counts than 2Me-diplopterol but only 1.2× higher ion counts than the sterol standard pregnane acetate. To further improve quantification, we synthesized tetradeuterated (D4) diplopterol, a precursor for a variety of hopanoids. LC-MS analysis on a mixture of (D4)-diplopterol and phospholipids showed that under the influence of co-eluted phospholipids, the D4-diplopterol internal standard quantifies diplopterol more accurately than

  14. Perspective - synthetic DEMs: A vital underpinning for the quantitative future of landform analysis?

    NASA Astrophysics Data System (ADS)

    Hillier, J. K.; Sofia, G.; Conway, S. J.

    2015-12-01

    Physical processes, including anthropogenic feedbacks, sculpt planetary surfaces (e.g. Earth's). A fundamental tenet of geomorphology is that the shapes created, when combined with other measurements, can be used to understand those processes. Artificial or synthetic digital elevation models (DEMs) might be vital in progressing further with this endeavour in two ways. First, synthetic DEMs can be built (e.g. by directly using governing equations) to encapsulate the processes, making predictions from theory. A second, arguably underutilised, role is to perform checks on accuracy and robustness that we dub "synthetic tests". Specifically, synthetic DEMs can contain a priori known, idealised morphologies that numerical landscape evolution models, DEM-analysis algorithms, and even manual mapping can be assessed against. Some such tests, for instance examining inaccuracies caused by noise, are moderately commonly employed, whilst others are much less so. Derived morphological properties, including metrics and mapping (manual and automated), are required to establish whether or not conceptual models represent reality well, but at present their quality is typically weakly constrained (e.g. by mapper inter-comparison). Relatively rare examples illustrate how synthetic tests can make strong "absolute" statements about landform detection and quantification; for example, 84 % of valley heads in the real landscape are identified correctly. From our perspective, it is vital to verify such statistics quantifying the properties of landscapes as ultimately this is the link between physics-driven models of processes and morphological observations that allows quantitative hypotheses to be tested. As such the additional rigour possible with this second usage of synthetic DEMs feeds directly into a problem central to the validity of much of geomorphology. Thus, this note introduces synthetic tests and DEMs and then outlines a typology of synthetic DEMs along with their benefits
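A minimal "synthetic test" in the sense described above can be sketched in a few lines: build a DEM containing one landform with an a priori known position, then check that a detection algorithm recovers it. The grid size, hill location, and toy detector are all invented:

```python
import numpy as np

# Synthetic DEM: one Gaussian hill at a known location on a 64x64 grid.
n = 64
y, x = np.mgrid[0:n, 0:n]
cx, cy, sigma = 40, 24, 5.0
dem = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def detect_peak(z):
    """Toy landform detector: (row, col) of the highest cell."""
    return np.unravel_index(np.argmax(z), z.shape)

row, col = detect_peak(dem)
# Because the true summit location is known by construction, the detector
# can be scored in "absolute" terms, as the perspective advocates.
print(int(row) == cy and int(col) == cx)  # prints True
```

Adding controlled noise to `dem` and re-running the detector is the kind of robustness check the authors dub a synthetic test.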

  15. Quantitative analysis of proteome extracted from barley crowns grown under different drought conditions

    PubMed Central

    Vítámvás, Pavel; Urban, Milan O.; Škodáček, Zbynek; Kosová, Klára; Pitelková, Iva; Vítámvás, Jan; Renaut, Jenny; Prášil, Ilja T.

    2015-01-01

    Barley cultivar Amulet was used to study the quantitative proteome changes through different drought conditions utilizing two-dimensional difference gel electrophoresis (2D-DIGE). Plants were cultivated for 10 days under different drought conditions. To obtain control and differentially drought-treated plants, the soil water content was kept at 65, 35, and 30% of soil water capacity (SWC), respectively. Osmotic potential, water saturation deficit, 13C discrimination, and dehydrin accumulation were monitored during sampling of the crowns for proteome analysis. Analysis of the 2D-DIGE gels revealed 105 differentially abundant spots; most were differentially abundant between the controls and drought-treated plants, and 25 spots displayed changes between both drought conditions. Seventy-six protein spots were successfully identified by tandem mass spectrometry. The most frequent functional categories of the identified proteins can be put into the groups of: stress-associated proteins, amino acid metabolism, carbohydrate metabolism, as well as DNA and RNA regulation and processing. Their possible role in the response of barley to drought stress is discussed. Our study has shown that under drought conditions barley cv. Amulet decreased its growth and developmental rates, displayed a shift from aerobic to anaerobic metabolism, and exhibited increased levels of several protective proteins. Comparison of the two drought treatments revealed plant acclimation to milder drought (35% SWC); but plant damage under more severe drought treatment (30% SWC). The results obtained revealed that cv. Amulet is sensitive to drought stress. Additionally, four spots revealing a continuous and significant increase with decreasing SWC (UDP-glucose 6-dehydrogenase, glutathione peroxidase, and two non-identified) could be good candidates for testing of their protein phenotyping capacity together with proteins that were significantly distinguished in both drought treatments. PMID:26175745

  16. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and its fragment ions if it can be used for MS-MS. Another advantage is that the data can be processed using either target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows. PMID:26138893

  17. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and its fragment ions if it can be used for MS-MS. Another advantage is that the data can be processed using either target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows.

  18. Analysis methods for the determination of anthropogenic additions of P to agricultural soils

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Phosphorus additions and measurement in soil is of concern on lands where biosolids have been applied. Colorimetric analysis for plant-available P may be inadequate for the accurate assessment of soil P. Phosphate additions in a regulatory environment need to be accurately assessed as the reported...

  19. Application of BP Neural Network Based on Genetic Algorithm in Quantitative Analysis of Mixed GAS

    NASA Astrophysics Data System (ADS)

    Chen, Hongyan; Liu, Wenzhen; Qu, Jian; Zhang, Bing; Li, Zhibin

Aiming at the problem of mixed-gas detection with neural networks, the principle of gas detection is analyzed. Combining a BP neural network optimized by a genetic algorithm with hybrid gas sensors, a quantitative analysis system for mixed gas is designed. The local minima encountered during network learning are the main factor limiting the precision of gas analysis. After improving the learning algorithm on this basis, analyses and tests for CO, CO2 and HC compounds were carried out. The results showed that the above measures effectively improve the accuracy of the neural network for gas analysis.
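A deliberately simplified stand-in for the GA-BP idea is sketched below: a genetic algorithm (truncation selection, elitism, annealed mutation) searches the weights of a linear read-out that maps invented sensor readings back to gas concentrations. A real implementation would evolve, or at least initialize, the weights of a BP neural network instead; everything here, including the sensor response matrix, is made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sensor array" data: three sensor readings per sample are a noiseless
# linear mix of two gas concentrations (entirely invented responses).
S = np.array([[0.8, 0.1], [0.3, 0.7], [0.5, 0.5]])   # sensors x gases
C = rng.uniform(0, 1, (50, 2))                        # true concentrations
X = C @ S.T                                           # sensor readings

def mse(w):
    """Error of a linear read-out W (2x3) mapping readings -> concentrations."""
    W = w.reshape(2, 3)
    return float(np.mean((X @ W.T - C) ** 2))

# Genetic search over weight vectors: keep the best, mutate to make children.
pop = rng.normal(0.0, 1.0, (40, 6))
for g in range(200):
    fit = np.array([mse(w) for w in pop])
    parents = pop[np.argsort(fit)[:10]]                    # truncation selection
    step = 0.5 * 0.97 ** g                                 # annealed mutation size
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0.0, step, (30, 6))
    pop = np.vstack([parents, children])                   # elitism keeps the best

best = min(pop, key=mse)
# The GA reliably escapes poor starting weights; BP would then refine them.
```

The global, gradient-free search is exactly the property the abstract invokes to avoid the local minima that plague plain BP training.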

  20. Quantitative Analysis of Pork and Chicken Products by Droplet Digital PCR

    PubMed Central

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited in amplification efficiency and relies on standard curves based on Ct values, which hampers the detection and quantification of low-copy-number target DNA in complex mixed meat products. By using the dPCR method, we found that the relationships between raw meat weight and DNA weight, and between DNA weight and DNA copy number, were both close to linear. This enabled us to establish formulae to calculate the raw meat weight based on the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises. PMID:25243184
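The chained linear calibrations described above amount to two constant factors. With entirely hypothetical calibration constants, the computation looks like:

```python
# Sketch of the two chained linear calibrations: copy number -> DNA weight
# -> raw meat weight. Both constants are hypothetical, not from the study.

COPIES_PER_NG_DNA = 3.0e2   # dPCR copies per ng of pork DNA (assumed)
NG_DNA_PER_MG_MEAT = 2.0    # ng of extractable DNA per mg of raw pork (assumed)

def pork_mg_from_copies(copies: float) -> float:
    """Raw pork weight implied by a dPCR copy count, via the two linear fits."""
    ng_dna = copies / COPIES_PER_NG_DNA
    return ng_dna / NG_DNA_PER_MG_MEAT

print(pork_mg_from_copies(6.0e3))  # prints 10.0 (6000 copies -> 20 ng -> 10 mg)
```

The absolute counting of dPCR is what makes the first factor a true constant rather than a Ct-based standard curve.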